
DISPLAY PERIPHERAL INTERFACE INPUT DEVICE

Subclass of:

345 - Computer graphics processing and selective visual display systems

Patent class list (only non-empty classes are listed)

Deeper subclasses:

Class / Patent application number | Description | Number of patent applications | Date published
345173000 Touch panel 8055
345157000 Cursor mark position control device 1891
345168000 Including keyboard 1131
345179000 Stylus 356
345184000 Mechanical control (e.g., rotatable knob, slider) 102
345180000 Light pen for CRT display 6
345183000 Light pen for controlling plural light-emitting display elements (e.g., LED, lamps) 3
20090295761 - LIGHT CONTROLLED SCREEN - A screen controlled by light includes a display and a resistor film cover on an outer surface of the display. The resistor film includes two separated transparent conducting layers. A plurality of photoresistors is coupled between the two transparent conducting layers, wherein the resistance value of each photoresistor is capable of varying under different light conditions. (12-03-2009)
20100156853 - DISPLAY APPARATUS AND PROGRAM - A display apparatus includes: a display surface that includes a plurality of pixels; a detection section that detects positions of the plurality of pixels successively designated on the display surface for each predetermined cycle; a rewriting section that rewrites colors of the pixels corresponding to the trace sequentially linking the positions detected by the detection section; a direction specifying section that, when the detection section detects the positions, specifies a direction of the trace at the corresponding positions; and a cycle changing section that, when the direction of the trace specified by the direction specifying section is changed and a magnitude of change of the corresponding direction increases, shortens the cycle in which the detection section detects the positions. (06-24-2010)
20080284757 - LIGHT POINTING DEVICE EMPLOYED IN INPUT APPARATUS, DRIVING METHOD AND INPUT APPARATUS USING THE SAME - A light pointing device employed in an input apparatus comprises a signal receiving interface and a light emitting component. The signal receiving interface is used to receive a system signal from the input apparatus. The light emitting component can generate a flashing light source whose intermittence cycle is synchronized with the reference clock cycle of the system signal. Furthermore, the ratio of the intermittence cycle of the flashing light source to the reference clock cycle of the system signal is a natural number. (11-20-2008)
345182000 Light pen for fluid matrix display panel 3
20120113066 - OPTICAL PEN - The optical pen includes a transparent head, an optical source, an optical sensor, a first optical fiber module, a second bundled optical fiber, and a microcontroller. The first optical fiber module includes at least two first bundled optical fibers. Opposite ends of each first bundled optical fiber are respectively adjacent to the transparent head and the optical source. The first bundled optical fibers guide the light from the optical source to a reflective surface. The second bundled optical fiber is arranged between the first bundled optical fibers. The second bundled optical fiber guides the light reflected from the reflective surface to the optical sensor. The optical sensor converts the reflected light to electrical signals. The microcontroller processes the electrical signals to generate corresponding control signals and transmits the control signals to an external electronic device. (05-10-2012)
20090002347 - Pointing Device with Optical Positioning on Low-Diffusive Surfaces - A computer input device includes a light source assembly forming an incident light beam that strikes an area on a surface beneath the input device at a large incident angle. A light sensing assembly in the input device is positioned to receive scattered light and includes a band-pass filter. A barrier is located directly above the area where the light beam is incident on the top surface to prevent light from traveling directly from the light source assembly to the light sensing assembly. (01-01-2009)
20120212460 - DYNAMIC VIRTUAL REMOTE TAGGING - Method for selecting content items, comprising determining a location in a space, the location depending on at least one of a position, an orientation or a movement of a selection device in a real world environment surrounding the selection device; determining a virtual tag included in a set of virtual tags as a selected virtual tag, the selected virtual tag being associated with the determined location; and selecting a subset of content items of a set of content items, the subset of content items being associated with the selected virtual tag. (08-23-2012)
Entries
Document | Title | Date
20080278441 - USER CONTROL IN A PLAYBACK MODE - A system comprises a display and logic coupled to the display. The logic causes a user control to automatically be displayed on the display upon a user activating an input device in both a full screen playback mode and a preview playback mode. (11-13-2008)
20090146947 - UNIVERSAL WEARABLE INPUT AND AUTHENTICATION DEVICE - The object of the wearable input device is to provide the user with one data input device and authentication system that is portable and can be worn like a fashion accessory, such as a watch or bracelet, so as to be unobtrusive to daily activity. The wearable input device can be used to replace home and car lock and security systems, television/VCR/DVD remote controls, personal computer authentication systems, credit card authentication systems, and automatic teller machine authentication systems, among others. (06-11-2009)
20100013760 - VOICE INPUT DEVICE - Provided is a voice input device in which a content spoken by a user is reflected on a confirmation screen when executing the content, thereby allowing the user to confirm that a command, corresponding to the content which is spoken by the user who intended for the execution, is executed after being recognized by the voice input device. The voice input device comprises a command storage section. (01-21-2010)
20130044050 - Causing Display of Comments Associated with an Object - Apparatus is configured to cause to be displayed, in a first area of a display, an object and to cause to be displayed, in a second area of the display, a first comment associated with the object, wherein the second area is in a fixed location relative to the first area. The apparatus is responsive to a first dynamic tactile user input within the second area of the display to cause the first comment to be at least partially hidden and to cause to be displayed, in the second area of the display, a second comment that was not visible prior to the first dynamic tactile user input, without moving the object on the display. (02-21-2013)
20130044052 - APPARATUS TO RECOGNIZE A STRAIN IN A FLEXIBLE DISPLAY - An apparatus to recognize a strain in a flexible display includes a recognition unit to include a first panel and a second panel that are formed of an Indium Tin Oxide (ITO) film, which is a transparent conductive film coated with uniform electric constant, and an adhesion layer disposed between the first panel and the second panel, in which the recognition unit is connected to the flexible display and outputs an electric potential value according to the strain in the flexible display; a memory to store operation pattern information that corresponds to a state of the strain of the flexible display; and a control unit to determine the state of the strain according to the electric potential value, and to execute an operation corresponding to the operation pattern information. (02-21-2013)
20130044049 - ELECTROACTIVE POLYMER TRANSDUCERS FOR TACTILE FEEDBACK DEVICES - Electroactive transducers, methods of producing a haptic effect in a user interface device simultaneously with a sound generated by a separately generated audio signal, and electroactive polymer transducers for sensory feedback applications in user interface devices are disclosed. (02-21-2013)
20130044051 - IMAGE DISPLAY DEVICE AND METHOD FOR OPERATING THE SAME - An image display device and a method for operating the same are provided. In the method for operating the image display device, which can perform near field communication with a mobile terminal, an image is displayed on a display, device information including motion information or position information of the mobile terminal is received based on the near field communication, and a corresponding menu is displayed on the display or a corresponding operation is performed according to the received motion information or position information. This method can improve user convenience when the image display device is used. (02-21-2013)
20110199293 - MULTI-ORIENTATION HANDWRITING TRACE INPUT DEVICE AND METHOD FOR ROTATING ITS COORDINATE PLANE - The present invention provides a multi-orientation handwriting trace input device and a method for rotating a coordinate plane thereof, and relates to the field of computer peripheral input devices. The device comprises a coordinate indicator and a data receiver, wherein the coordinate indicator enables trace input within a predefined coordinate plane of the data receiver. The device further comprises a data communication unit and a coordinate plane rotation unit. The data communication unit is provided on the data receiver. A processor of the data receiver transfers operation state information of the data communication unit to the coordinate plane rotation unit. The coordinate plane rotation unit rotates the coordinate plane of the data receiver according to the received operation state information of the data communication unit. The present invention can be implemented in a simple and reliable way, and is easy to use for both right-handed and left-handed users. The present invention is adapted for use at various angles and can find a wide range of applications. (08-18-2011)
20110199289 - INPUT APPARATUS, CONTROL APPARATUS, CONTROL SYSTEM, HANDHELD APPARATUS, AND CONTROL METHOD - [Object] To provide techniques of an input apparatus, a control apparatus, and the like with which an image displayed on a screen can be prevented from being moved unintentionally in a case where a user operates an operation section provided to the input apparatus. (08-18-2011)
20120200495 - Autostereoscopic Rendering and Display Apparatus - An apparatus comprising a sensor configured to detect the position and orientation of a user viewpoint with respect to an auto-stereoscopic display; a processor configured to determine a surface viewable from the user viewpoint of at least one three dimensional object; and an image generator configured to generate a left and right eye image for display on the auto-stereoscopic display dependent on the surface viewable from the user viewpoint. (08-09-2012)
20120200487 - METHOD AND APPARATUS FOR A MULTI-PAGE ELECTRONIC CONTENT VIEWER - An apparatus and method of displaying content on a multi-page electronic content viewer is provided. Sensor input associated with at least one display panel from a plurality of display panels infinitely rotatable around a common edge of a binding element is determined. The movement is used to determine a direction of movement of the at least one display panel around an axis of the binding element based upon the sensor input. A request for content to be displayed on the next viewable display is then sent to a storage device. The request includes a direction indicator determined by the direction of movement of the at least one display panel around the binding element. Content is then received and displayed on display screens of one or more of the plurality of display panels. (08-09-2012)
20080211766 - MULTITOUCH DATA FUSION - A method for performing multi-touch (MT) data fusion is disclosed in which multiple touch inputs occurring at about the same time are received to generate first touch data. Secondary sense data can then be combined with the first touch data to perform operations on an electronic device. The first touch data and the secondary sense data can be time-aligned and interpreted in a time-coherent manner. The first touch data can be refined in accordance with the secondary sense data, or alternatively, the secondary sense data can be interpreted in accordance with the first touch data. Additionally, the first touch data and the secondary sense data can be combined to create a new command. (09-04-2008)
20100117954 - E-paper display control based on conformation sequence status - A method for one or more portions of one or more regions of an electronic paper assembly having one or more display layers includes, but is not limited to: obtaining first information regarding one or more positions of one or more portions of one or more regions of the electronic paper assembly and sending one or more electronic paper assembly physical status related information portions to the electronic paper assembly based upon the obtaining of the first information. In addition to the foregoing, other related method/system aspects are described in the claims, drawings, and text forming a part of the present disclosure. (05-13-2010)
20100117955 - E-paper display control based on conformation sequence status - A system for one or more portions of one or more regions of an electronic paper assembly having one or more display layers includes, but is not limited to: one or more position obtaining modules configured to direct obtaining first information regarding one or more positions of one or more portions of one or more regions of the electronic paper assembly and one or more physical status sending modules configured to direct sending one or more electronic paper assembly physical status related information portions to the electronic paper assembly based upon the obtaining of the first information. In addition to the foregoing, other related method/system aspects are described in the claims, drawings, and text forming a part of the present disclosure. (05-13-2010)
20090231274 - Tilt Roller for Control Device - A control device includes a roller configured to rotate and tilt; a roller support coupled to the roller, wherein the roller is configured to rotate relative to the roller support; a first hinge disposed adjacent to a first end of the roller support; and a second hinge disposed adjacent to a second end of the roller support, wherein the first end and the second end are substantially opposite ends of the roller support, the second hinge is above the first hinge, and the first hinge and the second hinge are configured to provide tilting support for the roller and roller support. (09-17-2009)
20090046055 - Interactive digital image display - An interactive digital image display has a screen, an image processor, a coordinate detecting unit coupled to the image processor, an image driving unit coupled to the image processor, a memory coupled to the image processor, and an interface for acquiring images. When the user moves the display, the coordinate detecting unit generates a signal to the image processor, and the image displayed on the display will be adjusted according to the displacement of the display. (02-19-2009)
20120162059 - HANDWRITING INPUT DEVICE WITH CHARGER - A handwriting input device includes a hollow charger and a handwriting pen. The charger includes a number of first contacts and a connecting unit. The handwriting pen includes a number of second contacts, a communication unit, a light source to emit light, an optical sensor, and a microcontroller. When the charger is connected to an external electronic device through the connecting unit, and the handwriting pen is inserted into the charger to cause each second contact to be contacted with one first contact, the handwriting pen can obtain power from the external electronic device. When the charger is moved, together with the handwriting pen being charged, the optical sensor converts the light reflected from a reflective surface under the charger into electrical signals. The microcontroller generates and transmits the cursor control signals or the track control signals to the external electronic device. (06-28-2012)
20110205150 - ELECTRONIC DEVICE - According to one embodiment, an electronic device includes a main body unit, a hard disk drive housed inside the main body unit and including a head which performs reading and writing of data to a magnetic disk, a display unit pivotable between a first position where the display unit is laid parallel to the main body unit and a second position where the display unit is raised relative to the main body unit, a sensor which senses an angle at which the display unit is positioned, and a control unit. The control unit retracts the head to a retraction position when the sensor senses a change in the angle of the display unit. (08-25-2011)
20100007601 - GAZE INTERACTION FOR INFORMATION DISPLAY OF GAZED ITEMS - An interactive method and system include at least one detector. (01-14-2010)
20090195497 - GESTURE-BASED POWER MANAGEMENT OF A WEARABLE PORTABLE ELECTRONIC DEVICE WITH DISPLAY - Methods and systems for providing gesture-based power management for a wearable portable electronic device with display are described. An inertial sensor is calibrated to a reference orientation relative to gravity. Motion of the portable device is tracked with respect to the reference orientation, and the display is enabled when the device is within a viewable range to a user based upon a position of the device with respect to the reference orientation, wherein the viewable range is a predefined rotational angle range in each of the x, y, and z axes. Furthermore, the display is turned off if an object is detected within a predetermined distance of the display for a predetermined amount of time. (08-06-2009)
20090102786 - METHOD FOR TESTING AND PAIRING WIRELESS PERIPHERAL DEVICE - The present invention relates to a method for testing and pairing a wireless peripheral device. Different communication channels and different identification codes are used for testing and pairing the wireless peripheral device. As a result, an erroneous pairing relation between the wireless input device and the wireless transceiver of the wireless peripheral device is avoided. (04-23-2009)
20120169592 - INFORMATION PROCESSING DEVICE, METHOD FOR CONTROLLING INFORMATION PROCESSING DEVICE, PROGRAM, AND INFORMATION STORAGE MEDIUM - Direction obtaining means. (07-05-2012)
20120169588 - APPARATUS AND METHOD FOR ADJUSTING FOR INPUT LATENCY IN AN ELECTRONIC DEVICE - Embodiments of methods, apparatuses, devices and systems associated with adjusting for input latency within an electronic device are disclosed. An electronic device may receive a user input, such as a user actuation of a device key. A latency adjusted time of the input may be calculated based, at least in part, on a latency of the electronic device in determining the user actuation of the device key. The latency adjusted time may be used to determine a result of the user input. (07-05-2012)
20100149093 - VIRTUAL REALITY SYSTEM INCLUDING VIEWER RESPONSIVENESS TO SMART OBJECTS - Embodiments of the invention include a virtual reality system that includes an instrumented device used to present a virtual shopping environment to a simulation participant. The participant's interactions with the virtual shopping environment may be used to conduct market research into the consumer decision making process. The virtual shopping environment may include one or more smart objects configured to be responsive to participant interaction. The virtual shopping environment may recreate a real-world shopping environment. (06-17-2010)
20100149092 - IDENTIFYING CONTACTS ON A TOUCH SURFACE - Apparatus and methods are disclosed for simultaneously tracking multiple finger and palm contacts as hands approach, touch and slide across a proximity-sensing, multi-touch surface. Identification and classification of intuitive hand configurations and motions enables unprecedented integration of typing, resting, pointing, scrolling, 3D manipulation and handwriting into a versatile, ergonomic computer input device. (06-17-2010)
20100149091 - Media Action Script Acceleration Method - Exemplary apparatus, method, and system embodiments provide for accelerated hardware processing of an action script for a graphical image for visual display. An exemplary method comprises: converting a plurality of descriptive elements into a plurality of operational codes which at least partially control at least one processor circuit; and, using at least one processor circuit, performing one or more operations corresponding to an operational code to generate pixel data for the graphical image. Another exemplary method, for processing a data file which has not been fully compiled to a machine code and which comprises interpretable descriptions of the graphical image in a non-pixel-bitmap form, comprises: separating the data file from other data; parsing and converting the data file to a plurality of hardware-level operational codes and corresponding data; and performing a plurality of operations in response to at least some hardware-level operational codes to generate pixel data for the graphical image. Exemplary embodiments also may be performed automatically by a system comprising one or more computing devices. (06-17-2010)
20090207129 - Providing Haptic Feedback To User-Operated Switch - Systems and methods are disclosed herein for generating haptic feedback, tactile feedback, or force feedback to an electromechanical switch that is toggled by a user. In one specific example among many possible embodiments, a switch feedback system is disclosed. The switch feedback system comprises a user-operated switch, which is operable to toggle between one of an open state and a closed state. The switch feedback system also includes electrical circuitry in electrical communication with the user-operated switch, wherein the electrical circuitry is configured to react to a change of state of the user-operated switch. The system also includes a haptic feedback device in electrical communication with the user-operated switch and in physical communication with the user-operated switch. The haptic feedback device is configured to detect the change of state of the user-operated switch and provide a haptic feedback to the user-operated switch in response to the detected change of state. (08-20-2009)
20090237355 - HEAD TRACKING FOR VIRTUAL REALITY DISPLAYS - A tracking device for determining the position of at least one user relative to a video display has a wearable structure configured to be mounted on a human, such as a headset, eyeglasses or arm bands. The structure has two clusters of light emitting components which are spaced apart from one another. The LEDs in each cluster can emit different wavelengths of light and be activated in sequences to identify not only the position of the user but also to distinguish one user from another user. (09-24-2009)
20090085864 - METHOD AND SYSTEM FOR GESTURE CLASSIFICATION - The present invention is a method of identifying a user's gestures for use in an interactive game application. Videocamera images of the user are obtained, and feature point locations of a user's body are identified in the images. A similarity measure is used to compare the feature point locations in the images with a library of gestures. The gesture in the library corresponding to the largest calculated similarity measure which is greater than a threshold value of the gesture is identified as the user's gesture. The identified gesture may be integrated into the user's movements within a virtual gaming environment, and visual feedback is provided to the user. (04-02-2009)
20080259025 - DEVICE FOR USER INTERFACE USING ROTATABLE INPUT DEVICE AND METHOD OF OPERATING USER INTERFACE - Provided are a device for a user interface for efficient navigation and a method of operating the user interface. The device for a user interface includes a rotatable input device which is manipulated by a user, a contact detection device which detects whether the user contacts the rotatable input device and generates a first input signal if it is detected that the user contacts the rotatable input device, a rotation detection device which detects a rotation of the input device and generates a second input signal if the rotation of the input device is detected, and a control device which performs a first operation based on at least one of the first and second input signals. (10-23-2008)
20080259024 - METHOD FOR PROVIDING GRAPHICAL USER INTERFACE (GUI) AND ELECTRONIC DEVICE THEREOF - A method for providing a Graphical User Interface (GUI) and an electronic device using the same are provided. The method for providing a GUI includes receiving rotation information from an external input device by sensing movement of the external input device, and changing a display state of information output on a display using the rotation information of the external input device. Therefore, a user can output setup information conveniently, improving user convenience. (10-23-2008)
20080259023 - Method and System of Making a Computer as a Console for Managing Another Computer - A video signal processor encodes a video signal received from the target computer through the computer interface. A peripheral interface controller is coupled to the video signal processor, and transmits the encoded video signal to the console computer in a peripheral interface standard through the console interface. A console module is coupled to the console computer, decodes the encoded video signal for displaying on the console computer, and transmits a manipulation signal to the target computer through the console interface, the peripheral interface controller and the computer interface. A method of making the console computer a console for managing the target computer is also disclosed. (10-23-2008)
20080259022 - Method, system, and graphical user interface for text entry with partial word display - A computer-implemented method for text entry includes receiving entered text from a user, selecting a set of candidate sequences for completing or continuing the sequence, and presenting the candidate sequences to the user, wherein the candidate sequences include partial words. The candidate sequences are identified based on usage frequency weights stored in a tree data structure. A graphical user interface for text entry includes displaying a current input sequence of characters and the identified partial words. (10-23-2008)
20110193774 - INFORMATION PROCESSING DEVICE - In an operation control device for controlling the operation of an appliance such as a television receiver, a human-perceiving sensor is used as a unit for detecting movements of the operator. Continuous movements are then extracted therefrom, such as hand-waving movements performed by the operator with the intention of operating the appliance. Moreover, if the operator performs the same movement for longer than a certain constant time period, a control determined in advance is exerted over the operation control device. Also, the situation in which the movements of the operator are detected is displayed on the appliance, so that the operator is able to perform a more accurate movement owing to the resultant feedback effect. (08-11-2011)
20110193773 - HEADS-UP DISPLAY FOR A GAMING ENVIRONMENT - A heads-up display associated with a user manipulated object in a gaming environment, particularly a racing game. According to one aspect, the heads-up display may reduce the distance a user's eye travels between a focal area and the heads-up display. (08-11-2011)
20110193772 - Biaxial rotary display module - A biaxial rotary display module is disclosed, which includes a first casing, a rail disposed on the first casing, a rotary piece arranged in the rail, a second casing having a display panel, and two hinges. The rotary piece has a pivoting portion pivoted to the first casing, and the rotary piece is rotated in the rail. The hinges are disposed on opposite sides of the pivoting portion for connecting the second casing and the rotary piece, thereby enabling the second casing to be pivoted relative to the first casing via the two hinges and swiveled relative to the first casing via the rotary piece. (08-11-2011)
20130076612 - ELECTRONIC DEVICE WITH WRAP AROUND DISPLAY - A consumer electronic product includes at least a transparent housing and a flexible display assembly enclosed within the transparent housing. In the described embodiment, the flexible display assembly is configured to present visual content at any portion of the transparent housing. (03-28-2013)
20120176307 - TELEPHONE BOOK DATA PROCESSOR - A telephone book data processor includes: a connection element for connecting to an external device via short-range communication to transfer telephone book data; a telephone book data obtaining element for obtaining the telephone book data; a memory having multiple memory regions for storing the telephone book data; and a controller for executing a telephone book data transfer process and a telephone book data utilizing process. The controller defines one memory region as an object of the telephone book data transfer process and another memory region as an object of the telephone book data utilizing process. The controller executes the telephone book data utilizing process using the telephone book data in the other memory region while the controller executes the telephone book data transfer process for storing new telephone book data in the one memory region. (07-12-2012)
20130076613POWERED MARKING APPARATUS FOR POINTING CONTROL - Certain aspects relate to a cordless powered light marking apparatus that can be used with a pointing device and information processor to enable a program to be executed by the information processor. In certain embodiments, the cordless light marking apparatuses are “solar-powered.” In some cases, the cordless light marking apparatus may be self-powered in a different manner, such as via rechargeable batteries (e.g., lithium ion), alkaline batteries, etc. Other aspects relate to applications of the cordless light marking apparatuses. For example, multiple display devices connected directly or indirectly to one information processor may each have a light marking apparatus, enabling multiple players to interact substantially simultaneously with the same information processor through different display devices and different pointing devices. Further, the information processor may be accessed at different times through different display devices, without needing to move the processor or light marking apparatus.03-28-2013
20130076615INTERFACE METHOD AND APPARATUS FOR INPUTTING INFORMATION WITH AIR FINGER GESTURE - An interface apparatus and method is disclosed for inputting information with a user's finger gesture while the user is driving with his two hands on a steering wheel. In one aspect, the interface apparatus includes a gesture sensor, a gesture processor, and a head-up display (HUD), where the gesture sensor and the gesture processor recognizes and interpret the information input by the user's finger gesture and such information is displayed on the HUD. In one embodiment, a plurality of point of interest (POI) icons are displayed on the HUD after the user inputs at least one letter into the system and the user can select the POI by his/her finger gesture. In another embodiment, the gesture sensor can recognize the user's finger gesture in non-alphabet characters.03-28-2013
20130076616ADAPTIVE TRACKING SYSTEM FOR SPATIAL INPUT DEVICES - An adaptive tracking system for spatial input devices provides real-time tracking of spatial input devices for human-computer interaction in a Spatial Operating Environment (SOE). The components of an SOE include gestural input/output; network-based data representation, transit, and interchange; and spatially conformed display mesh. The SOE comprises a workspace occupied by one or more users, a set of screens which provide the users with visual feedback, and a gestural control system which translates user motions into command inputs. Users perform gestures with body parts and/or physical pointing devices, and the system translates those gestures into actions such as pointing, dragging, selecting, or other direct manipulations. The tracking system provides the requisite data for creating an immersive environment by maintaining a model of the spatial relationships between users, screens, pointing devices, and other physical objects within the workspace.03-28-2013
20130076614ACCESSORY DEVICE - Accurate and reliable techniques for determining information of an accessory device in relation to an electronic device are described.03-28-2013
20130076617ADAPTIVE TRACKING SYSTEM FOR SPATIAL INPUT DEVICES - An adaptive tracking system for spatial input devices provides real-time tracking of spatial input devices for human-computer interaction in a Spatial Operating Environment (SOE). The components of an SOE include gestural input/output; network-based data representation, transit, and interchange; and spatially conformed display mesh. The SOE comprises a workspace occupied by one or more users, a set of screens which provide the users with visual feedback, and a gestural control system which translates user motions into command inputs. Users perform gestures with body parts and/or physical pointing devices, and the system translates those gestures into actions such as pointing, dragging, selecting, or other direct manipulations. The tracking system provides the requisite data for creating an immersive environment by maintaining a model of the spatial relationships between users, screens, pointing devices, and other physical objects within the workspace.03-28-2013
20100001948FOOT-OPERATED COMPUTER INPUT DEVICE - A foot operated data entry/input pad has a plurality of foot-operated buttons. The foot buttons may be used to enter data values, such as numbers or symbols separately or in combination. Each button is preferably capable of entering different data values, preferably depending on the length of time that it is pressed or on the number of times that it is pressed in succession. A small controller may be included to allow the user to control the computer's pointer, allowing the user to switch between data entry fields. A heel rest may serve as both a heel rest and a button/switch for sending an electric/electronic signal. An automated voice system, or other audible and/or visual indicator system, may help the user keep track of the data value as it changes and is entered. In alternative versions for input of instructions, single values or binary information, or for selection of items in a pull-down screen window, a pad may have two buttons provided adjacent a cursor controller, wherein the cursor controller and right and left click buttons are on an arc or on an angle.01-07-2010
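The multi-value foot button in the abstract above, where the entered value depends on the number of successive presses, can be sketched as follows. This is a minimal illustration under assumed mappings; the button layout and digit assignment are invented for the example, not taken from the patent.

```python
# A small sketch of the abstract's multi-value button: the value a foot
# button enters depends on how many times it is pressed in succession.
# The mapping below is an invented example, not from the patent.

def value_for_presses(button_values, press_count):
    """Cycle through a button's values as successive presses arrive."""
    return button_values[(press_count - 1) % len(button_values)]

button = ["1", "4", "7"]        # one button, three reachable digits
print([value_for_presses(button, n) for n in (1, 2, 3, 4)])
# -> ['1', '4', '7', '1']
```

The same scheme extends to press duration by bucketing hold time into a count before indexing.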
20100073285DISPLAY DEVICE FOR DISPLAYING A SYSTEM STATE - A display device for displaying a system state has an output unit for reproduction of information characteristic of the system state, a sensor for detecting a distance between a user of the display device and the output unit, and a control unit for processing the reproduction of the information in the output unit as a function of the detected distance.03-25-2010
20100073284SYSTEM AND METHOD FOR ANALYZING MOVEMENTS OF AN ELECTRONIC DEVICE - The disclosure relates to a system and method for analyzing movements of a handheld electronic device. The system comprises: memory; a microprocessor; a first module to generate movement data responsive to movements of the device; a second module providing instructions to the microprocessor to map the movement data to a string representation relating to symbols in a spatial coordinate system associated with the device and store the string representation in the memory; and a third module. The third module provides instructions to the microprocessor to analyze data relating to the string representation against data relating to a gesture string representing a gesture related to a command for the device to determine if the gesture has been imparted on the device; and if the string representation sufficiently matches the gesture string, executes a command associated with the gesture on the device.03-25-2010
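The string-representation idea in the movement-analysis abstract above can be sketched in a few lines: quantize movement deltas into direction symbols, then test whether the resulting string "sufficiently matches" a stored gesture string. All names, the symbol alphabet, and the match ratio are illustrative assumptions, not the patent's actual encoding.

```python
# Sketch of the abstract's gesture-string idea (names are illustrative,
# not from the patent): quantize 2-D movement deltas into direction
# symbols, then compare the resulting string against a stored gesture.

def to_symbol(dx, dy):
    """Map a movement delta to one of four direction symbols."""
    if abs(dx) >= abs(dy):
        return "R" if dx >= 0 else "L"
    return "D" if dy >= 0 else "U"

def movement_to_string(deltas):
    """First/second module: movement data -> string representation."""
    return "".join(to_symbol(dx, dy) for dx, dy in deltas)

def matches(observed, gesture, min_ratio=0.8):
    """Third module: 'sufficiently matches' as a simple overlap ratio."""
    if len(observed) != len(gesture):
        return False
    same = sum(a == b for a, b in zip(observed, gesture))
    return same / len(gesture) >= min_ratio

deltas = [(5, 1), (6, 0), (0, 7), (1, 8), (-4, 1)]
s = movement_to_string(deltas)   # "RRDDL"
print(s, matches(s, "RRDDL"))
```

A real device would add tolerance for timing and length differences (e.g., edit distance) before executing the associated command.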
20100073283CONTROLLER WITH USER-SELECTABLE DISCRETE BUTTON EMULATION - A user device with a position control device such as a thumbstick may be used to emulate discrete button presses via user selection of a mode switch on the device.03-25-2010
20130076621DISPLAY APPARATUS AND CONTROL METHOD THEREOF - A display apparatus includes: an image receiving unit which receives a content; a display unit which displays the received content; an image pickup unit which captures images of a user; a storage unit which stores the content and at least one of the captured images of the user; and a control unit which displays a portion of the content with the at least one captured image from among the captured images of the user.03-28-2013
20130076619Methods and Apparatus for Freeform Deformation of 3-D Models - Methods and apparatus for interactive curve-based freeform deformation of three-dimensional (3-D) models may provide a user interface that allows a user to interactively deform 3-D models based on simple and intuitive manipulations of a curve drawn on the model (i.e., freeform deformation). The user may apply freeform deformations using touch and/or multitouch gestures to specify and manipulate a deformation curve. The deformations may be applied by deforming the space around a curve/sweep path and deforming the 3-D model accordingly. The freeform deformation methods are not dependent on manipulation of a fixed set of parameters to perform deformations, and may provide for both local and global deformation. One or more weights and user interface elements for controlling those weights may be provided that allow the user to control the extent (region of influence) of the freeform deformations along the curve and/or perpendicular to the curve.03-28-2013
20130076622METHOD AND APPARATUS FOR DETERMINING INPUT - An apparatus is disclosed, comprising a processor and a memory including computer program code, the memory and the computer program code configured to, working with the processor, cause the apparatus to perform at least the following: receiving a first image, recognizing at least part of the first image as a command receiver, recognizing at least part of the first image as an input article, determining that at least part of the input article is associated with at least part of the command receiver, and causing display of guidance associated with the command receiver.03-28-2013
20130076620PROJECTION APPARATUS, PROJECTION CONTROL METHOD AND STORAGE MEDIUM STORING PROGRAM - A projector apparatus includes a light source, a projection unit to form an optical image by using the light source and to project the formed image, an input unit to input a video signal representing the image, a person detection unit to detect a person existing in a projection area, a first projection control unit to set the projector apparatus to a projection state using a preset illumination pattern, when a person has been detected in the projection area, a motion analyzing unit to analyze a person's motion in the projection area, and a second projection control unit to activate the motion analyzing unit, after setting the projector apparatus to the projection state using the preset illumination pattern.03-28-2013
20130082917APPARATUS CONFIGURED TO HAVE MULTIPLE USER INTERFACES AND METHOD - An apparatus comprising: a controller; a first part; a second part; an arrangement configured to add a third part between the first part and the second part to change a configuration of the apparatus from a first configuration in which the first part and the second part are adjacent and there is no third part between the first part and the second part, to a second configuration in which the first part and the second part are separated by the third part and the first part and the third part are adjacent, wherein the controller is configured to pair the first part and the second part as a user interface when the apparatus is in the first configuration, and is configured to pair the first part and the third part as a user interface when the apparatus is in the second configuration.04-04-2013
20130082922TACTILE GLOVE FOR HUMAN-COMPUTER INTERACTION - One embodiment is directed to a system for human-computer interface, comprising an input device configured to provide two or more dimensions of operational input to a processor based at least in part upon a rubbing contact pattern between two or more digits of the same human hand that is interpreted by the input device. The input device may be configured to provide two orthogonal dimensions of operational input pertinent to a three-dimensional virtual environment presented, at least in part, by the processor. The input device may be configured to detect the rubbing between a specific digit in a pen-like function against one or more other digits. The input device further may be configured to detect the rubbing between the specific digit in a pen-like function and one or more other digits in a receiving panel function.04-04-2013
20130082916METHODS, APPARATUSES, AND COMPUTER PROGRAM PRODUCTS FOR IMPROVING DEVICE BEHAVIOR BASED ON USER INTERACTION - Methods, apparatuses, and computer program products are herein provided for improving operation of a device based upon user interaction. A method may include receiving user input. The method may further include determining a user state value based at least in part on the received user input, wherein the user state value corresponds to a patience level of the user with the current rate of operation of the device. The method may further include causing modification in the operation of the device based at least in part on comparison of the user state value to a threshold user state value. Corresponding apparatuses and computer program products are also provided.04-04-2013
20130082921BARRIER PANEL, AND 3D IMAGE DISPLAY DEVICE AND METHOD USING THE SAME - A barrier panel, and a 3D image display device and method are provided. The barrier panel is disposed corresponding to an image panel including a plurality of pixels and is configured such that M×N barrier regions of the barrier panel are formed at each pixel of the image panel. The brightness of each barrier region is adjusted to display a 3D image from the image panel.04-04-2013
20130082920CONTENT-DRIVEN INPUT APPARATUS AND METHOD FOR CONTROLLING ELECTRONIC DEVICES - A content-driven apparatus for controlling electronic devices integrates all control command functions for at least one controlled electronic device. Content information transmitted from one of the controlled electronic devices is received by a communication module, and is passed to a processing element for parsing the content information, including types of the content information, desired command actions to be proceeded, controlled electronic devices required to cooperate, and how to operate for a user. The processing element decides a user interface and an operation method for the user after the parsing, and issues corresponding control messages to the controlled electronic devices required to cooperate after the user uses the operation method to select specific control commands.04-04-2013
20110310004APPARATUS AND METHOD FOR SETTING A PARAMETER VALUE - Embodiments of the present invention relate to an improved man-machine interface for an apparatus. The interface comprises at least one graphical representation of a controllable parameter.12-22-2011
20110310003IMAGE DISPLAY DEVICE AND METHOD OF DISPLAYING IMAGES - An image display device includes an autostereoscopic screen for simultaneously displaying a plurality of different images which are visible from in each case at least one of different laterally offset viewing zones and a control unit for controlling the screen in dependence on image information of the different images, wherein the screen has a matrix screen with a plurality of pixels arranged in columns and rows as well as a grating arranged in front of the matrix screen and having a structure orientated parallel to the columns to direct light emanating from the pixels of the matrix screen into the different viewing zones. The image display device furthermore has a tracking device for detecting two respective eye positions of at least two viewers of the screen, wherein the control unit is configured for inputting input commands.12-22-2011
20110310002FREE SPACE DIRECTIONAL FORCE FEEDBACK APPARATUS - A directional feedback device generating a directional force feedback in free space. The device includes a force generation structure including a rotatable mass creating a physical force vector in three dimensional space. A wireless communication device and a control system communicatively coupled to the wireless communication device and the force generation system are provided. The control system receives a definition of the physical force vector to be generated from an application executing in a processing device. The control system provides instructions to the force generation system to generate the physical force vector using the force generation structure. The force generation system, the wireless communication device and the control system are enclosed in a housing.12-22-2011
20110310001DISPLAY RECONFIGURATION BASED ON FACE/EYE TRACKING - An adaptive interface system includes a user interface providing a visual output, a sensor for detecting a vision characteristic of a user and generating a sensor signal representing the vision characteristic, and a processor in communication with the sensor and the user interface, wherein the processor receives the sensor signal, analyzes the sensor signal based upon an instruction set to determine the vision characteristic of the user, and reconfigures the visual output of the user interface based upon the vision characteristic of the user to highlight at least a portion of the visual output within a field of focus of the user.12-22-2011
20130033419METHOD AND SYSTEMS FOR THREE-DIMENSIONAL IMAGE SEGMENTATION - A method for error correction for segmented three dimensional image data. The method includes receiving segmented three dimensional image data, the segmented three dimensional image data being divided into a plurality of slices; correcting at least one contour of the segmented three dimensional image data on at least one slice according to a command from a user to form a corrected contour; and automatically interpolating a correction represented by the corrected contour to a plurality of slices of the segmented three dimensional image data.02-07-2013
20130033421CHANGEABLE INTERACTIVE SURFACE WITH RESIZING, ZOOMING, MOVING, ROTATING AND OTHER MANIPULATION OPTIONS/CONTROLS - Systems and methods to allow a user of an electronic device to change the size, rotation, zoom and/or position of a drawing surface used with a graphics-based user interface, using characteristic changing controls positioned within the drawing surface, adjacent to the drawing surface or both within and adjacent to the drawing surface.02-07-2013
20130033422ELECTRONIC APPARATUS USING MOTION RECOGNITION AND METHOD FOR CONTROLLING ELECTRONIC APPARATUS THEREOF - An electronic apparatus and a controlling method thereof are disclosed. The method for controlling the electronic apparatus includes photographing an object using motion recognition, and changing and displaying a screen based on a movement direction of the object when it is determined that the photographed object has moved while maintaining a first shape. By this method, the user is able to perform zoom-in and zoom-out operations more easily and intuitively by using motion recognition.02-07-2013
20130033420SLEEVE AND CONTROL DEVICE WITH SUCH SLEEVE - A sleeve for a control device is provided for controlling the cursor motion of an electronic device. A touch-feel enhancing mechanism is formed on an outer surface of the sleeve. The touch-feel enhancing mechanism is not involved in the control device's function of detecting the rotating or moving actions of the user; it is used only to enhance the comfort and touch feel of operating the control device.02-07-2013
20130033418GESTURE DETECTION USING PROXIMITY OR LIGHT SENSORS - Example methods, apparatuses, or articles of manufacture are disclosed that may be utilized, in whole or in part, to facilitate or support one or more operations or techniques for gesture detection using, at least in part, output or measurement signals from one or more ambient environment sensors, such as, for example, a proximity sensor or ambient light sensor.02-07-2013
20130076618COMPUTER-READABLE STORAGE MEDIUM HAVING STORED THEREIN DISPLAY CONTROL PROGRAM, DISPLAY CONTROL SYSTEM, DISPLAY CONTROL APPARATUS, AND DISPLAY CONTROL METHOD - An exemplary information processing apparatus selectively switches between: first control where control is performed such that, in a virtual space, a position of producing no parallax on a screen of a stereoscopic display is a first position near a predetermined object; and second control where control is performed such that the position of producing no parallax is closer to a viewpoint position of virtual cameras than the first position is.03-28-2013
20130038527VIDEO DATA PROCESSING APPARATUS AND METHOD - An information processing apparatus (02-14-2013
20130038528Method and Apparatus for User Interface Communication with an Image Manipulator - A system, and method for use thereof, for image manipulation. The system may generate an original image in a three dimensional coordinate system. A sensing system may sense a user interaction with the image. The sensed user interaction may be correlated with the three dimensional coordinate system. The correlated user interaction may be used to project an updated image, where the updated image may be a distorted version of the original image. The image distortion may be in the form of a twisting, bending, cutting, displacement, or squeezing. The system may be used for interconnecting or communicating between two or more components connected to an interconnection medium (e.g., a bus) within a single computer or digital data processing system.02-14-2013
20130038524IMAGE PICKUP DEVICE AND PROJECTOR - An image pickup device includes an image pickup element which captures an image of a projection surface, an optical filter which has higher transmissivity for infrared light than for visible light, a visible light transmitting member whose transmissivity for visible light is higher than the corresponding transmissivity of the optical filter, and a switching unit which switches between a first condition where the optical filter is disposed on an optical path of light entering the image pickup element, and a second condition where the visible light transmitting member is disposed on the optical path.02-14-2013
20130038526DOMESTIC APPLIANCE DEVICE - A household appliance apparatus includes a display screen and an input unit which has at least one substantially transparent input device. In order to achieve advantageous ease of use with little design expenditure, the input device covers only a partial region of the display screen. The input device may thereby be configured as a conductive layer.02-14-2013
20130038522DISPLAY APPARATUS, DISPLAY METHOD, AND STORAGE MEDIUM - A display apparatus includes an image pickup section, a display section, an instruction section, a position specification section, a direction specification section, a specification section, and a reporting section. The image pickup section sequentially picks up an image. The display section displays the picked up image. The instruction section generates an instruction signal for marking an object included in the image. The position specification section specifies a position where the display apparatus exists. The direction specification section specifies a pickup direction by the image pickup section. The specification section specifies a position of the object relative to the position of the display apparatus based on the position of the display apparatus and the pickup direction in response to the instruction signal. The reporting section reports the position of the object.02-14-2013
20130038525VEHICULAR DISPLAY SYSTEM AND A METHOD FOR CONTROLLING THE DISPLAY SYSTEM - A vehicular control system includes a plurality of displays, each including a physical display unit and a memory module containing information related to descriptions of a plurality of displayable entities and first configuration data associated with a first control module. The first configuration data includes information on the configuration of the displayable entities, based on references to the descriptions of the plurality of displayable entities. The memory module of each display comprises a copy of the first configuration data. The first control module is arranged to transmit data based on point-to-multipoint communication to each of the plurality of displays. Each of the plurality of displays includes a processor operatively coupled to the memory module. The processor is arranged to process the entities based on data received from the first control module and to present the result of the processing on the physical display unit of each of the displays.02-14-2013
20130038523Character Input Device, Character Input Device Control Method, And Information Storage Medium - Methods and apparatus provide for displaying a plurality of character key images in a first display area on the computing device, each character key image representing and being associated with at least one of the character groups, each character group including a plurality of characters; and displaying, in a case where one of the character key images is selected by a user of the computing device, a plurality of input candidate characters in a second display area on the computing device, the plurality of input candidate characters including at least some of the plurality of characters in the character group associated with the selected one of the character key images, the second display area being set in a partially overlapping manner over the first display area.02-14-2013
20130038520AUTOMATIC SHUTDOWN OF 3D BASED ON GLASSES ORIENTATION - Devices, systems, and methods are presented for shutting down the 3D effect of active shutter 3D glasses by synchronizing the transparency of the lenses with respect to each other when the 3D glasses have been rotated beyond a threshold angle. The threshold angle can be pre-set through a user-selectable switch. Transitioning from an alternating shutter mode to a synchronized shutter mode can include a fade in which the duty cycles of the lens are adjusted. Direct measurement techniques for measuring the differential roll angle between the lenses and left and right eye images on a display are disclosed.02-14-2013
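The threshold check described in the shutter-glasses abstract above can be sketched numerically: estimate the glasses' roll angle from the gravity vector measured by an accelerometer, and switch from alternating to synchronized shutters past a threshold. The sensing method, axis convention, and 45° threshold here are illustrative assumptions; the patent also describes direct measurement techniques.

```python
# Hedged sketch of the tilt check: estimate the glasses' roll angle
# from the accelerometer's gravity components and drop out of 3D mode
# past a user-set threshold. Axes and threshold are illustrative.
import math

def roll_degrees(ax, ay):
    """Roll angle from the accelerometer's x/y gravity components."""
    return math.degrees(math.atan2(ax, ay))

def shutter_mode(ax, ay, threshold_deg=45.0):
    """Alternate lenses for 3D, or synchronize them when tilted over."""
    if abs(roll_degrees(ax, ay)) > threshold_deg:
        return "synchronized"   # both lenses transparent together
    return "alternating"        # normal active-shutter 3D

print(shutter_mode(0.1, 0.99))   # nearly upright -> "alternating"
print(shutter_mode(0.9, 0.1))    # lying sideways -> "synchronized"
```

The fade described in the abstract would then ramp the lens duty cycles over a short interval rather than switching modes abruptly.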
20130038519GRAPHICAL INTERACTIVE VISUAL RESPONSE SYSTEM AND METHOD - A graphical interactive visual response system and method is provided in which a graphical user interface provides such interactivity and visual response. A user can initiate contact with a representative using an application residing on a personal device, such as a mobile telephone or computer. The application (graphical interface) allows the user to interactively select options on a displayed menu, arrive at the appropriate service and initiate a connection with the representative. The connection is established when the user and representative are available, thereby avoiding the hold and wait times typically associated with conventional interactive voice response systems.02-14-2013
20130038521SYSTEMS AND METHODS OF CAMERA-BASED FINGERTIP TRACKING - Systems and methods for camera-based fingertip tracking are disclosed. One such method includes identifying at least one location of a fingertip in at least one of the video frames, and mapping the location to a user input based on the location of the fingertip relative to a virtual user input device.02-14-2013
20100045593DIRECTIONAL INPUT DEVICE - The object of the invention is to provide a directional input device that changes the distance between electrodes included in a capacitive element in correspondence with the sliding direction (that is, the input direction) in which an input unit is made to slide, thereby changing the electrostatic capacity of the capacitive element.02-25-2010
20100045595SYSTEM AND METHOD FOR CONTROLLING A DISPLAYED PRESENTATION, SUCH AS A SEXUALLY EXPLICIT PRESENTATION - A hand-held or hand-attached computer control device and associated system is utilized to control graphical objects in a computer-driven display in which the motion, type of behavior, and attributes of the graphical object are controlled through movement and resulting accelerations of the control device, such that the individual is provided with control during sexual self-stimulation.02-25-2010
20100045594SYSTEMS, METHODS, AND DEVICES FOR DYNAMIC MANAGEMENT OF DATA STREAMS UPDATING DISPLAYS - Presented herein are methods, systems, devices, and computer-readable media for dynamic management of data streams updating displays. Some of the embodiments herein generally relate to presenting video image data on an array of tiled display units, thereby allowing the display of much larger images than can be shown on a single display. Each display unit can include a video image display, a communication mechanism, such as a network interface card or wireless interface card, and a video image controller, such as a graphics card. Attached to the tiled display may be one or more user computers or other sources of video image data. A workstation may also be coupled to the tiled display and to the user computers. Each of the user computers can display data or images on the tiled display simultaneously. Since the tiled display is made up of multiple display units, the images from a single user computer may span multiple, separate individual display units. The images from multiple user computers could also be shown on the same display unit, and they may even overlap.02-25-2010
20090179855Multifunctional Operating Device and Method - A multifunctional operating device is provided that includes a display unit, at least one operating element, to which a function can be assigned within a menu structure and can be selected when the respective operating element is actuated, and associated visual information on the function selected by the operating element. The visual information is displayed on the screen during at least one operating status of the operating element. The time during which the associated visual information is displayed depends on at least one display parameter which can be varied by way of an adjusting intervention.07-16-2009
20120206337MULTIPLE CAMERA CONTROL SYSTEM - A multiple camera tracking system for interfacing with an application program is provided. The tracking system includes multiple cameras arranged to provide different viewpoints of a region of interest, and are operable to produce a series of video images. A processor is operable to receive the series of video images and detect objects appearing in the region of interest. The processor executes a process to generate a background data set from the video images, generate an image data set for each received video image, compare each image data set to the background data set to produce a difference map for each image data set, detect a relative position of an object of interest within each difference map, and produce an absolute position of the object of interest from the relative positions of the object of interest and map the absolute position to a position indicator associated with the application program.08-16-2012
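The per-camera pipeline described in the tracking abstract above (background data set, per-image difference map, relative object position) is classic background subtraction. The sketch below shows the idea on pure-Python grids standing in for video frames; the threshold, grid sizes, and function names are illustrative assumptions, not the patent's implementation.

```python
# Hedged sketch of the per-camera steps the abstract describes:
# background set, per-frame difference map, relative object position.
# Pure-Python grids stand in for video frames; names are illustrative.

def difference_map(frame, background, threshold=30):
    """Mark pixels that differ from the background by more than threshold."""
    return [[1 if abs(p - b) > threshold else 0
             for p, b in zip(frow, brow)]
            for frow, brow in zip(frame, background)]

def relative_position(diff):
    """Centroid of changed pixels: the object's position in this view."""
    pts = [(x, y) for y, row in enumerate(diff)
           for x, v in enumerate(row) if v]
    if not pts:
        return None
    n = len(pts)
    return (sum(x for x, _ in pts) / n, sum(y for _, y in pts) / n)

background = [[10] * 4 for _ in range(4)]
frame = [row[:] for row in background]
frame[1][2] = 200          # a bright object appears at (x=2, y=1)
frame[2][2] = 200          # and at (x=2, y=2)

diff = difference_map(frame, background)
print(relative_position(diff))   # (2.0, 1.5)
```

The abstract's final step, triangulating an absolute position from several such relative positions, would combine the per-camera results using the known camera geometry.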
20100097312SYSTEM AND METHOD FOR LIGHT CONTROL - A system for light control, comprising a light source (04-22-2010
20130027296COMPOUND GESTURE-SPEECH COMMANDS - A multimedia entertainment system combines both gestures and voice commands to provide an enhanced control scheme. A user's body position or motion may be recognized as a gesture, and may be used to provide context to recognize user generated sounds, such as speech input. Likewise, speech input may be recognized as a voice command, and may be used to provide context to recognize a body position or motion as a gesture. Weights may be assigned to the inputs to facilitate processing. When a gesture is recognized, a limited set of voice commands associated with the recognized gesture are loaded for use. Further, additional sets of voice commands may be structured in a hierarchical manner such that speaking a voice command from one set of voice commands leads to the system loading a next set of voice commands.01-31-2013
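The hierarchical command-set loading in the compound gesture-speech abstract above can be sketched as a small state machine: a recognized gesture loads a limited voice-command set, and certain commands lead to the next set. The gesture and command names below are invented for illustration.

```python
# Illustrative sketch of the abstract's scheme: a recognized gesture
# loads a limited voice-command set, and some commands lead to a next
# set (the hierarchy). Gesture and command names are made up here.

COMMAND_TREE = {
    "point_at_screen": {"open": "menu", "play": None, "pause": None},
    "menu":            {"movies": None, "music": None, "back": None},
}

def active_commands(state):
    """Only the commands for the current gesture/menu are loaded."""
    return set(COMMAND_TREE[state])

def speak(state, word):
    """Accept a word only if it is in the active set; maybe descend."""
    if word not in COMMAND_TREE[state]:
        return state, False                  # ignored out-of-set speech
    nxt = COMMAND_TREE[state][word]
    return (nxt if nxt else state), True

state = "point_at_screen"
state, ok = speak(state, "open")    # descends into the "menu" set
print(state, ok, sorted(active_commands(state)))
```

Keeping the active set small is what makes the recognizer's job easier: out-of-set speech is simply ignored rather than misrecognized.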
20130027295INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM - There is provided an information processing apparatus including an acquisition unit acquiring data on at least one of an acceleration or an angular velocity of a controller operated by a user, and a determination unit determining at least one of a velocity of the controller or a trajectory of the controller based on the acquired data on the at least one of the acceleration or the angular velocity.01-31-2013
20130027294INPUT APPARATUS, INPUT METHOD, AND CONTROL SYSTEM - There is provided an apparatus including an input apparatus including an input apparatus main body with which input manipulation is performed to manipulate a manipulation target object, a first manipulation detection unit that detects a first manipulation on the input apparatus main body, a second manipulation detection unit that detects a second manipulation on the input apparatus main body after the first manipulation is detected, and a first processing unit that performs first processing for manipulation on the manipulation target object or a first response of the input apparatus, based on a movement detection value corresponding to movement of the input apparatus main body according to the first manipulation or a detection value of the first manipulation.01-31-2013
20130027293STABILISATION METHOD AND COMPUTER SYSTEM - The present invention relates to a method for stabilising a series of measurements of a physical variable captured by a digital sensor. This method comprises the steps of capturing at least a first measurement, a second measurement, and a third measurement of said physical variable and storing each measurement in a digital memory. The first and second measurements are compared and, if a difference between the first measurement and the second measurement is below a predetermined threshold, the second measurement is replaced in the memory by a corrected second measurement whose difference with respect to said first measurement has been reduced using a first filtering strength. The corrected second measurement and the third measurement are compared and, if a difference between the corrected second measurement and said third measurement is also below the threshold, said third measurement is replaced by a corrected third measurement whose difference with respect to said corrected second measurement has been reduced using a second filtering strength that is lower than the first filtering strength. This method has the advantage of filtering noise whilst still allowing slow but relevant variations in the series of measurements.01-31-2013
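The two-strength correction step in the stabilisation abstract above can be made concrete in a few lines: small jumps are pulled toward the previous value, and the pull weakens at the second step so slow drifts still pass. The threshold and the two strengths below are illustrative assumptions.

```python
# Minimal sketch of the two-strength stabilisation step from the
# abstract: small jumps are pulled toward the previous value, and the
# pull weakens (second strength < first) so slow drifts still pass.
# The concrete strengths and threshold are illustrative assumptions.

def corrected(prev, current, strength):
    """Reduce the difference to prev in proportion to the strength."""
    return prev + (current - prev) * (1.0 - strength)

def stabilise(m1, m2, m3, threshold=5.0, s1=0.8, s2=0.4):
    if abs(m2 - m1) < threshold:
        m2 = corrected(m1, m2, s1)          # strong first filter
    if abs(m3 - m2) < threshold:
        m3 = corrected(m2, m3, s2)          # weaker second filter
    return m1, m2, m3

# Noise of a couple of units around 100 is mostly flattened out:
print(tuple(round(v, 2) for v in stabilise(100.0, 102.0, 101.0)))
```

A jump larger than the threshold bypasses the first filter entirely, which is how genuine fast changes in the measured variable survive.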
20130027289ELECTRONIC DEVICE - A mobile terminal is provided comprising a communication unit configured to form a network with first and second electronic devices, and a controller configured to control the first electronic device so that the first electronic device plays first content while simultaneously controlling playback of second content when receiving a connection request relating to playback of the second content from the second electronic device while controlling the first electronic device so that the first electronic device plays the first content.01-31-2013
20100039380Movable Audio/Video Communication Interface System - A system that includes a desk top assembly of a display and sensors mounted on a robotic arm. The arm moves the assembly so that it remains within position and orientation tolerances relative to the user's head as the user looks around. Near-field speaker arrays supply audio and a microphone array senses a user's voice. Filters are applied to head motion to reduce latency in the arm's tracking of the head. The system is full duplex with other systems, allowing immersive collaboration. Lighting and sound generation take place close to the user's head. A haptic interface device allows the user to grab the display/sensor array and move it about. Motion acts as a planar selection device for 3D data. Planar force feedback allows a user to "feel" the data. Users see not only each other through display windows, but can also see the positions and orientations of each others' planar selections of shared 3D models or data.02-18-2010
20100110001INPUT DEVICE AND METHOD AND PROGRAM - An input device includes a detection unit, a first acquisition unit, a second acquisition unit, and a compensation unit. The detection unit is configured to detect an operation by a user for controlling an electronic device and output an operation signal corresponding to the operation. The first acquisition unit is configured to acquire the detected operation signal and a differential value of the operation signal. The second acquisition unit is configured to acquire a function defined by the differential value to compensate for a delay in response of the operation signal with respect to the operation by the user. The compensation unit is configured to compensate the operation signal with the acquired function.05-06-2010
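A common way to realise differential-based delay compensation of the kind described above is a lead term that adds a scaled derivative of the operation signal to the signal itself; the sketch below is an assumed illustration (the gain, time step, and function name are not from the application).

```python
def compensate(samples, dt=0.01, gain=0.05):
    """Sketch of delay compensation: the output is the raw operation
    signal plus a function of its differential (here a scaled first
    difference), which advances the apparent response of the signal."""
    out = [samples[0]]
    for prev, cur in zip(samples, samples[1:]):
        derivative = (cur - prev) / dt   # finite-difference differential
        out.append(cur + gain * derivative)
    return out
```

For a rising input the compensated value leads the raw value, so the controlled device appears to respond earlier than the sensed operation.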
20100109999HUMAN COMPUTER INTERACTION DEVICE, ELECTRONIC DEVICE AND HUMAN COMPUTER INTERACTION METHOD - A human computer interaction device includes a first input device and a second input device, wherein at least a part of the operation region of all or certain functions of the second input device is located within the key operation region of the first input device. The first input device may be a key-input device and the second input device may be a mouse simulation device. An electronic device and a human computer interaction method using the human computer interaction device are further provided. This invention can reduce the combined operating region of the key input device and the mouse function, and can realize both key input and mouse functions without moving the forearm, thus increasing work efficiency.05-06-2010
20100110000VISUAL DISPLAY SYSTEM AND METHOD FOR DISPLAYING A VIDEO SIGNAL - A video display system (100) is provided which comprises a display panel (DP) for displaying a video signal (V), at least one lighting unit (LU) for providing a surround or ambient lighting (LS), a user interface (UI) for receiving external user calibration signals and a lighting control unit (LC) for controlling the color and/or luminance of the lighting unit (LU) in dependence on the calibration signals received by the user interface (UI).05-06-2010
20100109998System and method for sensing facial gesture - A method and system of sensing facial gestures are disclosed. The method of sensing facial gestures includes determining a basic database (DB) for sensing facial gestures of a user, estimating a head pose of the user, extracting a mesh model corresponding to the estimated head pose using at least one of a personalized DB and a general DB, sensing the facial gestures using the extracted mesh model, and applying the sensed facial gestures to a virtual object.05-06-2010
20130135194METHODS FOR CONTROLLING AN ELECTRIC DEVICE USING A CONTROL APPARATUS - An electrical switch apparatus including a movement sensitive form is disclosed. The apparatus includes a housing, a motion sensor and a processing unit, where motion on, near or about the motion sensor is translated into output commands adapted for list scrolling, where the list can be arranged in a hierarchy such as menus, or for changing a value of an attribute of an electrical device under the control of the switch.05-30-2013
20120262366ELECTRONIC SYSTEMS WITH TOUCH FREE INPUT DEVICES AND ASSOCIATED METHODS - Embodiments of electronic systems, devices, and associated methods of operation are described herein. In one embodiment, a computing system includes an input module configured to acquire images of an input device from a camera, the input device having a plurality of markers. The computing system also includes a sensing module configured to identify segments in the individual acquired images corresponding to the markers. The computing system further includes a calculation module configured to form a temporal trajectory of the input device based on the identified segments and an analysis module configured to correlate the formed temporal trajectory with a computing command.10-18-2012
20120262365OBJECT TRACKING WITH PROJECTED REFERENCE PATTERNS - Systems and methods for tracking an object's position and orientation within a room using patterns projected onto an interior surface of the room, such as the ceiling. Systems include at least one optical position sensitive device embedded in the object to detect relative changes in the projected pattern as the object's position and/or orientation is changed. In particular systems, the pattern includes a plurality of beacons projected by one or more steerable lasers. Projected beacons may be steered automatically to accommodate a variety of room topologies. Additional optical position sensitive devices may be disposed in known physical positions relative to the projected pattern to view either or both the projected pattern and the object. A subset of object positional data may be derived from a video camera viewing the object while another subset of object positional data is derived from the projected pattern.10-18-2012
20130082918METHOD AND APPARATUS PERTAINING TO RESPONSIVELY CHANGING APPLICATION FUNCTIONALITY OF AN ELECTRONIC DEVICE - Detection of a change in physical configuration of a discrete device with respect to an electronic device leads to responsively changing application functionality of the electronic device as a function, at least in part, of information provided by the discrete device. By one approach the detected change can pertain to movement of the discrete device with respect to the electronic device or orientation of a coupling of the discrete device to the electronic device. This information can comprise at least a unique identification code. In such a case these teachings will accommodate using the unique identification code to access a corresponding profile for the discrete device where the profile specifies at least one application to be presently made available using the discrete device.04-04-2013
20130082919METHOD AND APPARATUS PERTAINING TO AUTOMATED FUNCTIONALITY BASED UPON DETECTED INTERACTION BETWEEN DEVICES - Detection of a physical interaction between a first device that is logically coupled to a second device, wherein the physical interaction comprises one of a plurality of physical interactions that involve movement of one of the first device and the second device, results in automatically performing a function that corresponds to the physical interaction. The detected physical interaction can comprise one or more of a physical reorientation of the first device, a pivoting movement between the first device and the second device, a sliding movement between the first device and the second device, and a momentary change in physical proximity of the first device with respect to the second device. This activity can also include determining whether the physical interaction occurs within a predetermined period of time, and when the physical interaction does not occur within the predetermined period of time, prohibiting the automatic performing of the function.04-04-2013
20130135191INPUT DISPLAY APPARATUS, TANGIBLE COMPUTER-READABLE RECORDING MEDIUM AND INPUT DISPLAY METHOD - Disclosed is an input display apparatus including: a handwriting input unit to receive a handwriting input; a particle-migration type of display unit that enables display contents to be partially rewritten; and a control unit to control a display operation of the display unit, for displaying each stroke which is input via the handwriting input unit; wherein the control unit controls the display unit so as to display a currently input stroke in a simple display mode in which the delay required to display the stroke is short compared with a normal display mode.05-30-2013
20120212404INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING TERMINAL DEVICE, INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING METHOD, AND COMPUTER READABLE STORAGE MEDIUM - An information processing device includes an input information obtaining unit configured to obtain first input information and second input information among input information items, an input candidate information obtaining unit configured to obtain one or more input candidate information items as a candidate for the second input information, based on the first input information, an executing unit configured to carry out an application, based on the one or more input candidate information items, to produce one or more image candidate information items, an image candidate information storage unit configured to store the one or more image candidate information items so as to be correlated to the one or more input candidate information items respectively, and a transmitting unit configured to send information relating to one or all of the one or more image candidate information items stored in the image candidate information storage unit to a terminal device.08-23-2012
201300272973D REMOTE CONTROL SYSTEM EMPLOYING ABSOLUTE AND RELATIVE POSITION DETECTION - The present invention can include three-dimensional remote control systems that can detect an absolute location to which a remote control is pointing in first and second orthogonal axes and an absolute position of the remote control in a third orthogonal axis. Remote control systems of the present invention can employ absolute position detection with relative position detection. Absolute position detection can indicate an initial absolute position of the remote control and relative position detection can indicate changes in the position of the remote control. By combining absolute and relative position detection, remote control systems of the present invention can track remote controls more precisely than systems that only employ absolute position detection. The present invention also can include methods and apparatus for zooming in and out of an image shown on a display based on the absolute position of the remote control in the third axis.01-31-2013
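Combining absolute and relative detection as described above is often done by integrating relative deltas between absolute fixes and letting each absolute fix pull the estimate back, correcting accumulated drift. A minimal sketch assuming a simple complementary blend (the class name and blend factor are illustrative, not from the application):

```python
class PositionTracker:
    """Sketch: relative deltas update the position between absolute
    fixes; each absolute fix blends toward the detected absolute
    position, correcting drift accumulated by the relative updates."""

    def __init__(self, x=0.0, y=0.0, blend=0.8):
        self.x, self.y = x, y
        self.blend = blend  # weight given to each absolute fix

    def apply_relative(self, dx, dy):
        # Relative detection: accumulate changes in position.
        self.x += dx
        self.y += dy

    def apply_absolute(self, ax, ay):
        # Absolute detection: pull the estimate toward the fix.
        b = self.blend
        self.x = b * ax + (1 - b) * self.x
        self.y = b * ay + (1 - b) * self.y
```

The blend keeps the smooth, high-rate feel of relative tracking while bounding the error with the lower-rate absolute readings.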
20130027292CONTENT DISPLAY DEVICE - An anchor operation can be facilitated by ten keys using location information on a screen for items to which predetermined tags are attached. Display location information is extracted from the screen for a plurality of items to which predetermined tags described in the content data are attached, the screen on which the content is displayed is partitioned based on the display location information so that the plurality of items fall within one region, available buttons of the operation device are allocated to the plurality of items in the region, and an item corresponding to an operated button is selected from among the plurality of items.01-31-2013
20130027291TOUCH PANEL AND OPERATION METHOD THEREOF - The present invention relates to a touch panel. The touch panel includes a substrate, a plurality of first traces, a plurality of second traces, and a plurality of sensing pads. The first traces are disposed on the substrate and are parallel to each other along a first direction. The second traces are disposed on the substrate and are parallel to each other along a second direction. The first traces and the second traces interlace with each other to form a plurality of sensing pad areas. Each sensing pad is disposed in each sensing pad area, and is electrically connected to each first trace and each second trace. The present invention further provides a method of operating the same. In the present invention, the thickness of the touch panel is decreased and less electrode material is used. Consequently, the costs of the touch panel can be reduced.01-31-2013
20130027290Input Mode of a Device - A device is disclosed that detects a directional hand gesture, identifies an input mode of the device associated with the directional hand gesture and launches the input mode, modifies a user interface rendered on a display component of the device based on the input mode, and modifies a setting of a sensor based on whether the input mode includes a virtual keyboard.01-31-2013
20100066666DISPLAY APPARATUS AND IMAGE DISPLAY METHOD THEREOF - A display apparatus and an image display method thereof are provided. The display apparatus automatically displays images stored in a memory in a pre-set order, and displays the images in various sequences if a change in position of the display apparatus is detected by a motion sensor. Accordingly, the sequence in which the images are displayed is easily changed.03-18-2010
20100066662IMAGE DISPLAY DEVICE - An image display device makes it possible to easily display a stereoscopic two-dimensional image and to improve its direction effect. A display device includes a first display for displaying a first image on a first screen, and an image transmission element that is set in a light path for a display light component of the first image and that transmits the display light component of the first image so that a real image of the first image is displayed on an image forming surface positioned at a space on a side opposite to the first screen as a stray image. Further, the display device includes a controller for controlling the first display so that at least one icon out of a plurality of icons (A-H) is displayed as a stray image disposed along a virtual path with a predetermined shape set in a real space portion including the first-mentioned space.03-18-2010
20090046058SELF-CONTAINED, POCKET-SIZED PRESENTATION APPARATUS - A self-contained, pocket-sized presentation apparatus includes a USB drive having a housing, a memory, a processor, and a protective cover. A user input device, wireless transmitter, and power source are integrally disposed within the cover, the transmitter being operatively engaged with the input device and configured to selectively transmit wireless signals in response to selective user actuation of the input device. A wireless receiver is disposed within the housing of the USB drive to receive and couple wireless signals from the transmitter to the USB drive. The USB drive is configured to receive the wireless signals from the wireless receiver, to selectively generate Page Up and Page Down instructions responsive thereto, and to send the Page Up and Page Down instructions via the USB plug. The memory is configured to contain computer readable program code therein, in the form of a presentation, and in the form of a portable presentation application.02-19-2009
20090046056Human motion tracking device - A “human motion tracking” device (HMT) that translates natural body movements into computer-usable data. The data is transmitted to a simulation application as if it came from any conventional human interface device (HID). This allows an individual to interact with the application without the need for a conventional computer input device (e.g., a keyboard, mouse, etc.). The HMT captures a user's heading as well as the individual's current stance (i.e., standing, sitting, kneeling, etc.). The HMT accomplishes this by using two sensors, an accelerometer and a magnetometer, to produce digital input. The digital data is then passed from the HMT to the application.02-19-2009
20120182215SENSING MODULE, AND GRAPHICAL USER INTERFACE (GUI) CONTROL APPARATUS AND METHOD - A sensing module, and a Graphical User Interface (GUI) control apparatus and method are provided. The sensing module may be inserted into an input device, for example a keyboard, a mouse, a remote controller, and the like, and may sense a hovering movement of a user's hand within a sensing area. This makes it possible to provide an interface that controls a wider variety of GUIs, and to prevent the display from being covered.07-19-2012
20120182214Position Detecting System and Position Detecting Method - A position detecting system includes an indicating device including a photodiode that detects timing of a light pulse and a transmission unit that transmits a signal representing the timing, a plurality of light sources that emit light pulses to areas acquired by dividing a screen into a plurality of parts at timings unique to the light sources, a reception unit that receives the signal representing the timing, and a control unit that detects a position of the indicating device based on the signal representing the timing.07-19-2012
20120182213Control device with wire control for monitor device - A control device with wire control for a monitor, which includes a control body; an audio plug arranged for plugging into an audio jack of the monitor for controlling one or more functions of the monitor through one or more control buttons; and a resilient and extendable connecting wire electrically connecting the control body and the audio plug. Accordingly, a user can control different functions of the monitor at a distance from the position at which the monitor is located, which is convenient for the user. In addition, the present invention is easy and convenient to carry and is cost-effective.07-19-2012
20120182212INPUT DEVICE FOR COMPUTER SYSTEM - An input device for a computer system includes a command-input unit, a storage unit and a micro-controller unit. The command-input unit is used for generating a control signal. The storage unit is for reading and storing first data. The micro-controller unit is electrically connected with the command-input unit and the storage unit for receiving the control signal or the first data and transmitting the control signal or the first data to a computer system according to a Wi-Fi Direct protocol, and receiving second data from the computer system according to the Wi-Fi Direct protocol and transmitting the second data to the storage unit.07-19-2012
20120182211DEVICE AND METHOD OF CONVEYING EMOTION IN A MESSAGING APPLICATION - The present disclosure provides a device and method to convey emotions in a messaging application of a mobile electronic device. An emotional context of text entered into the messaging application is determined and an implied emotional text is presented for at least a portion of the entered text in accordance with the determined emotional context. The emotional context may be determined from captured sensor data captured by one or more sensors.07-19-2012
20120182210INTELLIGENT REAL-TIME DISPLAY SELECTION IN A MULTI-DISPLAY COMPUTER SYSTEM - A system and computer-implemented method for managing a plurality of display devices in a multi-display computer system that includes determining in real-time input information including face direction of a user facing the plurality of display devices, selecting a primary display device of the plurality of display devices using the input information determined, and transferring information to the primary display device as desired by the user.07-19-2012
20130050072DISPLAY DEVICE, CONTROL METHOD OF DISPLAY DEVICE AND PROGRAM - A display device includes a display unit to display a video screen on a display surface, an information detection unit to detect position information of a plurality of detection points arranged on a stationery tool imitating the shape of a stationery item, together with identification information added to the plurality of detection points respectively and different among the plurality of detection points, and an angle determination unit to determine a rotation angle of the stationery tool on the display surface based on the position information of the plurality of detection points.02-28-2013
20130050071THREE-DIMENSIONAL IMAGE PROCESSING APPARATUS AND THREE-DIMENSIONAL IMAGE PROCESSING METHOD - In one embodiment, a three-dimensional image processing apparatus includes: an imaging module configured to image a field including the front of a display that displays a three-dimensional image; and a controller configured to control the display to display an image imaged by the imaging module and a field where the three-dimensional image is recognizable as a three-dimensional body.02-28-2013
20130069866SELECTABLE COMMUNICATION INTERFACE CONFIGURATIONS FOR MOTION SENSING DEVICE - Selectable communication interface configurations for motion sensing devices. In one aspect, a module for a motion sensing device includes a motion processor connected to a device component and a first motion sensor, and a multiplexer having first and second positions. Only one of the multiplexer positions is selectable at a time, where the first position selectively couples the first motion sensor and the device component using a first bus, and the second position selectively couples the first motion sensor and the motion processor using a second bus, wherein communication of information over the second bus does not influence a communication bandwidth of the first bus.03-21-2013
20130069867INFORMATION PROCESSING APPARATUS AND METHOD AND PROGRAM - An apparatus and method provide logic for providing gestural control. In one implementation, an apparatus includes a receiving unit configured to receive a first spatial position associated with a first portion of a human body, and a second spatial position associated with a second portion of the human body. An identification unit is configured to identify a group of objects based on at least the first spatial position, and a selection unit is configured to select an object of the identified group based on the second spatial position.03-21-2013
20130069864DISPLAY APPARATUS, DISPLAY METHOD, AND PROGRAM - There is provided a display apparatus including an observation position detection unit for detecting an observation position of an observer, a generation phase determination portion for determining a generation phase for each viewpoint of a multi-viewpoint image for a plurality of viewpoints depending on the detected observation position, a multi-viewpoint image generation unit for generating a viewpoint image for each viewpoint from a predetermined image depending on the determined generation phase, a display device configured to include a display area having a plurality of pixels arranged thereon, for displaying the viewpoint image for each viewpoint such that the viewpoint image can be observed in each of a plurality of observation areas, and a light beam controller configured to be disposed in front of or behind the display device, for restricting a direction of light beams emitted from the display device or incident on the display device.03-21-2013
20130069861INTERFACE CONTROLLING APPARATUS AND METHOD USING FORCE - An interface controlling apparatus and method using a force may generate content control information by analyzing force input information received from at least one force sensor, and may control content based on the content control information.03-21-2013
20130069863TACTILE FEEDBACK APPARATUS, SYSTEM, AND METHOD OF OPERATING TACTILE FEEDBACK APPARATUS - A tactile feedback apparatus, system, and a method of operating the tactile feedback apparatus. The tactile feedback apparatus may detect a finger of a user touching a disk unit, determine a height at which the disk unit is supported, based on a signal generated by a sensor, and support a lower portion of the disk unit by controlling N driving units to be set at the determined height, thereby providing power sensed by the sensor to the finger of the user touching the disk unit.03-21-2013
20130069860Organizational Tools on a Multi-touch Display Device - A process for enabling objects displayed on a multi-input display device to be grouped together is disclosed that includes defining a target element that enables objects displayed on a multi-input display device to be grouped together through interaction with the target element. Engagement of an input mechanism with one of the target element and a particular one of the objects displayed on the multi-input display device is detected. Movement of the input mechanism is monitored while the input mechanism remains engaged with whichever one of the target element and the particular displayed object that the input mechanism engaged. A determination is made that at least a portion of a particular displayed object is overlapping at least a portion of a target element on the multi-input display device upon detecting disengagement of the input mechanism. As a consequence of disengagement and the overlap, processes are invoked that establish a relationship between the particular displayed object and a position on the target element and that causes transformations applied to the target element also to be applied to the particular displayed object while maintaining the relationship between the particular displayed object and the position on the target element.03-21-2013
20130069862REMOTE MOVEMENT GUIDANCE - In one example of a user-guidance system, a remote control may be configured to guide a user's physical movements by transmitting movement signals that are to be translated into haptic instructions, and cooperative actuators may be configured to be worn by the user to translate the movement signals received from the remote control into the haptic instructions. The movement signals may be translated into the haptic instructions for physical movements of any limb or extremity of the user in either of a vertical direction or a horizontal direction; after the first movement signal, the movement signals may be transmitted prior to completion of an immediately previous physical movement; and the movement signals may include horizontal and vertical directional signal components that indicate the horizontal and vertical direction for the user's next physical movement. The haptic instructions that are translated from the horizontal and vertical directional signal components may differ in either duration or magnitude. The movement signals may include horizontal directional signal components that indicate the horizontal direction for the user's next physical movement, and the haptic instructions that are translated from the horizontal directional signal components may differ in either duration or magnitude.03-21-2013
20130069858Adaptive communications system - This invention allows a system to monitor how quickly and accurately the user is responding via the input device. The input device can be a mouse, a keyboard, their voice, a touch-screen, a tablet PC writing instrument, a light pen or any other commercially available device used to input information from the user to the PBCD. Information is displayed on the PBCD screen based on how quickly and accurately the user is navigating with the input device.03-21-2013
20130069859ELASTIC CONTROL DEVICE AND APPARATUS - An apparatus including at least one processor, and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to receive deformation information from an elastic control device operated by a user. The apparatus is further configured to determine a control signal for the apparatus based on the deformation information, and to perform a function associated with the control signal.03-21-2013
20130088421DISPLAY CONTROL DEVICE - Provided is a display control device including a communication unit that performs communication with an external device, a detecting unit that detects a direction of an operation device based on detection information representing motion of the operation device, and a display control unit that generates an imaging control command used to control an imaging operation based on the detected direction of the operation device, and controls display of an image obtained through an imaging operation by causing the communication unit to transmit the imaging control command to the external device.04-11-2013
20130088424DEVICE AND METHOD FOR PROCESSING VIRTUAL WORLDS - A device and method for processing virtual worlds. According to embodiments of the present disclosure, information which is measured from the real world using characteristics of a sensor is transferred to a virtual world, to thereby implement an interaction between the real world and the virtual world. The disclosed device and method involve selectively transferring information, from among the measured information, which differs from previously measured information. They involve transferring the entire measured information when the measured information is significantly different from the previously measured information, and selectively transferring only the information, from among the measured information, which differs from the previously measured information when the difference is not significant.04-11-2013
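The selective-transfer rule described above can be sketched as: send the entire reading when the overall change is significant, otherwise send only the fields that changed. The threshold, field names, and function name below are illustrative assumptions:

```python
def select_update(prev, cur, significant=10.0):
    """Sketch of selective transfer: return the full measurement when
    the overall change is significant, otherwise only the changed
    fields, reducing the amount of data sent to the virtual world."""
    changed = {k: v for k, v in cur.items() if prev.get(k) != v}
    magnitude = sum(abs(v - prev.get(k, 0.0)) for k, v in changed.items())
    if magnitude >= significant:
        return dict(cur)   # transfer the entire measured information
    return changed         # transfer only what differs
```

The receiver would merge a partial update into its last full copy, so both cases leave it with a complete, current measurement.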
20130088423KEY INPUT APPARATUS FOR PORTABLE TERMINAL - A key input apparatus and a portable terminal having the key input apparatus are provided. The key input apparatus includes a main board, a display having flexibility and elasticity and at least one pressing switch placed between the main board and the display.04-11-2013
20130088420METHOD AND APPARATUS FOR DISPLAYING IMAGE BASED ON USER LOCATION - A method and an apparatus for displaying an image based on a user location are provided, allowing a user to view a displayed image similar to the original image even when viewing it from any location. The method includes receiving a location value indicating a location of a user while displaying an original image, transforming the original image based on the received location value, and displaying the transformed image.04-11-2013
20130088425APPARATUS AND METHOD OF DETECTING AN INPUT POSITION WITH DISPLAY PATTERN RECOGNITION - An apparatus and method are provided for detecting an input position using display pattern recognition. The apparatus includes an effective pattern area extractor for receiving an image of a display screen captured by a camera and extracting an effective pattern area for pattern recognition from the captured image of the display screen; a pattern recognizer for detecting subpixels included in the effective pattern area and identifying a plurality of holes included in each of the subpixels; and a display coordinate calculator for detecting an input position based on points at which the plurality of holes included in the each of the subpixels are formed.04-11-2013
20130088426GESTURE RECOGNITION DEVICE, GESTURE RECOGNITION METHOD, AND PROGRAM - There is provided a gesture recognition device that recognizes a gesture of shielding the front side of an imaging sensor, the gesture recognition device including a first detection unit that detects a change in a captured image between a state in which the front side of the imaging sensor is not shielded and a state in which the front side of the imaging sensor is shielded, and a second detection unit that detects a region in which a gradient of a luminance value of the captured image is less than a threshold value in the captured image in the state in which the front side of the imaging sensor is shielded.04-11-2013
20130088419DEVICE AND CONTROL METHOD THEREOF - A device and a control method for the device are disclosed. The device according to the present invention comprises a sensing unit and a controller that, if a second control command is received through the sensing unit while a control operation based on a first control command received through the sensing unit is being carried out, generates a display signal based on a control command selected according to a predetermined criterion from among the received first and second control commands. According to the present invention, a control command for generating a display signal can be effectively selected in the case that another control command is received while a particular control command is being carried out.04-11-2013
20130088422INPUT APPARATUS AND INPUT RECOGNITION METHOD - Provided is an input apparatus, including: an infrared camera; an image capture unit configured to sequentially capture a plurality of temperature distribution images photographed at predetermined time intervals by the infrared camera; and an input recognition unit configured to detect, from among the plurality of temperature distribution images captured by the image capture unit, pairs of skin temperature image portions each corresponding to a temperature of skin of a person, recognize, from among the pairs of skin temperature image portions thus detected, pairs of skin temperature image portions as pairs of detection target images, from among which motions are observed, and recognize an operation input based on states of the motions of the pairs of detection target images.04-11-2013
20100134408Fine-motor execution using repetitive force-feedback - An individual's fine-motor skills can be assessed using a force-feedback haptic unit that includes an end-effector and programmable settings. To assess these skills, a tangible computer readable medium initializes the programmable settings with a set of initial settings. It then presents a 3-D representation of a character or characters to a user. The user in turn is prompted to mimic the character(s) on a work space. While the user is attempting to mimic the character(s) using the end-effector on the work space, timed stroke data are collected from the force-feedback haptic unit. Using the timed stroke data, an analysis is then generated to determine the user's precision and accuracy in mimicking the character(s).06-03-2010
20100134412Information Processing Apparatus and Information Processing Method - An information processing apparatus is provided which includes a first display unit for displaying book data per page, a second display unit provided so as to be adjacent to the first display unit for displaying the book data per page, an axial part provided between the first and second display units, a first detecting unit for detecting rotational angular displacement of the first display unit around the axial part, a second detecting unit for detecting rotational angular displacement of the second display unit around the axial part, and a display page controller for displaying a page N displayed on the first display unit on the second display unit, or displaying the page N displayed on the second display unit on the first display unit, based on the rotational angular displacement of the first display unit and the rotational angular displacement of the second display unit.06-03-2010
20100134409THREE-DIMENSIONAL USER INTERFACE - The instant invention provides an apparatus, method and program storage device enabling a three-dimensional user interface for the movement of objects rendered upon a display device in a more realistic and intuitive manner. A Z distance is set whereupon a user crossing the Z distance is enabled to select an object, i.e. pick it up. As the user breaks the Z distance again, the object selected will move with the user's hand. As the user breaks the Z distance once more, the object will be released, i.e. dropped into a new position.06-03-2010
20100265169Helmet Comprising Visor Position Detection and Associated Helmet Position Detection - The general field of the invention is that of helmet position detection systems for aircraft including a helmet comprising a substantially spherical mobile visor that is able to occupy a position of use and a stowage position, the visor being arranged in front of the pilot's eyes in the position of use. The helmet according to the invention comprises detection means for detecting the position of the visor and the helmet position detection system comprises means making it possible to introduce an angular correction due to the mobile visor in the orientation measurements when the visor is detected in the position of use by the detection means. The detection means are preferably optical sensors.10-21-2010
20090309828Methods and systems for transmitting instructions associated with user parameter responsive projection - The present disclosure relates to systems and methods that are related to transmitting and receiving instructions associated with user parameter responsive projection.12-17-2009
20090303179Kinetic Interface - A kinetic interface for orientation detection in a video training system is disclosed. The interface includes a balance platform instrumented with inertial motion sensors. The interface engages a participant's sense of balance in training exercises.12-10-2009
20090303178DISPLAY CONTROL APPARATUS, DISPLAY CONTROL METHOD, AND PROGRAM - There is provided a display control apparatus including an external memory accommodating unit for accommodating a removable external memory; a database recognizing unit for recognizing a database stored in the external memory, the database recording an image stored in the external memory and information related to the image in correspondence to each other; a display method setting unit for setting, based on a recognition result of the database recognizing unit, either a stored first display method or a second display method that displays the image stored in the external memory without using the database; and a display controlling unit for displaying the image stored in the external memory by the set display method.12-10-2009
20090303177ELECTRONIC DEVICE AND METHOD FOR SELECTING FUNCTIONS BASED ON ORIENTATIONS - The present invention provides an electronic device and a method for selecting functions based on orientations of the electronic device. The method includes: a) storing relationships between orientations and functions; b) fetching inductive signals based on the orientation of the electronic device; c) recognizing the current orientation according to the fetched signals; and d) if the orientation is altered, selecting a function corresponding to the altered orientation and displaying a corresponding interface.12-10-2009
20090303175Haptic user interface - This invention relates to a method, apparatuses and a computer-readable medium having a computer program stored thereon, the method, apparatuses and computer program using a haptic signal perceptible by a user contacting a user interface surface with an input means (device) to indicate a predetermined direction on the user interface surface.12-10-2009
20090303174CONTROL OF DUAL FUNCTION INPUT AREA - An apparatus is provided that may be a portable device, such as a portable personal computer, cellular phone, or other computing device. The apparatus includes an input device. The input device may include a first input area and a second input area. The first and second input areas may occupy the same area of the computing device; for example, the second input area may be underneath the first input area. A controller is provided that includes a first control area and a second control area. The first and second control areas may also occupy the same area, where the second control area may be underneath the first control area. In one embodiment, the first control area may be responsive to a mechanical input and the second control area may be responsive to a non-mechanical input. The second control area is configured to enable and disable the second input area of the input device.12-10-2009
20090303176METHODS AND SYSTEMS FOR CONTROLLING ELECTRONIC DEVICES ACCORDING TO SIGNALS FROM DIGITAL CAMERA AND SENSOR MODULES - An embodiment of a method for remotely controlling an electronic apparatus, performed by a processor of the electronic apparatus, comprises the following steps. Existence of an object in close proximity to the electronic apparatus is detected. A camera module of the electronic apparatus is turned on to capture a series of images. A control operation in response to the captured images is determined. The control operation is performed on an electronic device of the electronic apparatus.12-10-2009
20120218182APPARATUS FOR RECOGNIZING THE POSITION OF AN INDICATING OBJECT - The present invention relates to an apparatus for recognizing the position of an indicating object. An apparatus for recognizing the position of an indicating object of the present invention comprises: first reflecting means installed along the left, right, and bottom edges of a screen so as to reflect a laser beam emitted from object-detecting means back to the object-detecting means; said object-detecting means, formed as a pair, for analyzing a change in the amount of light in the reflected laser beam over time, and detecting position coordinates of the indicating object on the planar surface of the screen; and fixing means including a housing and a fixing member fixedly installed on an upper portion of the screen and coupled to the housing so as to fix the housing to the upper portion of the screen, the fixing means being intended for facilitating the installation of the object-detecting means on the upper portion of the screen. The apparatus of the present invention is an apparatus for recognizing the position of an indicating object that contacts a screen, wherein the apparatus is easy to transport and store, can protect the object-detecting means from dust and impurities, can easily be installed by a layperson having no expert knowledge, and can be installed without any restrictions in terms of screen size.08-30-2012
20120218181CAMERA BASED SENSING IN HANDHELD, MOBILE, GAMING OR OTHER DEVICES - Method and apparatus are disclosed to enable rapid TV camera and computer based sensing in many practical applications, including, but not limited to, handheld devices, cars, and video games. Several unique forms of social video games are disclosed.08-30-2012
20120218180DISPLAY CONTROL DEVICE AND DISPLAY CONTROL METHOD - A display control device acquires the size of a picture, added in advance to picture data, and an optimum viewing distance corresponding to the picture. The display control device calculates, as a recommended viewing distance, the viewing distance for which the relationship between the display size and the viewing distance is equal to the relationship between the size of the picture and the optimum viewing distance, based on the size of the picture, the optimum viewing distance, and the display size of the picture. Then, the display control device notifies the user of the calculated recommended viewing distance.08-30-2012
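The proportional relationship described in the abstract above (picture size is to optimum viewing distance as display size is to recommended viewing distance) can be sketched as follows; the function and parameter names are illustrative, not taken from the application:

```python
def recommended_viewing_distance(picture_size, optimum_distance, display_size):
    """Keep the two ratios equal:
    picture_size / optimum_distance == display_size / recommended_distance,
    so the recommended distance scales the optimum distance by the
    display-to-picture size ratio (any consistent length unit)."""
    return optimum_distance * (display_size / picture_size)

# A picture 1.0 m wide with a 3.0 m optimum viewing distance, shown on a
# display 2.0 m wide: the recommended viewing distance doubles to 6.0 m.
```

Because the mapping is a pure proportion, doubling the display size doubles the recommended distance for the same source picture.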
20120218179DISPLAY APPARATUS AND CONTROL METHOD - A display apparatus and a control method capable of preventing its user from viewing an image in an improper viewing position. The display apparatus includes: an imaging unit that captures a moving image in a predetermined range with respect to an image display direction; an image analyzer that analyzes the moving image captured by the imaging unit, and calculates a position of a target that should be guided to a proper viewing position; and a display controller that causes a display unit to perform display to guide the target to the proper viewing position when the target position calculated by the image analyzer is at an improper viewing position.08-30-2012
20120218178CHARACTER INPUT DEVICE - The present invention provides a character input device that can display character input information on a display screen with a small area and that allows a user to input a desired character by a simple operation, even when the desired character must be selected from among plural characters.08-30-2012
20120218177METHOD AND APPARATUS FOR PROVIDING DIFFERENT USER INTERFACE EFFECTS FOR DIFFERENT MOTION GESTURES AND MOTION PROPERTIES - A method for providing a mechanism by which different user interface effects may be performed for different motion events may include receiving an indication of a motion event at a motion sensor, determining a motion gesture of the motion event, determining a motion property of the motion event, and enabling provision of a user interface effect based on the motion property of the motion event. A corresponding apparatus and computer program product are also provided.08-30-2012
20110006980DATA INPUT DEVICE, DATA INPUT METHOD, DATA INPUT PROGRAM, AND RECORDING MEDIUM CONTAINING THE PROGRAM - A data input device that enables the user to execute a scrolling operation rapidly and infallibly without enduring any operational burden is provided.01-13-2011
20130057468DATA OUTPUT DEVICE, DISPLAY DEVICE, DISPLAY METHOD AND REMOTE CONTROL DEVICE - Unit data, which constitutes digital data, is extracted as 8-bit units of parallel data by a control section and outputted to a buffer. Thereafter, in a process where the unit data is transmitted from a serial interface to a display unit, the unit data is converted to parallel data in a format required by the display unit. Thereby, it becomes unnecessary for the control section to perform processing of converting the digital data to a format required by the display unit. Consequently, the load on the control section is reduced.03-07-2013
20130057466PROJECTION SYSTEM, PROJECTION APPARATUS, SENSOR DEVICE, POWER GENERATION CONTROL METHOD, AND COMPUTER PROGRAM PRODUCT - A projection system includes a projector that projects an image onto a projection plane of a screen, and sensor units each including a photoelectric power generating unit that is installed at a predetermined position in an area in which an image is projected on the projection plane and that generates power corresponding to an intensity of projection light projected by the projector. The projector may include an image processing circuit that converts at least image data projected at the installation positions of the sensor units in image data projected on the projection plane into white image data or converts whole image data into white image data.03-07-2013
20130057469GESTURE RECOGNITION DEVICE, METHOD, PROGRAM, AND COMPUTER-READABLE MEDIUM UPON WHICH PROGRAM IS STORED - The present invention provides a gesture recognition device which can accurately recognize a user's gesture in a free space with a simple configuration, and which is mounted on a processing unit and which causes the processing unit to execute an operation corresponding to the recognized gesture. The gesture recognition device (03-07-2013
20130057467COMMUNICATIONS WITH A HAPTIC INTERFACE DEVICE FROM A HOST COMPUTER - The present invention comprises methods and apparatuses that can provide reliable communications between a computer and a haptic interface device. The methods and apparatuses can provide communication that is more secure against errors, failures, or tampering than previous approaches. Haptic devices allow a user to communicate with computer applications using the user's sense of touch, for example by applying and sensing forces with the haptic device. The host computer must be able to communicate with the haptic device in a robust and safe manner. The present invention includes a novel method of accomplishing such communication; a computer-readable medium that, when applied to a computer, causes the computer to communicate according to such a method; and a computer system having a host computer and a haptic device communicating according to such a method.03-07-2013
20130057465IMAGE DISPLAY APPARATUS, REMOTE CONTROLLER, AND METHOD FOR OPERATING THE SAME - A method for operating an image display apparatus is discussed. The method includes displaying an InfraRed (IR) blaster menu screen, receiving a selection input for selecting one of electronic devices included in the IR blaster menu screen, and transmitting IR format key information about the selected electronic device or device information about the selected electronic device to a remote controller according to the selection input.03-07-2013
20090091530SYSTEM FOR INPUT TO INFORMATION PROCESSING DEVICE - New input systems are provided, namely a paper icon, a paper controller, a paper keyboard, and a mouse pad, capable of inputting letters, characters, or the like to a computer and performing operations with easy manipulation, replacing hardware devices such as a keyboard, a mouse, and a tablet. An icon formed on a medium carries a dot pattern that is read from the surface of the medium by a scanner connected to an information processing device. The dot pattern is converted into a code value and/or a coordinate value that it defines, and the information processing device outputs a voice, an image, a moving image, a letter or character, or a program corresponding to that value, or outputs information for access to a corresponding website. It is thus possible to realize output of the voice, image, moving image, or letter or character prepared in advance, start of the program, access to the website, or the like.04-09-2009
20090091529Rendering Display Content On A Floor Surface Of A Surface Computer - Methods, apparatus, and products are disclosed for rendering display content on a floor surface of a surface computer, the floor surface comprised in the surface computer, the surface computer capable of receiving multi-touch input through the floor surface and rendering display output on the floor surface, that include: detecting, by the surface computer, contact between a user and the floor surface; identifying, by the surface computer, user characteristics for the user in dependence upon the detected contact; selecting, by the surface computer, display content in dependence upon the user characteristics; and rendering, by the surface computer, the selected display content on the floor surface.04-09-2009
20130093661METHODS AND APPARATUS FOR FACILITATING USER INTERACTION WITH A SEE-THROUGH DISPLAY - Methods and apparatus are provided in order to facilitate user interaction with an electronic device, such as a see-through display. In the context of a method, a reference plane may be determined based upon a fiducial marker presented upon a display of a mobile terminal. The method may also cause interaction information to be displayed in relation to the reference plane such that the interaction information appears to be presented upon the display of the mobile terminal and such that the interaction information at least partially occludes a user's view of the fiducial marker presented upon the display of the mobile terminal. The method may also include receiving input responsive to user input to the mobile terminal in relation to the interaction information and causing performance of an operation in response to the input based upon the interaction information.04-18-2013
20100127969Non-Contact Input Electronic Device and Method Thereof - A non-contact input electronic device and a method thereof are disclosed. The non-contact input electronic device includes a display module, sensors, and a control module. A plurality of sensors are disposed around the display module to sense a non-contact input motion. The control module is coupled to the sensors; it senses a moving order of the non-contact input via the sensors, generates an input command corresponding to the moving order, and executes the input command.05-27-2010
20090251409MOBILE WIRELESS DISPLAY SOFTWARE PLATFORM FOR CONTROLLING OTHER SYSTEMS AND DEVICES - A wireless headset can incorporate a wireless communication controller that not only provides a video link to a host device, but also provides for control and management of the host device and other devices. In this context, a host device may be any appropriate device that sources audio, video, text, and other information, such as a cell phone, personal computer, laptop, media player, and/or the like.10-08-2009
20100097310TERMINAL AND CONTROLLING METHOD THEREOF - A terminal having a video communication function includes a camera configured to capture image data, a memory configured to store the captured image data, a wireless communication unit configured to permit a video communication during which the image data is communicated to a correspondent terminal, a display configured to display a user interface during the video communication, and a controller configured to store in the memory, responsive to termination of the video communication with the correspondent terminal, any portions of configuration setting information for the user interface that have been modified during the video communication.04-22-2010
20100097309INFORMATION PROCESSING APPARATUS AND COMPUTER-READABLE RECORDING MEDIUM RECORDING INFORMATION PROCESSING PROGRAM - Motion information is obtained, which is information about a motion applied to an input device housing itself that includes a pointing device among a plurality of input means. Next, based on the motion information, a movement amount of the input device housing is calculated. Thereafter, it is determined whether or not the movement amount satisfies predetermined conditions. When the predetermined conditions are satisfied, a position is designated based on an output from the pointing device.04-22-2010
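The gating step in the abstract above (designate a position only when the housing's accumulated movement amount satisfies a condition) could look roughly like the following. The below-threshold condition, the names, and the 2-D motion samples are assumptions for illustration; the application does not specify the condition's direction:

```python
import math

def movement_amount(motion_samples):
    """Accumulate per-sample movement magnitudes from housing motion
    data, e.g. successive (dx, dy) deltas from an accelerometer."""
    return sum(math.hypot(dx, dy) for dx, dy in motion_samples)

def should_designate(motion_samples, threshold):
    """Permit position designation from the pointing device only when
    the housing has moved less than the threshold (i.e. held steady)."""
    return movement_amount(motion_samples) < threshold
```

Under this assumed condition, pointing input is suppressed while the housing is being shaken or repositioned, and accepted once it settles.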
20090267894OPERATIONAL OBJECT CONTROLLING DEVICE, SYSTEM, METHOD AND PROGRAM - An operational object controlling device including a motion detection unit, a motion obtaining unit, a motion feature quantities extraction unit, a template storage unit, an operational object motion storage unit, a motion feature quantities transform unit and an operational object motion obtaining unit. The motion obtaining unit obtains the user's motion detected by the motion detecting unit. The motion feature quantities extraction unit extracts the user's motion feature quantities from the obtained motion. The transform unit transforms the motion feature quantities by using a template obtained from the template storage unit. The motion feature quantities of the operational object are obtained from each of the temporal motion sequences of the operational object in the operational object motion storage unit. The operational object motion obtaining unit obtains one of the temporal motion sequences from the storage unit having the feature quantities close to the user's motion feature quantities.10-29-2009
20090267893TERMINAL DEVICE - An operation estimating portion 10-29-2009
20090267892SYSTEM AND METHOD FOR GENERATING ENERGY FROM ACTIVATION OF AN INPUT DEVICE IN AN ELECTRONIC DEVICE - In this disclosure, a description of a system for providing feedback signals relating to input signals provided to an electronic device is provided. The system comprises: an input device; a transducer associated with the input device; and a feedback module to generate a feedback signal indicating activation of the input device on the electronic device based on signals from the input device. The input device may be a touchpad; the transducer may be a piezoelectric element; and the feedback module may cause the transducer to vibrate upon receiving an activation signal relating to activation of the input device. The feedback module may provide a voltage generated by the transducer during the activation of the input device to an energy storage circuit.10-29-2009
20110012828PROTRUSION PATTERN FORMING DEVICE WITH DISPLAY FUNCTION - A protrusion pattern forming device with a display function includes a transparent elastic sheet having an internal layer including colored liquid, and an actuator including a plurality of actuator elements disposed along one surface of the elastic sheet, each of the actuator elements changing its own shape in response to an application of a voltage, thereby allowing the surface of the elastic sheet to protrude. A protrusion pattern is formed on the elastic sheet, and a dot pattern corresponding to the protrusion pattern is displayed on the elastic sheet by selectively driving the plurality of actuator elements.01-20-2011
20110012827Motion Mapping System - A motion mapping system includes a motion sensing device and a receiving device. The motion sensing device may include an accelerometer, a rotational sensor, a microcontroller, and an RF transmitter. The microcontroller may output processed motion data to the receiving device. The receiving device may include an RF receiver, a microprocessor, and a Universal Serial Bus interface for connection to a computer. The receiving device's microprocessor may output the processed motion data to motion mapping software. The motion mapping software may map the motion data to a corresponding predetermined input event defined by the motion mapping software and transmit a control signal back to the receiving device's microprocessor indicating the corresponding predetermined input event. Upon reception of the control signal from the mapping software, the receiving device's microprocessor may generate a hardware input event according to the control signal and transmit the generated hardware input event back to the computer.01-20-2011
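The mapping stage described in the abstract above (processed motion data to a predetermined input event) reduces to a classification step plus a lookup table. A minimal sketch, in which the gesture labels, event names, and classification rule are all invented for illustration:

```python
# Hypothetical table of predetermined input events; the actual events
# are defined by the motion mapping software, not given in the source.
MOTION_MAP = {
    "tilt_left":  "KEY_LEFT",
    "tilt_right": "KEY_RIGHT",
    "shake":      "KEY_ENTER",
}

def classify_motion(ax, ay):
    """Reduce two accelerometer axes to a coarse gesture label
    (a deliberately simple, illustrative rule)."""
    if abs(ax) > abs(ay):
        return "tilt_left" if ax < 0 else "tilt_right"
    return "shake"

def map_motion(ax, ay):
    """Return the predetermined input event for the sensed motion,
    collapsing the control-signal round trip into one lookup."""
    return MOTION_MAP[classify_motion(ax, ay)]
```

In the described system this lookup happens on the host side, with the receiving device's microprocessor synthesizing the hardware input event the table names.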
20130063337APPARATUS AND METHOD FOR PROJECTOR NAVIGATION IN A HANDHELD PROJECTOR - A method and apparatus for navigating a projected image. The method includes projecting, by a projector, an image on a surface, the projected image comprising a first portion of a virtual image. The method also includes determining a movement of the projector, and in response to the movement of the projector, changing the projected image by projecting a second portion of the virtual image different from the first portion.03-14-2013
20120223881DISPLAY DEVICE, DISPLAY CONTROL CIRCUIT, AND DISPLAY CONTROL METHOD - In a display control circuit 09-06-2012
20120223880METHOD AND APPARATUS FOR PRODUCING A DYNAMIC HAPTIC EFFECT - A system that produces a dynamic haptic effect and generates a drive signal that includes two or more gesture signals. The haptic effect is modified dynamically based on the gesture signals. The haptic effect may optionally be modified dynamically by using the gesture signals and two or more real or virtual device sensor signals such as from an accelerometer or gyroscope, or by signals created from processing data such as still images, video or sound.09-06-2012
20120223879Autostereoscopic Display and Method for Operating the Same - An autostereoscopic display is described, which comprises a display panel, a parallax barrier that is arranged on a viewing side of the display panel, wherein the parallax barrier comprises a first e-ink plane having a plurality of e-ink particles that are pivotable around a tilt axis by a tilting angle and wherein the e-ink particles are light-transmissive in a first direction and opaque in a second direction, and a control unit that is configured to tilt the e-ink particles so as to set the e-ink particles to a common tilting angle and to synchronize a reproduction of a picture by the display panel and the tilt of the e-ink particles so as to provide a stereoscopic view to a user.09-06-2012
20120223878Information Terminal and Portable Information Terminal - This information terminal includes a control portion that, when a user inputs information into an input item that is in a selected state where information can be input, displays on a display portion, on the basis of a prescribed order of a plurality of input items, the input item subsequent to that input item in a selected state where the user can input information.09-06-2012
20130162519CROSS-PLATFORM HUMAN INPUT CUSTOMIZATION - An input handler may receive first human input events from at least one human input device and from at least one user, associate the first human input events with a first identifier, receive second human input events from the at least one human input device from the at least one user, and associate the second human input events with a second identifier. A command instructor may relate the first human input events and the second human input events to commands of at least one application, and instruct the at least one application to execute the commands, including correlating each executed command with the first identifier or the second identifier.06-27-2013
20130162520GESTURE DETECTION AND COMPACT REPRESENTATION THEREOF - Techniques are described that may be implemented with an electronic device to detect a gesture within a field of view of a sensor and generate a compact data representation of the detected gesture. In implementations, a sensor is configured to detect a gesture and provide a signal in response thereto. An estimator, which is in communication with the sensor, is configured to generate an elliptical representation of the gesture. Multiple coefficients for the compact representation of the gesture can be used to define the ellipse representing the gesture.06-27-2013
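One way to produce the elliptical summary described in the abstract above is from the second moments of the gesture trace: the centroid gives the center, and the eigen-decomposition of the 2x2 covariance gives the semi-axes and orientation. This is only a plausible sketch, since the application does not spell out its estimator:

```python
import math

def ellipse_coefficients(points):
    """Summarize a 2-D gesture trace as five ellipse coefficients:
    (center_x, center_y, semi_major, semi_minor, orientation_rad)."""
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    # Second central moments of the trace.
    sxx = sum((x - cx) ** 2 for x, _ in points) / n
    syy = sum((y - cy) ** 2 for _, y in points) / n
    sxy = sum((x - cx) * (y - cy) for x, y in points) / n
    # Eigenvalues of the covariance are the squared semi-axes.
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    l1, l2 = tr / 2 + disc, tr / 2 - disc
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)
    return cx, cy, math.sqrt(l1), math.sqrt(max(l2, 0.0)), theta
```

Five coefficients stand in for an arbitrarily long point sequence, which is the "compact representation" property the abstract claims.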
20090195498Signal Generator Providing ISI Scaling to Touchstone Files - A device and method for producing Inter Symbol Interference (ISI) scaling of S-Parameter Touchstone files for the generation of ISI scaling effects on serial data patterns by direct digital synthesis is described. The features of the present invention allow a user to set parameters such as data rate, voltage amplitude, and encoding scheme as required for the serial data patterns. An ISI scaling value is selected and applied to an S-Parameter Touchstone file representing transmission path effects. The serial data pattern parameters and the ISI scaling value used with the S-Parameter Touchstone file are compiled to generate a digital data waveform record file. The digital waveform record file is applied to a waveform generation circuit for converting the digital data into an analog serial data pattern with ISI scaling effects.08-06-2009
20110063208METHOD AND SYSTEM FOR CONVEYING AN EMOTION - The present invention relates to a method for conveying an emotion to a person being exposed to multimedia information, such as a media clip, by way of tactile stimulation using a plurality of actuators arranged in close vicinity to the person's body, the method comprising the step of providing tactile stimulation information for controlling the plurality of actuators, wherein the plurality of actuators are adapted to stimulate multiple body sites in a body region, the tactile stimulation information comprises a sequence of tactile stimulation patterns, wherein each tactile stimulation pattern controls the plurality of actuators in time and space to enable the tactile stimulation of the body region, and the tactile stimulation information is synchronized with the media clip. An advantage of the present invention is thus that emotions can be induced, or strengthened, at the right time (e.g. synchronized with a specific situation in the media clip).03-17-2011
20110063207DISPLAY APPARATUS AND CONTROL METHOD THEREOF AND PROJECTION APPARATUS AND CONTROL METHOD THEREOF - A display apparatus including at least one sensor, a processing unit and a control unit is provided. The sensor senses a human body. The processing unit is coupled to the sensor, and captures and processes signals from the sensor to obtain a sensing data. The control unit is coupled to the processing unit, and analyzes and determines whether the human body is to enter into or to be distant from a sensing range of the sensor according to the sensing data. When the control unit determines that the human body is to be distant from the sensing range for a first predetermined time, the control unit turns off the display apparatus or makes the display apparatus get into a power-saving/sleeping mode. When the control unit determines that the human body is to enter into the sensing range for a second predetermined time, the control unit turns on the display apparatus.03-17-2011
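The two-timer behavior described in the abstract above (turn off or sleep after the body has been out of the sensing range for a first predetermined time, turn on after it has been in range for a second predetermined time) can be sketched as below; the function names and the (timestamp, present) sample format are assumptions:

```python
def control_action(samples, absent_for, present_for):
    """Decide a power action from (timestamp, present) sensing samples.
    Returns 'sleep' once the body has been out of range for at least
    absent_for seconds, 'wake' once it has been in range for at least
    present_for seconds, and None otherwise."""
    if not samples:
        return None
    last_t, last_present = samples[-1]
    # Walk backward to find when the current in/out-of-range state began.
    start = last_t
    for t, present in reversed(samples):
        if present != last_present:
            break
        start = t
    duration = last_t - start
    if last_present and duration >= present_for:
        return "wake"
    if not last_present and duration >= absent_for:
        return "sleep"
    return None
```

Requiring the state to persist for a predetermined time, rather than acting on the instantaneous reading, keeps a person briefly crossing the sensing range from toggling the display.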
20110063206SYSTEM AND METHOD FOR GENERATING SCREEN POINTING INFORMATION IN A TELEVISION CONTROL DEVICE - A system and method, in a television control device, for generating screen pointing information, substantially as shown in and/or described in connection with at least one of the figures, as set forth more completely in the claims.03-17-2011
20110063205DISPLAY MAGNIFIER - Disclosed is a display element that includes a display screen and a transparent cover overlying the display screen. The output from the display screen is magnified to occupy at least some of the transparent cover. The display element includes an arrangement of diffractive and refractive elements disposed between the display screen and the transparent cover that provide significant magnification without adding significant thickness to the display element. In some embodiments, the diffractive and refractive elements are embodied in a number of optical microelements, in some cases at least one optical microelement per pixel of the display screen. Some embodiments include microlenses that collimate the light emitted by the display screen. The display element can be “anamorphotic,” that is, the magnification in one direction differs from that in another. Some embodiments provide at least two levels of magnification. Liquid matching switches can be used to control the level of magnification.03-17-2011
20130063343SENSOR MAPPING - Techniques, systems and computer program products are disclosed for providing sensor mapping. In one aspect, a method includes receiving input from a user. The received input includes at least one of motion, force and contact. In addition, a sensor signal is generated based on the received input. From a set of candidate data structures, a data structure associated with a selected application having one or more functions is identified. The data structure indicates a relationship between the generated sensor signal and the one or more functions of the selected application. The generated sensor signal is selectively mapped into a control signal for controlling the one or more functions of the selected application by using the identified data structure.03-14-2013
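The sensor-mapping abstract above describes a per-application data structure that relates a generated sensor signal to application functions. A minimal sketch of that idea in Python follows; the names (`SENSOR_MAPS`, `map_signal`) and the example signals are hypothetical illustrations, not taken from the patent:

```python
# Hypothetical per-application mapping tables: each application declares
# which sensor signals it responds to and which function each one invokes.
SENSOR_MAPS = {
    "media_player": {"tilt_left": "previous_track",
                     "tilt_right": "next_track",
                     "shake": "shuffle"},
    "photo_viewer": {"tilt_left": "prev_photo",
                     "tilt_right": "next_photo"},
}

def map_signal(app, signal):
    """Selectively map a generated sensor signal to a control command
    for the selected application; unmapped signals yield None."""
    table = SENSOR_MAPS.get(app, {})
    return table.get(signal)
```

The same signal can thus drive different functions depending on which application is selected, which is the selectivity the abstract points at.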
20130063336VEHICLE USER INTERFACE SYSTEM - A driver can point at a vehicle display using a hand on the steering wheel. The vehicle display may be located in the dashboard behind the steering wheel. The location on the display at which the driver is pointing is determined using sensors, and a cursor is displayed at this location. Finger movement is detected by the sensors and a user interface function is performed in response. The performed user interface functions may include the movement of the displayed cursor on the vehicle display, the display of additional vehicle information, the launching of an application, the interaction with an application, and the scrolling of displayed information.03-14-2013
20130063342HUMAN INTERFACE INPUT ACCELERATION SYSTEM - A method and system for transmitting data to and from a hand-held host device are disclosed. An accessory device for interfacing with a host device includes a communication channel designed to establish a bidirectional data link between the accessory device and the host device. The accessory device also includes a storage unit communicatively coupled to the communication channel. The storage unit is designed to store various data. In addition, at least a first data is selectively transmitted from the stored data of the accessory device to the host device through the established bidirectional data link.03-14-2013
20130063345GESTURE INPUT DEVICE AND GESTURE INPUT METHOD - A gesture input device includes: a coordinate input detecting unit which sequentially detects coordinate set sequences of a user hand position; a gesture start detecting unit which detects a component indicating a first hand movement for starting a gesture, from a detected first coordinate sequence; a guide image generating unit which generates a gesture guide image for guiding the user to make a gesture including a second hand movement, when the first hand movement component is detected; an intended action component detecting unit which detects a second hand movement component as an intended action component, from a second coordinate sequence detected after the gesture guide image is displayed on the display screen; and a control signal generating unit which detects a component indicating a hand movement corresponding to the gesture from the second coordinate sequence when the intended action component is detected, and generates a control signal according to the detection result.03-14-2013
20130063344METHOD AND DEVICE FOR THE REMOTE CONTROL OF TERMINAL UNITS - For the remote control of terminal units of consumer electronics and of computers by way of a remote control with integrated motion sensors, it is suggested to process the detected motion sequences in the remote control itself and interpret them as gestures, such that certain gestures correspond to certain commands which are transmitted from the remote control directly to the corresponding terminal unit or to a computer.03-14-2013
20130063341DISPLAY DEVICE ADJUSTMENT SYSTEM, AND ADJUSTMENT METHOD - A system controller comprises a normal mode and a maintenance mode as its control modes, and switches the mode so that electronic control buttons are assigned the functions of calling a crew member, onboard communications, switching seat lighting on or off at each seat, etc., in normal mode, and the picture quality and brightness of a large monitor are adjusted in maintenance mode.03-14-2013
20130063339APPARATUS FOR SELECTING MULTIMEDIA INFORMATION - An apparatus for selecting multimedia information containing an operating unit that comprises an actuation device movable back and forth into and out of a neutral position, producing an output signal that depends on the shift position of the actuation device. The output signal causes elements of a display unit to move in a scrolling manner, the elements being represented one after the other on the display unit such that each forward element only partially covers the subsequent elements, and each rearward element is smaller than the element in front of it.03-14-2013
20130063340EYE TRACKING CONTROL OF VEHICLE ENTERTAINMENT SYSTEMS - An in-flight entertainment system includes a video display unit facing a user seat. The video display unit includes a display surface that displays images to a user who is seated on the user seat. A light emitter illuminates eyes of the user. A camera outputs a video signal containing reflections from the illuminated eyes. A processor processes the video signal to determine a viewing location on the display surface at which the eyes are directed, and controls at least one function for how images are displayed on the display surface responsive to the determined viewing location.03-14-2013
20130063338DISPLAY WITH SCREEN CAPTURE FUNCTION - A display includes an input interface, a circuit board, a display panel, and an external memory. The input interface is electrically connected to a computer, and receives an image signal from the computer. The image signal includes a number of frames of image data. The circuit board includes a main control chip, an image processing module, and a recall button. The main control chip drives the display panel to display the frames of image data sequentially. If the computer freezes, the main control chip will control the display panel to repeatedly display a final frame of image data. When the recall button is pressed, the image processing module codes/decodes a frame of image data currently displayed on the display panel to obtain image data in a predetermined format which is stored in the external memory.03-14-2013
20130162525METHOD AND APPARATUS FOR PERFORMING MOTION RECOGNITION USING MOTION SENSOR FUSION, AND ASSOCIATED COMPUTER PROGRAM PRODUCT - A method and apparatus for performing motion recognition using motion sensor fusion and an associated computer program product are provided, where the method is applied to an electronic device. The method includes the steps of: utilizing a plurality of motion sensors of the electronic device to obtain sensor data respectively corresponding to the plurality of motion sensors, the sensor data measured at a device coordinate system of the electronic device, wherein the plurality of motion sensors includes inertial motion sensors; and performing sensor fusion according to the sensor data by converting at least one portion of the sensor data and derivatives of the sensor data into converted data based on a global coordinate system of a user of the electronic device, to perform motion recognition based on the global coordinate system, in order to recognize the user's motion.06-27-2013
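The motion-sensor-fusion abstract above centers on converting sensor data from the device coordinate system into a global coordinate system of the user. A toy sketch of such a conversion is shown below for a single yaw rotation; a full fusion pipeline would derive a complete rotation matrix or quaternion from all the motion sensors. The function name and the yaw-only simplification are illustrative assumptions, not the patent's method:

```python
import math

def device_to_global(vec, yaw_rad):
    """Rotate a device-frame vector (x, y, z) into the global frame
    about the vertical axis by the fused yaw angle. Illustrative:
    real sensor fusion uses a full 3-D orientation estimate."""
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    x, y, z = vec
    return (c * x - s * y, s * x + c * y, z)
```

Once accelerations are expressed in the global frame, a gesture looks the same regardless of how the device was held, which is what makes orientation-independent motion recognition possible.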
20130162516APPARATUS AND METHOD FOR PROVIDING TRANSITIONS BETWEEN SCREENS - An apparatus, method, and computer program product are described that determine a destination screen for display and provide for a visual transition between an origin screen and the destination screen based on a position of a user input and a direction of the movement component of the input. The origin screen may, for example, associate certain areas of the screen with certain destination screens, such that an input received in one area invokes one destination screen and an input received in another area invokes another destination screen. The destination screen may also be determined based on the direction of the movement component of the input. Thus, one of several destination screens may be accessible to the user and may be determined based on the characteristics of the input received.06-27-2013
20130162515METHOD FOR CHANGING DEVICE MODES OF AN ELECTRONIC DEVICE CONNECTED TO A DOCKING STATION AND AN ELECTRONIC DEVICE CONFIGURED FOR SAME - The present disclosure provides a docking station for docking one or multiple portable electronic devices, such as a tablet and a mobile telephone or smartphone. The present disclosure provides a method for changing device modes of an electronic device connected to a docking station, as well as an electronic device and a docking station configured for same.06-27-2013
20090237354OPTICAL APPARATUS AND OPTICAL SYSTEM - An optical apparatus has a projection device that projects a projection image onto a projection area, and a capture device that is arranged so that the position of a principal point of the projection device optically corresponds with that of a principal point of the capture device, and captures the projection area. At least one of the projection device and the capture device has an optical system with a tilt angle, and the projection device and the capture device are arranged so that a line passing the center of an angle of field of the projection device and a line passing the center of an angle of field of the capture device correspond with each other.09-24-2009
20090237353Computer display capable of receiving wireless signals - A computer display capable of receiving wireless signals includes a computer display provided with an antenna assembly for receiving wireless digital and analog A/V signals; and a female connector provided in the display for connecting the signals received via the antenna assembly to an external demodulating device, which is connected at a male connector thereof to the female connector in the display. The signals received by the antenna assembly are demodulated and converted by the external demodulating device, and transmitted to a personal computer via a transmission device for operation. The external demodulating device and the antenna assembly have a uniform specification for use with a computer display having the same specification to demodulate wireless signals.09-24-2009
20130069865REMOTE DISPLAY - A remote display system including a portable display that wirelessly receives data and power from a primary station. The primary station, which is remote from and without a tangible connection with the portable display, includes a data transmitting element and a power transmitting element. The portable display includes a power receiving element that receives power wirelessly from the power transmitting element and a data receiving element operable to receive data from the data transmitting element.03-21-2013
20090027331METHOD AND APPARATUS FOR DISPLAYING STATE OF APPARATUS - A method of displaying a state of an apparatus having a user interface which includes generating state display information indicating the state of the apparatus and displaying the generated state display information in the user interface. The state display information indicates the state of the apparatus through a metaphorical indicator.01-29-2009
20130162513USER GESTURE RECOGNITION - An apparatus comprising: at least one processor; and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus at least to perform: determining at least one parameter dependent upon a user gesture wherein the parameter is rotation invariant, having the same value when determined at different arbitrary orientations between the apparatus and the user gesture; and using the at least one parameter to determine an action in response to the user gesture.06-27-2013
20130162518Interactive Video System - An interactive video system includes an image capturing device, for example a video camera, for capturing user motion, and a graphical display which is arranged to be altered in response to detection of user motion as captured by the image capturing device. A user interface is arranged to display a visual representation of the motion detected by the system to assist in calibrating the system in relation to a surrounding environment.06-27-2013
20130162521DEVICE AND METHOD FOR USER INTERACTION - Disclosed are a device for user interaction with a combined projector and camera and a method and a device for user interaction for recognizing an actual object to augment relevant information on a surface or a periphery of the actual object. The device for user interaction, includes: at least one projector-camera pair in which a projector and a camera are paired; a motor mounted in the projector-camera pair and configured to control a location and a direction of the projector-camera pair; and a body including a computer capable of including a wireless network and configured to provide connection with an external device, and a projection space and a photographing space of the projector-camera pair overlap each other.06-27-2013
20080303789Method and Apparatus for Compensating for Position Slip in Interface Devices - Method and apparatus for compensating for position slip in interface devices that may occur between a manipulandum and a sensor of the device due to a mechanical transmission. A device position delta is determined from a sensed position of a manipulandum of an interface device. It is determined if position slip has occurred caused by a change in position of the manipulandum that was not sensed by a sensor of the interface device, typically caused by a mechanical transmission between sensor and manipulandum. If position slip has occurred, an error in the sensed position caused by the position slip is corrected by adjusting the sensed position to take into account the position slip. The adjusted position delta is used as the position of the manipulandum and the display of objects controlled by the interface device are accordingly compensated.12-11-2008
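The position-slip abstract above describes correcting a sensed position by folding in movement the sensor missed (slip in the mechanical transmission). A minimal sketch of the bookkeeping, under the assumption that slip can be estimated per event, is below; the class and method names are hypothetical:

```python
class SlipCompensator:
    """Accumulate detected slip between a manipulandum and its sensor,
    and correct sensed positions accordingly. Illustrative sketch only;
    the patent describes the idea, not this code."""

    def __init__(self):
        self.accumulated_slip = 0.0

    def report_slip(self, slip):
        # Movement that occurred at the manipulandum but was not sensed.
        self.accumulated_slip += slip

    def corrected(self, sensed_pos):
        # Adjust the sensed position to take the slip into account.
        return sensed_pos + self.accumulated_slip
```

Display objects controlled by the device would then be driven from the corrected position rather than the raw sensed one.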
20080303787Touch Screen Apparatus And Methods - A touch screen computing apparatus, methods, and software product are provided. In one embodiment, the computing apparatus provides a plurality of regions on a touch screen that are mapped to functions. In some embodiments, the regions include a keyboard region, a game control region, a mouse region and a stylus region. In some embodiments, the regions are configurable by the processor and mapped with different functionality. This Abstract is provided for the sole purpose of complying with the Abstract requirement rules that allow a reader to quickly ascertain the subject matter of the disclosure contained herein. This Abstract is submitted with the explicit understanding that it will not be used to interpret or to limit the scope or the meaning of the claims.12-11-2008
20120112997USER INTERFACE METHOD AND APPARATUS FOR DATA FROM DATA CUBES AND PIVOT TABLES - Systems, methods, and computer readable media provide space-efficient user interfaces to data cubes and pivot table information. Because the user interfaces are more efficient in usage of display area, smaller displays can be used more effectively in reviewing such data. The user interfaces provide a multi-dimensional navigation approach among dimensions represented in the data, which allows users to more easily maintain context when reviewing large pivot table reports, and the like. Other user interface features that ease review of such reports on smaller devices also are disclosed.05-10-2012
20120112996INTERACTIVE POINTING DEVICE - An interactive pointing device having pointing function in space and game control is provided in the present disclosure. The interactive pointing device comprises an accelerometer module, a dual-axis gyroscope device, and a micro processing unit. The accelerometer module senses the movement of the operator and generates at least one axis of accelerating signal corresponding to the sensed movement. The dual-axis gyroscope device senses the rotation status of the interactive pointing device about dual axes and generates a corresponding rotating signal. The micro processing unit processes the at least one axis of accelerating signal and the rotating signal so as to interact with an electronic device accordingly. The micro processing unit is able to use the rotating signal to compensate the accelerating signal and assist an action evaluation process of the interactive pointing device operating under different operation modes.05-10-2012
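A common way a gyroscope signal can compensate an accelerometer signal, as the abstract above alludes to, is a complementary filter: the integrated gyro rate is smooth but drifts, while the accelerometer-derived angle is noisy but drift-free, so the two are blended. The sketch below is a standard textbook formulation, not the patent's specific processing, and the parameter names are illustrative:

```python
def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend the gyro-integrated angle (weight alpha) with the
    accelerometer-derived angle (weight 1 - alpha) over one time step."""
    return alpha * (angle_prev + gyro_rate * dt) + (1 - alpha) * accel_angle
```

With `alpha` near 1, short-term motion is dominated by the gyro while long-term drift is pulled back toward the accelerometer reference.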
20120112995Information Processing Apparatus, Information Processing Method, and Computer-Readable Storage Medium - A method is provided for generating a command to perform a predetermined operation. The method comprises acquiring at least a first input and a second input from among a plurality of inputs. The method further comprises determining first semantic information associated with the first input. The method also comprises determining second semantic information associated with the second input. The method also comprises generating a command to perform a predetermined operation, based on a combination of the determined first and second semantic information.05-10-2012
20120112994Interaction Techniques for Flexible Displays - The invention relates to a set of interaction techniques for obtaining input to a computer system based on methods and apparatus for detecting properties of the shape, location and orientation of flexible display surfaces, as determined through manual or gestural interactions of a user with said display surfaces. Such input may be used to alter graphical content and functionality displayed on said surfaces or some other display or computing system.05-10-2012
20120235902CONTROL SYSTEMS AND METHODS FOR HEAD-MOUNTED INFORMATION SYSTEMS - A head-mounted information system is provided, the head-mounted information system comprising a frame configured to be mounted on a head of a user, a display unit coupled to the frame, a sensor unit coupled to the frame comprising one or more motion sensors, and, a processor unit coupled to the frame and connected to receive signals from the motion sensors. The processor unit comprises a processor and a memory accessible by the processor. The processor unit is configured to monitor the received signals and enter a gesture control mode upon detection of a gesture control enable signal. In the gesture control mode the processor is configured to convert signals received from the motion sensors into menu navigation commands.09-20-2012
20120235901SYSTEM AND METHOD FOR CONTROL BASED ON FACE OR HAND GESTURE DETECTION - System and method for control using face detection or hand gesture detection algorithms in a captured image. Based on the existence of a detected human face or a hand gesture in an image captured by a digital camera, a control signal is generated and provided to a device. The control may provide power or disconnect power supply to the device (or part of the device circuits). The location of the detected face in the image may be used to rotate a display screen to achieve a better line of sight with a viewing person. The difference between the location of the detected face and an optimum location is the error to be corrected by rotating the display to the required angular position. A hand gesture detection can be used as a replacement to a remote control for the controlled unit, such as a television set.09-20-2012
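The abstract above treats the offset between the detected face location and an optimum (typically the frame center) as the angular error to correct by rotating the display. Assuming a simple linear pixel-to-angle mapping across the camera's field of view, that error can be sketched as follows; the function and its assumptions are illustrative, not the patent's algorithm:

```python
def rotation_error(face_x, frame_width, fov_deg):
    """Angular error (degrees) between a detected face's horizontal
    pixel position and the frame centre, assuming the camera's
    horizontal field of view maps linearly onto frame_width pixels."""
    offset = face_x - frame_width / 2  # pixels off-centre
    return offset / frame_width * fov_deg
```

A positive result would drive the display one way, a negative result the other, until the face sits at the optimum location and the error is zero.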
20120235900SEE-THROUGH NEAR-EYE DISPLAY GLASSES WITH A FAST RESPONSE PHOTOCHROMIC FILM SYSTEM FOR QUICK TRANSITION FROM DARK TO CLEAR - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content wherein the optical assembly comprises a photochromic layer and a heater layer disposed on a see-through lens of the optical assembly, wherein the photochromic layer is heated by the heater layer to accelerate its transition from dark to clear.09-20-2012
20120235899APPARATUS, SYSTEM, AND METHOD FOR CONTROLLING VIRTUAL OBJECT - An apparatus, system, and method for controlling a virtual object. The virtual object is controlled by detecting a hand motion of a user and generating an event corresponding to the hand motion. Accordingly, the user may control the virtual object displayed on a 3-dimensional graphic user interface (3D GUI) more intuitively and efficiently.09-20-2012
20120235897INFORMATION PROCESSING APPARATUS, AND CONTROL METHOD AND PROGRAM THEREFOR - One of the aspects of the disclosure is directed to displaying an image according to attribute information thereof in a display area having a time axis based on an item of predetermined attribute information, and when changing a display range on the time axis, allowing a user to easily designate a point to be a reference thereof. An information processing apparatus according to the present invention displays an image in the display area having the time axis according to date and time information of the image. The information processing apparatus moves a mouse cursor on the display area according to a user's operation, and sets a reference after the display range is changed.09-20-2012
20120235896BLUETOOTH OR OTHER WIRELESS INTERFACE WITH POWER MANAGEMENT FOR HEAD MOUNTED DISPLAY - A Head Mounted Display (HMD) system that includes a wireless front end that interprets spoken commands and/or hand motions and/or body gestures to selectively activate subsystem components only as needed to carry out specific commands.09-20-2012
20120235895CONTENT OUTPUT CONTROL DEVICE AND CONTENT OUTPUT CONTROL METHOD - When a content output apparatus (09-20-2012
20120235894SYSTEM AND METHOD FOR FOLDABLE DISPLAY - As described herein, methods, apparatus and computer program products are provided to display visual information on a foldable display device. Display control signals are altered or modified to avoid display of, or compensate for, impairment in the display of visual information on fold deformations introduced in a display unit of the display device in response to folding of the display unit, or to reverse any alterations or modifications in the event that the fold deformations are eliminated by unfolding of the display unit.09-20-2012
20120235893SYSTEM AND METHOD FOR BENDABLE DISPLAY - Systems, methods and apparatus are described for displaying visual information on a deformable display device, and for compensating for distortion of images of the visual information that results from the deformation of the display device and the viewing orientation of a viewer of the display device, or for improving or enhancing the displayed visual information in response to the deformation of the display device and the viewing position of the viewer.09-20-2012
20120235892TOUCHLESS INTERACTIVE DISPLAY SYSTEM - A touchless interactive display system includes a display with a display area bounding the display. A reflective surface is located along an edge of the display. One optical sensor opposes and faces the reflective surface so that the optical sensor has a primary, non-reflected field of view and a secondary, reflected field of view that is reflected back from the reflective surface. The primary field of view covers a first portion of the display area that is less than the whole display area, and the reflected field of view covers a second portion of the display area, such that the first and second portions of the display area cover the entire display area. The optical sensor and a processor are operable to detect an object placed within at least one of its first and second fields of view without having the object touch the display.09-20-2012
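In the touchless display system above, an object seen in the secondary, reflected field of view appears beyond the mirror line of the reflective surface; recovering its true position amounts to folding the detected point back across that line. A geometric sketch of this unfolding, with hypothetical names and a mirror assumed parallel to the x-axis, is:

```python
def unfold_reflection(x, y, mirror_y):
    """Map a point detected in the reflected field of view back onto
    the display area by mirroring it across the reflective surface
    located at y = mirror_y (assumed parallel to the x-axis)."""
    return (x, 2 * mirror_y - y)
```

Combining points from the primary view (used as-is) with unfolded points from the reflected view is what lets a single sensor cover the entire display area.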
20090153470ELECTRONIC DEVICE WITH MODULE INTEGRATING DISPLAY UNIT AND INPUT UNIT - An electronic device includes a housing, a main display unit, an auxiliary display module and a processor. The main display unit and the auxiliary display module are both disposed on the housing. The auxiliary display module includes an auxiliary display unit and an input unit. The auxiliary display unit is used for displaying a user interface, and the input unit is used for receiving a user input. The processor is coupled to the input unit, and performs a corresponding function according to the user input.06-18-2009
20130162514GESTURE MODE SELECTION - An apparatus, system, and method are disclosed for gesture mode selection. An apparatus for gesture mode selection includes a detection module, a gesture mode module, and a gesture recognition module. The detection module detects a triggering event. The gesture mode module sets a gesture mode from an idle mode to an enhanced mode based on the detection of the triggering event. The gesture recognition module processes data from a non-contact input device to detect gestures according to the gesture mode set by the gesture mode module.06-27-2013
20130162517Gesturing Architecture Using Proximity Sensing - Proximity based system and method for detecting user gestures. Each of a plurality of proximity sensing circuits may collect digital data. Each proximity sensing circuit may include an antenna configured to transmit and receive electromagnetic signals and a shield driver configured to shield signals transmitted by the antenna in one or more directions. The digital data may be collected based on electromagnetic signals received from another proximity sensing circuit via the antenna. The received electromagnetic signals may be modified by one or more user proximity gestures. The digital data from each of the plurality of proximity sensing circuits may be received by a coordinating circuit. The coordinating circuit may produce coordinated digital data from the digital data received from each of the plurality of proximity sensing circuits. The coordinated digital data may be configured for use in determining that a user performed the one or more user proximity gestures.06-27-2013
20130162522METHOD AND APPARATUS FOR CONTROLLING POWER OF DISPLAY DEVICE USING FRONTAL FACE DETECTOR - Disclosed are a method and an apparatus for controlling power of a display device using a frontal face detecting device. The method for controlling power of a display device includes: determining whether a face of a user is present in front of the display device; confirming a power state of the display device; and repeating the determination according to the detected presence of the face and the confirmed power state, in order to control power of the display device. The method intelligently recognizes the user's purpose in using the display device, thereby managing power of the display device and controlling its power without requiring a separate action by the user.06-27-2013
20130162524ELECTRONIC DEVICE AND METHOD FOR OFFERING SERVICES ACCORDING TO USER FACIAL EXPRESSIONS - A method for offering services according to facial expressions is provided. The method has an electronic device storing a service database recording at least one user's information. The method activates an offering service function; captures facial expressions of the user; extracting the features of the facial expressions; compares the extracted features with the features in images of the facial expressions stored in the service database, so as to identify a corresponding feature stored in the service database, and determines the type of expression and the service corresponding thereto from images of the user stored in the service database; and activates and provides the determined service. An electronic device using the method is also provided.06-27-2013
20130162526APPARATUS AND METHOD FOR REDUCING CURRENT CONSUMPTION IN PORTABLE TERMINAL WITH FLEXIBLE DISPLAY - A portable device with a flexible display and a method for displaying information in the portable device with a flexible display are provided. The portable device includes a sensor for detecting at least one of a user's face and an ambient brightness, the flexible display including a first area and a second area, and a controller for displaying information in the first area when at least one of the user's face is detected and the ambient brightness has a value greater than a threshold, and for displaying the information in the second area when at least one of the user's face is not detected and the ambient brightness has a value less than the threshold.06-27-2013
20130162527INTERACTION WITH PORTABLE DEVICES - A portable electronic device comprises means for generating or receiving a signal indicating that it has been placed in a predetermined position. The portable electronic device is configured upon receiving said signal to enable user interaction with the portable electronic device by a touchless interaction mode. Also disclosed are various arrangements of a peripheral device and a portable electronic device which cooperate to provide a touchless interaction mode—e.g. through transducers and/or processing means being provided in the peripheral device.06-27-2013
20130207890METHODS DEVICES AND SYSTEMS FOR CREATING CONTROL SIGNALS - An interface comprising a hand operated input device with a series of activation points activated by the digits (fingers and/or thumb) of a user; a sensor component measuring a current motion, orientation, and/or position of the input device; and an output component interconnected to the activation points and the sensor component for outputting in a series the currently active activation points and the current motion, orientation, and/or position of the input device.08-15-2013
20130207889System and Method of Biomechanical Posture Detection and Feedback Including Sensor Normalization - A system and method are described herein for a sensor device which biomechanically detects in real-time a user's movement state and posture and then provides real-time feedback to the user based on the user's real-time posture. The feedback is provided through immediate sensory feedback through the sensor device (e.g., a sound or vibration) as well as through an avatar within an associated application with which the sensor device communicates. The sensor device detects the user's movement state and posture by capturing data from a tri-axial accelerometer in the sensor device. Streamed data from the accelerometer is normalized to correct for sensor errors as well as variations in sensor placement and orientation. Normalization is based on accelerometer data collected while the user is wearing the device and performing specific actions.08-15-2013
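The posture-detection abstract above normalizes streamed accelerometer data against samples collected while the user performs specific actions, to correct for sensor error, placement, and orientation. One simple form of such normalization is a per-axis offset estimated from samples taken in a known posture; the sketch below assumes that form, and every name in it is hypothetical rather than the patent's implementation:

```python
def calibrate_offset(samples_at_rest, gravity=(0.0, 0.0, 1.0)):
    """Estimate a per-axis offset from (x, y, z) samples collected
    while the wearer holds a known posture, where only gravity
    (expected reading `gravity`, in g) should be measured."""
    n = len(samples_at_rest)
    mean = tuple(sum(s[i] for s in samples_at_rest) / n for i in range(3))
    return tuple(mean[i] - gravity[i] for i in range(3))

def normalize(sample, offset):
    """Correct a streamed sample using the calibrated offset."""
    return tuple(sample[i] - offset[i] for i in range(3))
```

After calibration, streamed samples from differently placed or slightly miscalibrated sensors become comparable, which is what posture classification downstream relies on.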
20130207888METHOD AND APPARATUS FOR PRESENTING AN OPTION - A method of presenting an option comprises: calculating a movement range of an object; calculating an area of a display device based on the movement range; and presenting at least one option in the area of the display device.08-15-2013
20130207887HEADS-UP DISPLAY INCLUDING EYE TRACKING - Embodiments of an apparatus comprising a light guide including a proximal end, a distal end, a display positioned near the proximal end, an eye-tracking camera positioned at or near the proximal end to image eye-tracking radiation, a proximal optical element positioned in the light guide near the proximal end and a distal optical element positioned in the light guide near the distal end. The proximal optical element is optically coupled to the display, the eye-tracking camera and the distal optical element and the distal optical element is optically coupled to the proximal optical element, the ambient input region and the input/output region. Other embodiments are disclosed and claimed.08-15-2013
20130207886VIRTUAL-PHYSICAL ENVIRONMENTAL SIMULATION APPARATUS - A reactive virtual-physical perception suit apparatus, adapted for interactivity between a virtual environment and a physical environment, is disclosed. A reactive virtual-physical perception circuit apparatus is adapted to process information transformation matrices between the virtual environment and the physical environment. The reactive virtual-physical perception suit apparatus is adapted for training environment emulations.08-15-2013
20130207885Motion Controlled Image Creation and/or Editing - Methods, apparatus, and computer readable media for creating and/or editing images are disclosed. An electronic tablet device may include a display, a touch sensor, a motion sensor, a controller, and a speaker. The motion sensor may generate input signals indicative of a spatial movement of the electronic tablet device. The controller may receive the input signals and generate output signals that move a drawing tool across a canvas of the display based on the spatial movement of the electronic tablet device. The output signals may further update the canvas to reflect an effect of moving the drawing tool across the canvas per the tilting movement.08-15-2013
20120268367Method and Apparatus for Communication Between Humans and Devices - This invention relates to methods and apparatus for improving communications between humans and devices. The invention provides a method of modulating operation of a device, comprising: providing an attentive user interface for obtaining information about an attentive state of a user; and modulating operation of a device on the basis of the obtained information, wherein the operation that is modulated is initiated by the device. Preferably, the information about the user's attentive state is eye contact of the user with the device that is sensed by the attentive user interface.10-25-2012
20120268361HAND-HELD ELECTRONIC DEVICE AND OPERATION METHOD APPLICABLE THERETO - An operation method, applicable to a hand-held electronic device having a display unit and a social networking share hardware button, includes: detecting whether the social networking share hardware button is triggered; and in response to the social networking share hardware button being triggered, posting user share content on a social networking service based on content displayed on the display unit.10-25-2012
20110025602METHOD AND DEVICE FOR TACTILE PRESENTATION - A method for a tactile presentation of perceivable content. The method comprises receiving data representing perceivable content, selecting a plurality of electric currents according to the data, each electric current being associated with at least one of a plurality of regions of a solution having a plurality of macromolecules, and changing a level of acidity (pH) of at least one of the plurality of regions by applying the respective electric current thereto. A proton concentration in the plurality of regions tactilely presents the perceivable content.02-03-2011
20110025601Virtual Controller For Visual Displays - Virtual controllers for visual displays are described. In one implementation, a camera captures an image of hands against a background. The image is segmented into hand areas and background areas. Various hand and finger gestures isolate parts of the background into independent areas, which are then assigned control parameters for manipulating the visual display. Multiple control parameters can be associated with attributes of multiple independent areas formed by two hands, for advanced control including simultaneous functions of clicking, selecting, executing, horizontal movement, vertical movement, scrolling, dragging, rotational movement, zooming, maximizing, minimizing, executing file functions, and executing menu choices.02-03-2011
20110025598Spatial, Multi-Modal Control Device For Use With Spatial Operating System - A system comprises an input device that includes a detector coupled to a processor. The detector detects an orientation of the input device. The input device has multiple modal orientations corresponding to the detected orientation. The modal orientations correspond to multiple input modes of a gestural control system. The detector is coupled to the gestural control system and automatically controls selection of an input mode in response to the orientation.02-03-2011
20110025597Digital display device and image arrangement method using the same - A digital display device that implements a digital photo frame function. The digital display device includes a user input unit to receive control signals corresponding to a user operation, the control signals to control a plurality of images to be continuously displayed, a display controller including an image arrangement unit, the display controller to output display signals by storing image files or processing the stored image files according to the control signals, a storage unit having image files stored therein and a display unit to display the plurality of images on a screen according to the display signals, the image arrangement unit to arrange the plurality of images in an arrangement order for display based on comparison results that compare data for pairs of images selected from the plurality of images upon the control signals being input.02-03-2011
20100182229TERMINAL, DISPLAY APPARATUS, AND METHOD FOR TRANSMITTING AND RECEIVING DATA THEREOF - A terminal, a display apparatus, and a method of transceiving data are provided. The terminal transmits or receives data corresponding to a specific point to which light is emitted through a communication unit if a data transmission command or a data receiving command is input. Accordingly, the display apparatus and the terminal transceive data by directly communicating with each other.07-22-2010
20110069005DISPLAY APPARATUS - A display apparatus is provided. A frame is fixed and supported at the rear of a front panel forming the front portion of the display apparatus, and a separate bracket member is not mounted on the edges of the front panel. Thus, the front exterior of the display apparatus is neatly finished, and the display screen looks bigger than it actually is.03-24-2011
20120169590INFORMATION-PROCESSING SYSTEM, INFORMATION-PROCESSING APPARATUS, AND INFORMATION-PROCESSING METHOD - In an information-processing system shown in FIG. 07-05-2012
20100295770CONTROL METHOD FOR CONTROLLING REMOTE COMPUTER - The present invention relates to a control method of efficiently controlling a computer at a remote place using a limited input/output device and a remote communication terminal with limited memory capacity even in a communication network environment where the data transmission rate is limited and the transmission cost is high. The control method in accordance with the present invention includes an input method optimized for a limited input device of a terminal, a screen display method optimized for a small screen of a terminal, and a screen data transmission function optimized for a communication network speed, a transmission cost, and a limited memory capacity of a terminal.11-25-2010
20120075176METHOD AND APPARATUS OF RECOGNIZING GESTURE WITH UNTOUCHED WAY - Provided is a method of recognizing a gesture, which includes: storing sensing information of a sensor in a case where the sensing information is obtained by sensing an object within a preset distance from the sensor; and recognizing a gesture from the stored sensing information, wherein said storing of sensing information stores the sensing information obtained by the sensor during a preset time after the sensor senses an object within the preset distance. This method allows a terminal and contents to be controlled by recognizing a gesture of a user even though the user does not touch the terminal screen.03-29-2012
20120319946METHOD AND SYSTEM FOR THREE DIMENSIONAL INTERACTION OF A SUBJECT - Method, computer program and system for tracking movement of a subject. The method includes receiving data from a distributed network of camera sensors employing one or more emitted light sources associated with one or more of the one or more camera sensors to generate a volumetric three-dimensional representation of the subject, identifying a plurality of clusters within the volumetric three-dimensional representation that correspond to motion features indicative of movement of the motion features of the subject, presenting one or more objects on one or more three dimensional display screens, and using the plurality of fixed position sensors to track motion of the motion features of the subject and track manipulation of the motion features of the volumetric three-dimensional representation to determine interaction of one or more of the motion features of the subject and one or more of the one or more objects on the three dimensional display.12-20-2012
20110157001METHOD AND APPARATUS FOR DISPLAY FRAMEBUFFER PROCESSING - Various methods for display framebuffer processing are provided. One example method includes determining, via a processor, that update criteria associated with a display region have been satisfied, and comparing current frame data for the display region to subsequent frame data for the display region to determine frame data changes associated with the display region. In this regard, the comparing is performed in response to the update criteria being satisfied. The example method may also include facilitating presentation of the frame data changes within the display region on a display. Similar and related example methods and example apparatuses are also provided.06-30-2011
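The comparison step in this abstract (diffing current against subsequent frame data for a region, gated by update criteria) can be sketched roughly as follows; the list-based buffer representation and the criteria flag are assumptions made for illustration.

```python
def region_changes(current, subsequent):
    """Compare two equal-size frame buffers for one display region and
    return (index, new_value) pairs that differ -- only these changed
    pixels need to be presented on the display."""
    return [(i, b) for i, (a, b) in enumerate(zip(current, subsequent)) if a != b]

def update_region(current, subsequent, criteria_met):
    """The comparison is performed only when the update criteria
    (e.g. a timer or a damage flag, hypothetical here) are satisfied."""
    if not criteria_met:
        return []
    return region_changes(current, subsequent)
```

Pushing only the returned deltas, rather than the full buffer, is the bandwidth saving the abstract is driving at.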
20100039379Enhanced Multi-Touch Detection - Enhanced multi-touch detection, in which a graphical user interface for an application is projected onto a surface, and electromagnetic radiation is emitted. The electromagnetic radiation is collectively emitted by an array defining a layer aligned parallel with the surface and overlapping at least a region of the surface onto which the graphical user interface is projected. Electromagnetic radiation is detected that reflects off of an object interrupting the defined layer where the defined layer overlaps the region of the surface onto which the graphical user interface is projected, and an indication of the position of the object is output.02-18-2010
20120032877Motion Driven Gestures For Customization In Augmented Reality Applications - A motion-driven user interface for mobile device-based augmented reality applications is described which provides a user with the ability to execute user interface input commands by physically manipulating the mobile device in space. The mobile device uses embedded sensors to identify the type and extent of the manipulation which cause execution of a corresponding user interface input command which can vary depending upon the operating context of the mobile device.02-09-2012
20120032876MEGA COMMUNICATION AND MEDIA APPARATUS CONFIGURED TO PROVIDE FASTER DATA TRANSMISSION SPEED AND TO GENERATE ELECTRICAL ENERGY - Disclosed embodiments comprise a communication apparatus operatively configured with CMOS multiple antennas disposed on a chip for boosting communication signals and enabling faster data transmission speed, and to provide an interactive user interface. The communication apparatus is further configured to convert sound waves, vibrations, solar energy, wind force and pressure force into electrical energy communicable to a battery cell. Disclosed embodiments encompass three modes of communications: the cell phone, wireless Internet applications, and global communication and media information. Embodiments provide a communication apparatus operable to enhance mobile communication efficiency with a touch sensitive display and provide an energy harvesting platform on at least the housing for the apparatus and/or the circuit board configured with memories, processors, and modules. Embodiments provide advanced computing and media applications, including in-vehicle interactive communications and wireless Internet applications. Embodiments further provide a gaming device and a wireless media device configured with touch pads comprising sensors embedded in a silicon substrate and fused in micro fiber material having excellent electrical characteristics. Certain embodiments provide a communication apparatus configured for voice-enabled applications operable to convert text into voice and/or voice into text.02-09-2012
20100013758HUMAN INTERFACE DEVICE (HID) - A user may wear the radio frequency human interface device on a body portion and move the body portion over any even and un-even surface that is not touch sensitive to provide inputs. The radio frequency human interface device may sense, encode, and provide the radio frequency signals to a computing system. The computing system may be provisioned with a radio frequency reader that may receive the radio frequency signal and decode the radio frequency signal before responding to the input. Also, a plurality of users may use radio frequency human interface devices to provide inputs to the computing system concurrently.01-21-2010
20110141011METHOD OF PERFORMING A GAZE-BASED INTERACTION BETWEEN A USER AND AN INTERACTIVE DISPLAY SYSTEM - The invention describes a method of performing a gaze-based interaction between a user (06-16-2011
20110279362DISPLAY DEVICE AND DISPLAY METHOD - A display device which can provide an image showing the status of a past presentation more flexibly is provided. A display device includes an image generating section generating an instruction image which reflects an instruction content based on presentation data obtained by relating image data showing a displayed image to display time data showing a display time of the displayed image and based on instruction information showing the instruction content, a display section displaying the instruction image, and an updating section updating the presentation data based on the instruction. The image generating section generates a reproduction target time specifying image including a time region which changes as time passes and a specifying region which moves on the time region according to an instruction position and shows a reproduction target time, and the display section displays the reproduction target time specifying image.11-17-2011
20130009859STEREOSCOPIC IMAGE DISPLAY DEVICE AND DRIVING METHOD THEREOF - A stereoscopic image display device and a driving method thereof, which correct the viewing position of a viewer in initial driving of a 3D display mode, are discussed. The stereoscopic image display device includes a display module, a barrier module, a position detector, and a position corrector. The display module separates a left-eye image and a right-eye image to display a stereoscopic image. The barrier module is disposed in correspondence with the display module, and forms a light transmitting area for transmitting the left-eye image and right-eye image and a light blocking area for blocking the left-eye image and right-eye image. The position detector detects position information on a viewer which views the stereoscopic image displayed on the display module. The position corrector corrects positions of the light transmitting area and light blocking area on the basis of viewing position information on the viewer.01-10-2013
20110279360IMAGE DISPLAY UNIT AND IMAGE FORMING APPARATUS INCLUDING THE SAME - An image display unit capable of displaying image data page-wise includes: a scanner portion; a display panel for displaying input image data in preview representation; an input condition determiner that compares the input conditions of the image data successively input through the scanner portion and determines whether there is any change in the input condition; and a display controller that, when the input condition determiner determines that the input image data has changed in the input condition, displays the image data that was determined to have changed in the input condition together with the image data input immediately before it on the display panel.11-17-2011
20110279363IMAGE FORMING APPARATUS AND DISPLAY CONSOLE DISPLAYING PREVIEW IMAGE - In order to provide a technique allowing easy confirmation of image contents even if the number of images increases, a display console includes a display device having an image displaying function and a display control unit controlling the display by dividing a display screen of the display device into an image preview area and another area. The display control unit switches, in accordance with a user instruction, between a fit-to-screen screen image in which area ratio between the preview area and another area has a first value, and a finish preview screen image, an image edition mode screen image, or a document display mode screen image, in which the size of another area is made smaller and the size of preview area is made larger than in the fit-to-screen screen image.11-17-2011
20110279361OPTICAL DETECTION DEVICE, DISPLAY DEVICE, AND ELECTRONIC APPARATUS - An optical detection device includes: a light source unit that emits source light; a curve-shaped light guide that includes: a light incident surface to which the source light is incident, the light incident surface being located in an end portion of the light guide; and a convex surface from which the source light received by the light incident surface is output; an emitting direction setting unit that receives the source light output from the convex surface of the light guide and sets an emitting direction of emitting light to a direction of a normal line of the convex surface; a light receiving unit that receives reflection light acquired by reflecting the emitting light off an object; and a detection unit that detects at least a direction in which the object is located based on the light reception in the light receiving unit.11-17-2011
20130009864METHOD AND APPARATUS FOR INTERFACING BETWEEN EXTERNAL DEVICE AND MOBILE DEVICE - An apparatus for interfacing between a mobile device and an external device is provided. The apparatus includes a mobile device which includes image data for the external device, and which transmits to the external device an image signal corresponding to the image data according to a selection, and an external device which displays the image data corresponding to the image signal received from the mobile device, which generates an input signal according to a user's input, and which transmits the generated input signal to the mobile device, wherein the image signal of the mobile device and the input signal of the external device are transmitted and received through a single interface used for a connection between the mobile device and the external device.01-10-2013
20130009867METHOD AND APPARATUS FOR DISPLAYING VIEW MODE USING FACE RECOGNITION - A method for displaying screen data according to determination of a view mode in a portable terminal, and an apparatus thereof, are provided. The method includes detecting an orientation change event of the portable terminal in a displayed state of the screen data, turning on a camera module when the orientation change event is detected, determining an orientation of the eyes of a user through face detection from an image captured by the camera module, determining a view mode of the portable terminal according to an orientation of the portable terminal and the orientation of the eyes of the user, and displaying screen data according to the determined view mode.01-10-2013
20130009869System and Method for Image Processing using Multi-touch Gestures - Various embodiments of a system and methods for processing digital images using multi-touch gestures are described. A multi-touch gestural input set which comprises a plurality of touch gestures may be applied to a display of an image. The gestural input set may include different gesture types, such as mobile and stationary gestures. Each gesture type may indicate a different image processing constraint that may be applied to modify the digital image. Stationary gestures may indicate constrained regions of the image that are not subject to modification. Mobile gestures may indicate regions of the image which may be subject to modification. Characteristics of the mobile gestures, such as velocity and/or pressure, may also indicate an amount by which an image may be modified over the region indicated by the mobile gesture. Image masks, which separate foreground and background regions of an image, may also be specified by the gestural input set.01-10-2013
20090027332MOTOR VEHICLE COCKPIT - A motor vehicle cockpit display system includes display units for displaying information arranged at different positions in a passenger compartment of the motor vehicle. A control arrangement controls the content of the information which is respectively displayed on the display units. A recording device senses in a contactless fashion an assignment of a user's limb to a first display unit and a gesture-dependent change in the assignment to a further display unit. Information relating to the change in the assignment is fed to the control arrangement, and the further display unit can be actuated to display the information of the first display unit by the control arrangement in accordance with the change in the assignment.01-29-2009
20100171691VIEWING IMAGES WITH TILT CONTROL ON A HAND-HELD DEVICE - A user interface is provided, suitable for use in cellular phones, personal digital assistants (PDAs), PC tablets, laptops, PCs, office equipment, medical equipment, or any other hand-held electronic device, that allows control of the image on the device display by moving the device: tilting the device changes the view in perspective, the magnification, or both concurrently. Thus, the tilt of the device controls the angle of view of the image, and moving the device perpendicular to the screen controls the magnification.07-08-2010
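The tilt-to-view-angle and motion-to-magnification mapping described in this abstract might look like the sketch below; the gains, clamps, and function name are invented for illustration and are not taken from the patent.

```python
def view_from_motion(tilt_deg, z_displacement_mm,
                     max_angle=45.0, zoom_per_mm=0.02):
    """Map device tilt to a clamped view angle, and motion perpendicular
    to the screen to a magnification factor (all gains illustrative)."""
    # Clamp the tilt so extreme device rotation saturates the view angle.
    angle = max(-max_angle, min(max_angle, tilt_deg))
    # Moving the device away from the user zooms in; floor keeps the
    # magnification from collapsing to zero.
    magnification = max(0.25, 1.0 + zoom_per_mm * z_displacement_mm)
    return angle, magnification
```

A 60-degree tilt saturates at the 45-degree clamp, while 50 mm of perpendicular motion doubles the magnification under these example gains.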
20130021237OPTICAL REMOTE CONTROL SYSTEM - An optical remote control system includes a home appliance and a remote controller. The home appliance operates according to a user command and its housing includes an opening. A status indicator light and a reference light are disposed within the opening. The status indicator light includes a visible light source, and the reference light includes a plurality of infrared light sources. The visible light source and the infrared light sources are disposed in a predetermined pattern. The remote controller includes an optical sensor configured to detect optical signals from the infrared light sources, thereby generating the user command accordingly.01-24-2013
20130135198Electronic Devices With Gaze Detection Capabilities - An electronic device may have gaze detection capabilities that allow the device to detect when a user is looking at the device. The electronic device may implement a power management scheme using the results of gaze detection operations. When the device detects that the user has looked away from the device, the device may dim a display screen and may perform other suitable actions. The device may pause a video playback operation when the device detects that the user has looked away from the device. The device may resume the video playback operation when the device detects that the user is looking towards the device. Gaze detector circuitry may be powered down when sensor data indicates that gaze detection readings will not be reliable or are not needed.05-30-2013
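The power-management behavior this abstract describes (dim and pause on look-away, restore and resume on gaze return) is essentially a small state machine. A toy sketch, with the class, action names, and state flags invented for illustration:

```python
class GazePowerManager:
    """Dim the screen and pause playback when the user looks away;
    restore both when gaze returns. Actions are returned as strings
    rather than performed, to keep the sketch self-contained."""
    def __init__(self):
        self.dimmed = False
        self.paused = False

    def on_gaze(self, user_is_looking, video_playing):
        actions = []
        if not user_is_looking:
            if not self.dimmed:
                self.dimmed = True
                actions.append("dim_display")
            if video_playing and not self.paused:
                self.paused = True
                actions.append("pause_video")
        else:
            if self.dimmed:
                self.dimmed = False
                actions.append("restore_brightness")
            if self.paused:
                self.paused = False
                actions.append("resume_video")
        return actions
```

Tracking the dimmed/paused flags makes each transition fire once per look-away, rather than on every sensor sample.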
20110279364Information Display Apparatus with Proximity Detection Performance and Information Display Method Using the Same - An information display apparatus with proximity detection performance contains a display device that displays image information, a sensor constituted of plural detection electrodes, and a detection-resolution adjusting device that adjusts the detection resolution based on a distance between the sensor and an object approaching any one of the detection electrodes.11-17-2011
20090309830CONTROL APPARATUS, INPUT APPARATUS, CONTROL SYSTEM, HANDHELD INFORMATION PROCESSING APPARATUS, CONTROL METHOD, AND PROGRAM THEREFOR - A control apparatus, an input apparatus, a control system, a control method, and a program therefor that are capable of improving operability when a user operates a GUI displayed on a screen by a pointer using the input apparatus are provided. An MPU of a control apparatus sets weighting factors for each region sectioning the screen. The MPU multiplies the weighting factors by the corresponding displacement amounts to independently calculate displacement amounts of a pointer on the screen. Accordingly, a movement direction of the pointer can be biased in a predetermined direction. Thus, when a user operates an input apparatus to select an icon aligned in a 1-dimensional direction on the screen, for example, an operation of the pointer can be restricted to that 1-dimensional direction. Therefore, the user can easily select the icon, thus improving operability of the pointer.12-17-2009
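The weighting scheme in this abstract is concrete enough to sketch: each screen region carries a pair of weighting factors that multiply the raw pointer displacement, so a region can constrain the pointer to one axis. The dictionary-of-regions representation and the region names are assumptions for illustration, not the patent's data structures.

```python
def biased_displacement(dx, dy, region_weights, region):
    """Scale raw pointer displacement by per-region weighting factors.
    Weights like (1.0, 0.0) restrict movement inside that region to the
    horizontal axis, e.g. for icons aligned in a 1-dimensional row."""
    wx, wy = region_weights[region]
    return dx * wx, dy * wy
```

For example, inside a hypothetical horizontal menu region the vertical component of motion is suppressed entirely, while a free-canvas region passes displacement through unchanged.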
20090309827METHOD AND APPARATUS FOR AUTHORING TACTILE INFORMATION, AND COMPUTER READABLE MEDIUM INCLUDING THE METHOD - The present invention relates to a method and apparatus for authoring tactile information that generates a tactile video for representing tactile information in the form of an intensity value of a pixel in order to author tactile information. More particularly, the present invention relates to a method and apparatus for authoring tactile information that represents an intensity value of a pixel of a tactile video in a drawing manner on a tactile video input window while outputting and referring to audiovisual media. The present invention provides an apparatus for authoring a tactile video that represents information about driving strength of an actuator array of a tactile display apparatus in the form of an intensity value of a pixel. The apparatus includes a tactile video generating unit that includes a configuration module and a tactile video authoring module. The configuration module performs configuration to author a tactile video. The tactile video authoring module includes a video clip playback window that outputs information about audiovisual media, such as a video clip or a text, becoming a base of authoring the tactile video by frames, and a tactile video input window to which an intensity value of each of the pixels of the tactile video is input in a drawing manner. The tactile video is generated by frames.12-17-2009
20090309826Systems and devices - The present disclosure relates to systems and devices that may be configured to facilitate content projection.12-17-2009
20100245234Portable Electronic Device with Low Dexterity Requirement Input Means - A handheld-size electronic device is provided with a display screen, or front housing or bezel surrounding the display screen, that is moveably mounted to the remainder of the electronic device. Detectors are provided for detecting movement of the display screen, or surrounding front housing part, and a microprocessor coupled to the detectors controls the electronic device based on that movement. This mode of user input allows users to control the electronic device with a quick hand motion and without requiring the level of focus that is typically required to operate the small buttons of a wireless device. This mode of user input can be used for various purposes including, for example, controlling digital music playback.09-30-2010
20110279359SYSTEMS AND METHODS FOR MONITORING MOTION SENSOR SIGNALS AND ADJUSTING INTERACTION MODES - Described herein are systems and methods for recognizing when a user of an interactive application is frustrated and for responding to the user's frustration by changing an interaction mode. Signals arising from motion sensors included in user equipment are monitored for patterns indicative of user frustration. In response to detecting a frustration pattern in a motion sensor signal, an interactive application display is changed.11-17-2011
20120086631SYSTEM FOR ENABLING A HANDHELD DEVICE TO CAPTURE VIDEO OF AN INTERACTIVE APPLICATION - Methods and systems for enabling a handheld device to capture video of an interactive session of an interactive application presented on a main display are provided. An interactive session of the interactive application defines interactivity between a user and the interactive application. An initial position and orientation of a handheld device operated by a spectator are determined. A current state of the interactive application based on the interactivity between the user and the interactive application is determined. The position and orientation of the handheld device are tracked during the interactive session. A spectator video stream of the interactive session based on the current state of the interactive application and the tracked position and orientation of the handheld device is generated. The spectator video stream is rendered on a handheld display of the handheld device.04-12-2012
20120086634MULTI-DIRECTION INPUT DEVICE - Provided is a multi-direction input device. The multi-direction input device includes a manipulation member moved in multiple directions by a user's manipulation, a hinge part disposed to surround at least one side of the manipulation member, the hinge part returning the manipulation member to its initial position by elasticity, and a fixing member configured to fix at least one portion of the hinge part.04-12-2012
20080309616Alertness testing method and apparatus - A method and apparatus for detecting the alertness of an equipment operator by displaying a moving icon, and asking the operator to track the movements of the icon, either by following it with the eyes in a head mounted display, or by following it with a finger on a touch screen. The operator's performance can be measured by tracking the gaze of the operator's eyes, or by tracking the operator's finger movements. The performance of the operator can be compared to that particular person's history of test results, or to a data base of test results of other operators. The characteristics of the icon can be varied, and distractions can be provided on the display or screen. Control of the display or screen, tracking of the operator's eyes or finger, and analysis of the test results, can all be performed by a computer.12-18-2008
20110298705THREE-DIMENSIONAL INPUT CONTROL DEVICE - Some embodiments provide force input control devices for sensing vector forces comprising: a sensor die comprising: a rigid island, an elastic element coupled to the rigid island, a die frame coupled to a periphery of the elastic element, one or more stress sensitive components on the elastic element, and a signal processing IC, where the sensor die is sensitive to a magnitude and a direction of a force applied to the rigid island within the sensor die, where the sensor die is coupled electrically and mechanically to a substrate; and a spring element coupling an external button, where the force is applied, to the rigid island, wherein the spring element has a flat geometry and is located in a plane parallel to a plane of the substrate, where the spring element is configured to translate a deflection of the button into an allowable force applied to the rigid island.12-08-2011
20100053076Display control based on bendable interface containing electronic device conformation sequence status - A system includes, but is not limited to: one or more display control modules configured to direct controlling display of one or more portions of the bendable interface containing electronic device regarding display of second information in response to the information associated with the one or more changes in one or more sequences of two or more conformations of the one or more portions of the one or more regions of the bendable interface containing electronic device. In addition to the foregoing, other related system/system aspects are described in the claims, drawings, and text forming a part of the present disclosure.03-04-2010
20120105317MOBILE ELECTRONIC DEVICE - According to an aspect, a mobile electronic device includes a first display unit, a second display unit, an input unit, and a control unit. The first display unit displays a first image. The second display unit displays a second image. To the input unit, an instruction is input. The control unit causes the second display unit to display the first image, as the second image, when a first period of time has passed since the first image was displayed by the first display unit.05-03-2012
20120105313PROJECTION DEVICE HAVING DISPLAY CONTROL FUNCTION AND METHOD THEREOF - A projection device having a display control function is provided. The projection device includes a document unit, a display unit, a lens module and a sensor unit. The sensor unit senses the distance and direction of movement of the projection device. The document unit scrolls through the content of an opened document according to the distance and direction of the sensed movement of the projection device from the sensor unit to select the content to be displayed, and controls display of the selected content of a currently opened document on the display unit. The lens module projects the selected content currently displayed by the display unit. A method with a display control function is also provided.05-03-2012
20110285618ACTIVE INTERFACE CONTROLS HAVING BI-STABLE ACTUATION AND INTRINSIC SENSING CAPABILITY - An active interface control shiftable between deployed and stowed configurations, and including an active material actuator employing a bi-stable mechanism configured to increase the actuator stroke length, and/or presenting intrinsic or external sensing capability, and reconfigurable displays comprising separately shiftable sets of controls.11-24-2011
20110285620GESTURE RECOGNIZER SYSTEM ARCHITECTURE - Systems, methods and computer readable media are disclosed for a gesture recognizer system architecture. A recognizer engine is provided, which receives user motion data and provides that data to a plurality of filters. A filter corresponds to a gesture that may then be tuned by an application receiving information from the gesture recognizer, so that the specific parameters of the gesture—such as an arm acceleration for a throwing gesture—may be set on a per-application level, or multiple times within a single application. Each filter may output to an application using it a confidence level that the corresponding gesture occurred, as well as further details about the user motion data.11-24-2011
20110285619Method for identifying a sequence of input signals - A method for identifying a sequence of input signals is proposed, a first input signal being identified in a first method step, a second input signal being identified within a predefined reference time in a second method step, and a time interval between the first input signal and the second input signal being determined in a third method step and furthermore the adapted reference time being set as a function of the time interval in a fourth method step.11-24-2011
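The four method steps in the abstract above (identify a first signal, identify a second within a reference time, measure the interval, adapt the reference time as a function of the interval) can be sketched as a small state machine. This is a minimal illustration only: the class name `SequenceIdentifier`, the smoothing factor `alpha`, and the linear adaptation rule are all invented here, not taken from the application.

```python
class SequenceIdentifier:
    """Illustrative sketch of the four-step input-sequence method above."""

    def __init__(self, reference_time=0.5, alpha=0.3):
        self.reference_time = reference_time  # seconds allowed between signals
        self.alpha = alpha                    # invented smoothing factor
        self._first_time = None

    def on_signal(self, timestamp):
        """Feed input-signal timestamps; returns True when a valid pair is seen."""
        if self._first_time is None:
            self._first_time = timestamp          # step 1: first input signal
            return False
        interval = timestamp - self._first_time   # step 3: measure the interval
        self._first_time = None
        if interval <= self.reference_time:       # step 2: within reference time
            # step 4: set an adapted reference time as a function of the interval
            # (a simple exponential blend toward twice the observed interval)
            self.reference_time = ((1 - self.alpha) * self.reference_time
                                   + self.alpha * interval * 2)
            return True
        return False
```

For example, two signals 0.2 s apart are accepted under the default 0.5 s reference time, and the reference time then tightens toward the observed pace.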
20090273559GAME DEVICE THAT GENERATES A DISPLAY WITH A SIMULATED BODY IMAGE AND METHODS FOR USE THEREWITH - A game device includes a first receiver that receives body motion signals from a plurality of remote motion sensing devices coupled to a user's body. A user data generation module generates simulated body image data. A processor executes a game application that generates display signals for display on a display device, wherein the display signals are generated based on the simulated body image data.11-05-2009
20090278796PROGRAM, INFORMATION STORAGE MEDIUM, DETERMINATION DEVICE, AND DETERMINATION METHOD - A determination device stores a plurality of pieces of reference data associated with a predetermined movement pattern of a controller, and determines whether or not output values that respectively have a given relationship with reference data have been output from an acceleration sensor in a predetermined order within an input reception period in which an input that moves the controller in the predetermined movement pattern is received. The determination device receives an input that moves the controller in the predetermined movement pattern and performs a game process when the determination device has determined that the output values that respectively have the given relationship with the reference data have been output from the acceleration sensor in the predetermined order within the input reception period.11-12-2009
20090278793INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND MEDIUM RECORDING INFORMATION PROCESSING PROGRAM - To provide an information processing device, an information processing method, and a medium recording an information processing program that can be operated easily.11-12-2009
20110298703INFORMATION PROCESSING DEVICE AND COMPUTER READABLE RECORDING MEDIUM - An information processing device that is connected to a projecting device that projects an annotation image, input from an external terminal, onto a projection area including an object and a background, and is connected to an image capture device that captures an image of the projection area including the object and the background, includes: a detecting unit that detects movement of the object from an image captured by the image capture device; an extracting unit that extracts a changed region that is caused in the captured image by the movement of the object; and a processing unit that performs processing on at least one of the captured image and the annotation image, when the annotation image exists in the changed region.12-08-2011
20110298702USER INTERFACE DEVICE AND INPUT METHOD - Provided is a user interface device (12-08-2011
20110298704THREE-DIMENSIONAL IMAGING AND DISPLAY SYSTEM - A three-dimensional imaging and display system is provided in which user input is optically detected in an imaging volume by measuring the path length of an amplitude modulated scanning beam as a function of the phase shift thereof. Visual image user feedback concerning the detected user input is presented.12-08-2011
20110298701Presenting Information to a User Based on the Current State of a User Device - Information is presented to a user based on a current state of an end-user device (e.g., a mobile phone). In one embodiment, a method includes: detecting, via a user device, a predefined user motion of a user (e.g., a flick of a trackball or gesture on a touch screen); determining a current state of the user device based on at least one characteristic; and in response to detecting the user motion, presenting, via a display of the user device, information (e.g., a person profile) to the user based on the current state.12-08-2011
20110298699INPUT APPARATUS, INFORMATION PROCESSING APPARATUS, OPERATION INPUT METHOD, AND SENSOR SHEET - Provided is an input apparatus including an input operation section, a capacitive sensor, and an output section. The input operation section includes an operation member configured to mechanically receive an input operation, and a first detection circuit configured to detect the input operation of the operation member. The capacitive sensor includes a plurality of electrodes which are arranged around the operation member, and each of which has a capacitance variable due to an approaching of a detection target, and a second detection circuit configured to detect the capacitance of each of the plurality of electrodes. The output section is configured to output an output of the first detection circuit and an output of the second detection circuit.12-08-2011
20110298700OPERATION TERMINAL, ELECTRONIC UNIT, AND ELECTRONIC UNIT SYSTEM - An operation terminal includes: a posture detection section detecting a posture of the operation terminal, a change of the posture, or both thereof; a mode selection section selecting, based on a detection result of the posture detection section, an operation mode from a plurality of operation modes including a gesture mode and a non-gesture mode; and a transmission section sending a control command corresponding to the detection result of the posture detection section to an electronic unit when the gesture mode is currently selected.12-08-2011
20110298698MANUAL HUMAN MACHINE INTERFACE OPERATION SYSTEM AND METHOD THEREOF - A manual human machine interface operation system and method thereof are disclosed. In an embodiment, this manual human machine interface operation system extracts a user's arm image and palm image from the images captured by at least two cameras, and then calculates the user's arm coordinates, so that the user can select an object shown on the human machine interface by using his/her arm. The system then recognizes a hand posture according to the palm image and determines an instruction mapped to this hand posture. This instruction is then performed on the selected object. In an embodiment, the hand posture can be a pointing posture, a grab posture, or a release posture.12-08-2011
20090201247Communications with a Haptic Interface Device from a Host Computer - The present invention comprises methods and apparatuses that can provide reliable communications between a computer and a haptic interface device. The methods and apparatuses can provide communication that is more secure against errors, failures, or tampering than previous approaches. Haptic devices allow a user to communicate with computer applications using the user's sense of touch, for example by applying and sensing forces with the haptic device. The host computer must be able to communicate with the haptic device in a robust and safe manner. The present invention includes a novel method of accomplishing such communication; a computer-readable medium that, when applied to a computer, causes the computer to communicate according to such a method; and a computer system having a host computer and a haptic device communicating according to such a method.08-13-2009
20090085863MOTION BASED DISPLAY MANAGEMENT - A display manager is configured to handle the drawing of windows on one or more displays for an application differently based on detected motion information that is associated with a device. The display manager may not display windows for some applications while motion is detected, while the display manager may display windows for other applications even when motion is detected. Motion enabled applications may interact with the display manager and motion information to determine how to display windows while motion is detected.04-02-2009
20110291923APPLICATION DEVICE OF ELECTRONIC PAPER SOFTWARE - The present invention relates to an application device of electronic paper software, generally including a first flexible display module, a second flexible display module, and a flexible input module, wherein the second flexible display module is formed on one side of the first flexible display module and the flexible input module extends from one side of the first flexible display module or the second flexible display module. The first flexible display module and the second flexible display module may display various and different user interfaces and display interfaces, thereby providing a digital product that offers low power consumption, light weight, compactness, water resistance, vibration resistance, and resistance to breakage.12-01-2011
20110291922Systems and Methods For Automatic Disable of Input Devices - Systems, methods, apparatuses and computer program products configured to provide intelligent filtering techniques to reduce errant device inputs are described. For example, filtering out the data input from a touch pad while typing on the keyboard or operating with a pointing stick, even if the touch pad is sensing contact, is augmented by continuing to filter data from the touch pad until the contact on the touch pad ends.12-01-2011
20120026085IMAGE CONTRAST ENHANCEMENT IN DEPTH SENSOR - Embodiments related to the enhancement of contrast in an image pattern in a structured light depth sensor are disclosed. For example, one disclosed embodiment provides, in a structured light depth sensor system comprising a structured light depth sensor, a method comprising projecting a light pattern onto an object, detecting via an image sensor an image of the light pattern as reflected from the object, increasing a contrast of the light pattern relative to ambient light present in the image of the light pattern as reflected from the object to form a contrast-enhanced image of the light pattern as reflected from the object, and based upon a motion of the object as detected via the contrast-enhanced image of the light pattern, controlling an application that is providing output to a display.02-02-2012
20120026081SYSTEM AND METHOD FOR USING PAPER AS AN INTERFACE TO COMPUTER APPLICATIONS - A system and method for using paper with handwritten annotations and/or pre-defined templates to interface with one or more computer applications is disclosed. In one embodiment, the method includes imaging content in the paper, including pre-defined handwritten commands, associated syntax, one or more computer application identifiers and pointed data already existing on the paper; analyzing the imaged content to identify the pre-defined handwritten commands, the one or more computer applications associated with the one or more computer application identifiers, the associated syntax and the pointed data; extracting the pointed data into a specified format associated with the one or more computer applications; executing the one or more computer applications based on the identified pre-defined handwritten commands, the one or more computer application identifiers and the associated syntax; and importing the extracted pointed data into the one or more executed computer applications.02-02-2012
20090273560Sensor-based distributed tangible user interface - A distributed tangible user interface comprises compact, self-powered, tangible user interface manipulative devices having sensing, display, and wireless communication capabilities, along with one or more associated digital content or other interactive software management applications. The manipulative devices display visual representations of digital content or program controls and can be physically manipulated as a group by a user for interaction with the digital information or software application. A controller on each manipulative device receives and processes data from a movement sensor, initiating behavior on the manipulative device and/or forwarding the results to a management application that uses the information to manage the digital content, software application, and/or the manipulative devices. The manipulative devices may also detect the proximity and identity of other manipulative devices, responding to and/or forwarding that information to the management application, and may have feedback devices for presenting responsive information to the user.11-05-2009
20120026078Interactive Projector Device - An interactive device includes a first sensor, a first button, an accelerometer, a processor, and a transmitter. The first sensor is configured to receive coordinate information from a coordinate projection of a projector. The first button is configured to transmit an erase signal when the first button is pressed. The accelerometer is configured to provide angle information for the interactive device. The processor is in communication with the first sensor, with the first button, and with the accelerometer. The processor is configured to receive the coordinate information from the first sensor, to receive the erase signal from the first button, to receive the angle information from the accelerometer, and to generate erase information based on the coordinate information, the erase signal, and the angle information. The transmitter is in communication with the processor, and is configured to transmit a delete request including the erase information received from the processor to the projector.02-02-2012
20120026077MAPPING TRACKPAD OPERATIONS TO TOUCHSCREEN EVENTS - In general, this disclosure describes techniques for mapping trackpad interactions and operations to touchscreen events without the use of a touchscreen user interface. In one example, a method includes receiving, via a trackpad device coupled to a computing device, touch-based input comprising one or more gestures, wherein the trackpad device is physically distinct from a display device coupled to the computing device. The method further includes determining, by the computing device, a trackpad operation based upon the touch-based input, and determining, by the computing device, a touchscreen event based upon a mapping of the trackpad operation to the touchscreen event, wherein the touchscreen event is determined without receiving any user input from a touchscreen device. The method further includes generating, by the computing device, the touchscreen event for processing by an application executing on the computing device, wherein the application is designed to process touchscreen events initiated by touchscreen devices.02-02-2012
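The central step in the abstract above — determining a touchscreen event from a mapping of the trackpad operation, with no input from an actual touchscreen — can be illustrated with a plain lookup table. The operation and event names below are invented for illustration and are not the application's vocabulary.

```python
# Hypothetical trackpad-operation -> touchscreen-event table (names invented).
TRACKPAD_TO_TOUCHSCREEN = {
    "tap": "touch_down_up",
    "two_finger_scroll": "swipe",
    "pinch": "zoom",
}

def map_trackpad_operation(operation):
    """Translate a trackpad operation (determined from touch-based input on a
    trackpad device) into a synthetic touchscreen event name, without receiving
    any user input from a touchscreen device."""
    if operation not in TRACKPAD_TO_TOUCHSCREEN:
        raise ValueError(f"no touchscreen mapping for {operation!r}")
    return TRACKPAD_TO_TOUCHSCREEN[operation]
```

The generated event name would then be handed to an application that expects touchscreen-initiated events.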
20090002315System And Method For Manipulation Of Sound Data Using Haptic Feedback - In an embodiment, a device which comprises means for generating an audio signal based on sound data, the audio signal configured to produce sound from an audio producing device; means for generating a haptic command based on the sound data, the haptic command configured to cause a haptic feedback device to output a haptic sensation, the haptic sensation being associated with at least one characteristic of the sound data; and means for receiving a navigation command from a user experiencing the haptic sensation via the haptic feedback device, the navigation command associated with the sound data and based, at least in part, on the haptic sensation.01-01-2009
20110134027INPUT DEVICE - Used in a multimedia operating system for controlling multiple computers, an input device includes a memory unit adapted for storing the ID codes of the computers, a control unit adapted for receiving an external switching signal and fetching the ID code from the memory unit subject to the content of the received external switching signal and then producing a switching packet containing the fetched ID code, a wireless transmitter and receiver unit adapted for transmitting the switching packet produced by the control unit to the computers, and a power supply unit adapted for providing the input device with the necessary working power supply.06-09-2011
20100053073Display control based on bendable display containing electronic device conformation sequence status - A method includes, but is not limited to: obtaining information associated with one or more sequences of two or more conformations of one or more portions of one or more regions of a bendable display containing electronic device and controlling display of one or more portions of the bendable display containing electronic device regarding display of second information in response to the information associated with the one or more sequences of two or more conformations of the one or more portions of the one or more regions of the bendable display containing electronic device. In addition to the foregoing, other related method/system aspects are described in the claims, drawings, and text forming a part of the present disclosure.03-04-2010
20120229373IMAGE DISPLAY CONTROL APPARATUS INCLUDING IMAGE SHOOTING UNIT - An image display control apparatus includes a CPU performing face recognition within an image shot by a camera and detecting a gaze direction from a recognized face. The CPU then changes the image to a processed image at a predetermined rate in the case where at least one of the gazes of detected faces is directed to the displayed image.09-13-2012
20100090945VIRTUAL INPUT SYSTEM AND METHOD - The invention provides a virtual input system. The virtual input system comprises a trajectory generating apparatus and a receiving apparatus. The receiving apparatus comprises a sensing module, a coding module, a database, and a comparing module. The sensing module is used for sensing a trajectory information of the trajectory generating apparatus. The coding module converts the trajectory information to a specific code series according to a coding rule. The database stores a plurality of reference code series and a plurality of reference symbols corresponding to the plurality of reference code series. The comparing module compares the specific code series with the plurality of reference code series to determine at least one candidate symbol from the plurality of reference symbols.04-15-2010
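The coding and comparing steps in the abstract above (convert a sensed trajectory into a code series under a coding rule, then match it against stored reference code series to find candidate symbols) can be sketched with a conventional 8-direction chain code. The coding rule here is a common illustrative choice, not the one defined in the application, and both function names are invented.

```python
import math

def to_code_series(points):
    """Convert an (x, y) trajectory into a series of 8-direction chain codes:
    each segment's angle is quantized into one of 8 directions (0 = +x axis)."""
    codes = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        angle = math.atan2(y1 - y0, x1 - x0) % (2 * math.pi)
        codes.append(int(round(angle / (math.pi / 4))) % 8)
    return codes

def candidate_symbols(points, database):
    """Compare the trajectory's code series against a database of
    (reference code series, reference symbol) pairs; return every match."""
    series = to_code_series(points)
    return [symbol for ref_series, symbol in database if ref_series == series]
```

A two-entry database distinguishing a horizontal stroke from a vertical one already shows the lookup at work.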
20100090947System and Method for Gesture Based Control System - The system provides a gestural interface to various visually presented elements, presented on a display screen or screens. A gestural vocabulary includes ‘instantaneous’ commands, in which forming one or both hands into the appropriate ‘pose’ results in an immediate, one-time action; and ‘spatial’ commands, in which the operator either refers directly to elements on the screen by way of literal ‘pointing’ gestures or performs navigational maneuvers by way of relative or “offset” gestures. The system contemplates the ability to identify the user's hands in the form of a glove or gloves with certain indicia provided thereon, or any suitable means for providing recognizable indicia on a user's hands or body parts. A system of cameras can detect the position, orientation, and movement of the user's hands and translate that information into executable commands.04-15-2010
20100090948Apparatus, system, method, and program for processing information - An information processing apparatus includes a sensor for generating a sensor output signal responsive to the three-dimensional coordinate position of a detection target in a monitor space by detecting a capacitance in the monitor space, and outputting the sensor output signal, a position detector for detecting the three-dimensional coordinate position of the detection target in the monitor space from the sensor output signal of the sensor, a storage unit for storing coordinate information identifying a three-dimensional space region set in the monitor space, a determining unit for determining whether the three-dimensional coordinate position of the detection target in the monitor space is contained in the set three-dimensional space region, based on the three-dimensional coordinate position of the detection target detected by the position detector and the coordinate information stored in the storage unit, and an output unit for outputting determination results of the determining unit.04-15-2010
20080309615Storage medium storing information processing program and information processing device - An object has a plurality of joints 12-18-2008
20090128483ADVANCED NAVIGATION TECHNIQUES FOR PORTABLE DEVICES - The present invention provides a unique system and method that facilitates navigating smoothly and gracefully through any type of content viewable on portable devices such as cell-phones, PDAs, and/or any other hybrids thereof. In addition, such navigation can be performed while preserving perspective and context with respect to a larger amount of content. Pointing devices can also be used to navigate through content—the amount or detail of the content being dependent on the speed of the pointing device. Additionally, a semi-transparent overview of content can be overlaid on a zoomed-in portion of content to provide perspective to the zoomed-in portion. Content shown in the semi-transparent overview can depend on the location of the pointing device with respect to the content.05-21-2009
20100033427Computer Image and Audio Processing of Intensity and Input Devices for Interfacing with a Computer Program - A method and system for determining an intensity value of an interaction with a computer program is described. The method includes capturing an image of a capture zone, identifying an input object in the image, identifying an initial value of a parameter of the input object, capturing a second image of the capture zone, and identifying a second value of the parameter of the input object. The parameter identifies one or more of a shape, color, or brightness of the input object and is affected by human manipulation of the input object. The extent of change in the parameter is calculated, which is the difference between the second value and the first value. An activity input is provided to the computer program, the activity input including an intensity value representing the extent of change of the parameter. A method for detecting an intensity value from sound generating input objects, and a computer video game, are also described.02-11-2010
20100033424ELECTRONIC APPARATUS AND CONTROL METHOD THEREFOR - An electronic apparatus and a control method are provided that are capable of reducing power consumption. The electronic apparatus having a normal mode in which first electric power is consumed and a power-saving mode in which second electric power lower than the first electric power is consumed includes a first sensor and a second sensor whose power consumption is lower than that of the first sensor. In the power-saving mode, supply of power to the first sensor is restricted, the second sensor is set to the power-saving mode, a trigger for restoring the power-saving mode to the normal mode is detected by using the second sensor set to the power-saving mode, and the power-saving mode is restored to the normal mode based on the detected trigger.02-11-2010
20100033423Portable Electronic Apparatus and Input Operation Determining Method - A portable electronic apparatus 02-11-2010
20100033422SYSTEMS AND METHODS FOR PROCESSING MOTION SENSOR GENERATED DATA - Systems and methods for processing data from a motion sensor to detect intentional movements of a device are provided. An electronic device having a motion sensor may process motion sensor data along one or more dimensions to generate an acceleration value representative of the movement of the electronic device. The electronic device may then determine whether the acceleration value changes from less than a low threshold, to more than a high threshold, and again to less than the low threshold within a particular amount of time, reflecting an intentional movement of the electronic device by the user. In response to determining that the acceleration value is associated with an intentional movement of the electronic device, the electronic device may perform a particular event or operation. For example, in response to detecting that an electronic device has been shaken, the electronic device may shuffle a media playlist.02-11-2010
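The low/high/low threshold test described in the abstract above can be sketched as a three-state scan over (timestamp, acceleration magnitude) samples: the value must drop below a low threshold, rise above a high threshold, then drop below the low threshold again within a time window. The specific threshold and window values below are invented for illustration; the application does not state them.

```python
def is_intentional_shake(samples, low=0.5, high=2.0, window=0.5):
    """samples: list of (timestamp_seconds, acceleration_magnitude) tuples.
    Returns True when the acceleration value changes from less than `low`,
    to more than `high`, and again to less than `low` within `window` seconds,
    which the abstract treats as an intentional movement (e.g. a shake)."""
    state, start = "idle", None
    for t, a in samples:
        if state == "idle" and a < low:
            state, start = "below", t       # first drop below the low threshold
        elif state == "below" and a > high:
            state = "above"                 # spike above the high threshold
        elif state == "above" and a < low:
            return (t - start) <= window    # back below low: check the window
    return False
```

A detected shake would then trigger an event such as shuffling a media playlist, as the abstract suggests.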
20090189852INDEX WHEEL HAVING NOTATIONS AND METHOD OF MANUFACTURE THEREOF - An index wheel having notations, adopted on an electronic device and freely rotatable to generate signal command output, includes at least an inner axis layer and an operation layer encasing the inner axis layer in a coaxial manner and movable together with it at the same time. The operation layer is made from a ceramic material and formed integrally by molding and sintering. The operation layer also has a graphic notation zone on the surface that contains desired graphics or texts. Thus the profile of the index wheel is more versatile and provides a greater appeal to users.07-30-2009
20090189854USER INTERFACE CONTROLLER FOR A COMPUTER - User interface controller for controlling a computer comprising: 07-30-2009
20090189853CHARACTER INPUT DEVICE - A character input device is disclosed. In one embodiment, the device includes i) a base including an input region, ii) two input units disposed in the input region, wherein each of the input units is disposed to perform direction inputs of more than two steps that select any one of a plurality of direction instruction locations that are radially spaced apart from each other from each of reference locations within the input region in each of the direction instruction locations, iii) a direction input detecting unit detecting whether the direction inputs and a multiple input are performed and iv) a controller discriminating and inputting a first character that is redundantly allocated at the corresponding direction instruction location according to the direction instruction location in which the direction inputs are performed and whether the multiple input is performed. The character input device having the construction described above includes two sets of input units that input more than one phoneme per operation, thereby doubling input quantity and allowing a character to be input quickly and accurately.07-30-2009
20090184922Display indicator controlled by changing an angular orientation of a remote wireless-display controller - A remote controller for controlling a presentation image is disclosed. The remote controller includes a gyroscope to detect the movement and angular speed of the remote controller and generate corresponding signals for transmitting to a computer or projection system. The movement or angular speed signals are processed and applied to move a display cursor or highlight indicator in different areas of the display image according to the movements, angular speed, and positions of the remote controller, thus enhancing control of the presentation image without requiring the presenter to look away from the screen in search of many different push buttons to control the presentation images.07-23-2009
20110215996COMPUTER AND METHOD FOR OPERATING THE SAME - A computer includes a touch element, a projection module, and a calculating element. The touch element has an operating area. When the operating area is touched, the touch element generates an input signal. The projection module is used for projecting an operating image in the operating area. The calculating element is connected with the projection module and the touch element and is used for processing the input signal to transform the input signal to a touch signal.09-08-2011
20120026084SIGNALING DEVICE POSITION DETERMINATION - A system and method for providing user input to a device. A system includes a light source, a user positioned signaling device, an image capture device, and an image processor. The user positioned signaling device includes a retroreflective structure and a polarization retarder. The image capture device captures images of the signaling device. The image processor processes the captured images and determines a position of the signaling device based, at least in part, on light polarized and reflected by the signaling device.02-02-2012
20120026083INTERFACE APPARATUS AND METHOD FOR CONTROLLING A DEVICE - A specific site of a user's body is detected from an input image; whether the specific site makes a feeding motion, in which the specific site moves in a direction, is detected on the basis of the moving speed and moving direction of the specific site; and when the feeding motion is detected, a control command for a device is changed.02-02-2012
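The detection rule in the abstract above (classify a movement of the tracked body site as a feeding motion from its moving speed and moving direction) can be sketched as a speed gate followed by a dominant-axis direction test. The speed threshold and the pixel-coordinate convention below are invented for illustration.

```python
import math

def detect_feeding_motion(p0, p1, dt, speed_threshold=200.0):
    """Given two (x, y) positions of the specific site observed dt seconds
    apart, return the motion direction if the site moved fast enough to count
    as a feeding motion, else None. Assumes image coordinates (y grows down)."""
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    if math.hypot(dx, dy) / dt < speed_threshold:
        return None                           # too slow: not a feeding motion
    if abs(dx) >= abs(dy):                    # horizontal axis dominates
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"         # vertical axis dominates
```

A detected direction would then map to a change of control command for the device, e.g. stepping to the next item.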
20120026082DISPLAY DEVICE - In order to maintain a high visual image quality and save power, a display device includes: R sub-pixels, G sub-pixels, B sub-pixels, and W sub-pixels; and a human detection sensor (02-02-2012
20120026080ELECTRONIC DEVICE AND UNLOCKING METHOD THEREOF - An electronic device and a method enable an unlock operation of the electronic device. When the electronic device in a lock state is moved for the unlock operation, the electronic device receives a three-axis acceleration vector of the electronic device from an accelerometer. The electronic device analyzes three movement directions of the electronic device along three coordinate axes. The electronic device determines whether the analyzed three movement directions are the same as three predetermined movement directions along the three coordinate axes. If the analyzed three movement directions are the same as the three predetermined movement directions along the three coordinate axes, the electronic device is changed from the lock state to an unlock state.02-02-2012
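The comparison in the abstract above (three movement directions derived from a three-axis acceleration vector, matched against three predetermined directions) can be sketched by reducing each axis to its sign. Representing a direction as "+" or "-" per axis is an assumption made here for illustration; the application does not specify the encoding.

```python
def movement_directions(ax, ay, az):
    """Reduce a three-axis acceleration vector to a direction per axis,
    using the sign of each component as an (invented) direction encoding."""
    sign = lambda v: "+" if v >= 0 else "-"
    return (sign(ax), sign(ay), sign(az))

def try_unlock(accel_vector, stored_directions):
    """Return True when the analyzed directions along all three coordinate
    axes equal the predetermined directions, i.e. the device may unlock."""
    return movement_directions(*accel_vector) == tuple(stored_directions)
```

Only if all three axes agree with the stored pattern does the device move from the lock state to the unlock state.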
20120026079USING A DISPLAY ABSTRACTION TO CONTROL A DISPLAY - The disclosed embodiments relate to a system for controlling a display. This system includes a generic display-control interface which facilitates controlling the display, and a pluggable display-control module including code that implements a standardized set of display-control commands. The system also includes a plug-in framework that houses the pluggable display-control module and enables the generic display-control interface to communicate with the pluggable display-control module. In some embodiments, the system also includes a generic transport interface which facilitates communicating with the display, and a pluggable transport module including code that implements a standardized transport protocol. In these embodiments, the plug-in framework houses the pluggable transport module and enables the pluggable display-control module to communicate with the pluggable transport module.02-02-2012
20110193771ELECTRONIC DEVICE CONTROLLABLE BY PHYSICAL DEFORMATION - An electronic apparatus includes a display; at least two corners and, among them, at least two bendable corners; one or more sensors arranged to detect the state of bending of at least two corners, here referred to as actuating corners, among the at least two bendable corners; and a controller for controlling the position within the display of an element displayed on the display based on the state of bending of the actuating corners. A method and a computer program are also disclosed.08-11-2011
20090179853Method of employing a gaze direction tracking system for control of a computer - A method of employing a gaze direction tracking system for control of a computer comprises the steps of: providing a computer display incorporating a screen and at least one off-screen control target (07-16-2009
20080273008Interactive image game device for exercise and massage - An interactive image game device couples exercise, massage and game software to form an interactive device that includes a plurality of sensors controlled by a user's feet or hands and a display unit showing game pictures. Users can thus perform exercise, receive massage and play the game concurrently for greater enjoyment.11-06-2008
20100039371ARRANGEMENT FOR SELECTIVELY VIEWING AND HIDING USER INTERFACE ITEMS ON A BODY OF A COMMUNICATION APPARATUS, AND COMMUNICATION APPARATUS COMPRISING SUCH ARRANGEMENT - An arrangement for selectively viewing and hiding user interface items on a body of a communication apparatus is disclosed. The arrangement comprises a light source arranged inside the body and arranged to project light towards a surface of the body such that when the light source is in an on-state, the user interface item is viewed on the surface of the body; and a polarizing layer arranged with the surface of the body at least where the user interface item is to be viewed, such that the surface there, when the light source is in an off-state and the user interface item is hidden, appears in a primary color of the body. A communication apparatus comprising such arrangement is also disclosed.02-18-2010
20100066663REMOTE CONTROL POINTING TECHNOLOGY - A system is disclosed comprising a pointing device (03-18-2010
20120139828Communication And Skills Training Using Interactive Virtual Humans - A system for providing interaction between a virtual human and a user, the system comprising: a tangible interface providing a physical interface between the user and the virtual human; an imaging system directed towards the physical interface to provide images of the user interacting with the tangible interface; a tracking system tracking at least one position of the user; a microphone capturing speech from the user; a simulation system receiving inputs from the tangible interface, the imaging system, the tracking system and the microphone, the simulation system generating output signals corresponding to the virtual human; and a display presenting the output signals to the user.06-07-2012
20090153467System and method for connection detection - A system and method for detecting a display connection. The system includes a transmitter operable to be coupled to a receiver. The transmitter includes a controller operable to generate a connect signal when the receiver is coupled to the controller and is further operable to generate a disconnect signal when the receiver is uncoupled from the controller. The transmitter further includes a detection component operable to detect the connect signal and is further operable to detect the disconnect signal.06-18-2009
20090153468Virtual Interface System - The invention relates to a virtual interface system, to a method of providing a virtual interface, and to a data storage medium having stored thereon computer code means for instructing a computer system to execute a method of providing a virtual interface. A virtual interface system comprises a camera; a processor coupled to the camera for receiving and processing video data representing a video feed captured by the camera; a display coupled to the processor and the camera for displaying first and second interface elements superimposed with the video feed from the camera in response to display data from the processor, the second interface element being displayed at a fixed location on the display; wherein the processor tracks a motion action of a user based on the video data received from the camera, controls a display location of the first interface element on the display based on the tracked motion action; and determines a user input based on a relative position of the first and second interface elements on the display.06-18-2009
20090153466Method and System for Optimizing Scrolling and Selection Activity - Described are a method, a device, and a system for activating and deactivating a scrolling operation. The method includes receiving an input signal from an input interface on a mobile unit (“MU”), activating a scrolling operation of a display of the MU, sensing at least one of a motion and an orientation of the MU, and scrolling the display of the MU based on the one of the sensed motion and the sensed orientation of the MU. The device includes a display, an input interface for receiving an input signal, at least one sensor for sensing at least one of an orientation and a motion of the mobile computing device, and a processor receiving the input signal from the input interface, activating a scrolling operation of the display and scrolling the display of the device based on the one of the sensed motion and the sensed orientation of the device.06-18-2009
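As an illustrative sketch only (the class name, tilt threshold, and line-based offset are hypothetical, not from the application), the activate-then-scroll flow above might be modeled as:

```python
class ScrollController:
    """Activate a scrolling operation on an input signal, then scroll
    the display based on the sensed orientation of the mobile unit."""

    def __init__(self, threshold_deg=10.0):
        self.active = False
        self.threshold = threshold_deg
        self.offset = 0            # current scroll position in lines

    def on_input(self):
        """Input interface signal toggles the scrolling operation."""
        self.active = not self.active

    def on_tilt(self, pitch_deg):
        """Scroll one line per sensed tilt sample beyond the threshold;
        ignored while the scrolling operation is not active."""
        if not self.active or abs(pitch_deg) < self.threshold:
            return self.offset
        self.offset += 1 if pitch_deg > 0 else -1
        return self.offset
```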
20090153471Apparatus and method for inputting characters - The present invention provides an apparatus and method for inputting characters, which allow various characters to be inputted to a multimedia device. The character input apparatus preferably includes a user manipulation unit for generating one or more stroke signals and/or one or more arc signals, and a character detection unit for detecting one or more characters using the stroke signals and/or the arc signals. The stroke signals and/or the arc signals are signals corresponding to strokes and/or arcs constituting respective characters, and the characters are data about the characters including the strokes and/or the arcs corresponding to the stroke signals and/or the arc signals.06-18-2009
20090153469Input Device and Handheld Electronic Device - An input device and a handheld electronic device comprising the input device are provided. The input device comprises an elastic sheet and a switch sheet, both with alignment through-holes for precise alignment. Thereby, a protrusion of the elastic sheet is disposed precisely above a switch of the switch sheet. Because of this error-free assembly of the input device, the handheld electronic device responds accurately to a depression made by the user.06-18-2009
20110169729Method and an apparatus for performing interaction between a mobile device and a screen - A method for performing an interaction between a mobile device and a screen having a plurality of NFC tags comprising data which can be read by said mobile device by using an NFC reader module of said mobile device to read a tag, wherein an image which is part of an application service is displayed on said screen such that one or more tags of the screen correspond to the indication of respective inputs or selections which the user may choose through his mobile phone when using said application service, wherein said image being displayed on said screen is controlled by a server on which said application service is running, said server being connected to said mobile phone of said user through a network link.07-14-2011
20090146948Apparatus for Providing Sensing Information - The invention relates to an apparatus for providing sensing information which includes a surface-sensing information combiner for collecting tactile sensing information on a surface of an object and accompanying sensing information attendant on the tactile sensing information of the surface of the object to produce surface-sensing information or edit the produced surface-sensing information. The apparatus also includes a surface-sensing information board for providing an environment for the surface-sensing information of the object to allow a user to perceive the surface-sensing information of the object. The apparatus further includes a surface-sensing information reproducer for reproducing the tactile sensing information and the accompanying sensing information of the object to be sensed by the user.06-11-2009
20110260967HEAD MOUNTED DISPLAY - In a head mounted display, when it is determined that a hand of a user is in a field angle, the head mounted display starts the playing of main manual information. When a standard time elapses after the playing of the main manual information starts without the user's hand leaving the field angle, the playing of the main manual information is switched to the playing of sub manual information. When it is determined that the user's hand is no longer in the field angle while the main manual information or the sub manual information is playing, the playing underway is finished.10-27-2011
20090015548Image projection system - Provided is an image projection system including a screen, an input terminal, an image processing unit, an image projector, and invisible light ray-shielding member, characterized in that: the screen has a pattern-printed sheet having reflection patterns for transmitting positional information by reflecting invisible light rays or absorption patterns for transmitting positional information by absorbing invisible light rays; the input terminal has an invisible light ray-applying portion, detects a reflected light ray of an invisible light ray, which is applied from the invisible light ray-applying portion and reflected from a specific site of the pattern-printed sheet, reads positional information of any one of the reflection patterns or any one of the absorption patterns, and outputs the positional information to the image processing unit; the image processing unit converts the positional information input from the input terminal into image information A, and transfers the image information A to the image projector; the image projector converts the image information A transferred from the image processing unit into visible light rays, and projects the visible light rays on the screen; and the invisible light ray-shielding means is placed in front of or inside the image projector, and removes the invisible light ray from the visible light rays to be projected. The present invention can provide the image projection system in which, even when a screen is large, the positional information of the screen can be simply input in a non-contact fashion with high accuracy, and image information converted from the input positional information can be further converted into visible light rays to be projected.01-15-2009
20080316170Computer Mouse - Provided is a computer mouse including a grip portion having a flat bottom surface and a longitudinal central grip axis; a grip central point located at the center of the longitudinal central grip axis; the grip portion standing on a surface on which the computer mouse moves; a vertical axis perpendicular to the surface and including the central point; a sensor portion which includes a sensor having a sensor central point located at the center of the sensor; the sensor being located distant in a forward direction from the grip portion; a vertical plane containing the vertical axis and the sensor central point; wherein the longitudinal central axis is angled from the vertical axis and tilted to the left or right with reference to the forward direction; a bottom grip point defined by the intersection between the vertical axis and the bottom surface; and wherein the bottom grip point is located substantially at the center of the bottom surface.12-25-2008
20080303788Input system and input apparatus - An input system executes an input to an information processing apparatus depending on the hand motion. Plural myoelectric sensors are provided on an area between a wrist of the person and bases of a second finger to a fifth finger, and detect myoelectric signals depending on the hand motion. A setting portion outputs at least one command to make the person execute at least one particular motion in a state where the myoelectric sensors are worn on the hand, and associates the detected myoelectric signals after the output of the command with the particular motion corresponding to the command. An input portion identifies the hand motion from myoelectric signals detected based on the hand motion after the termination of the association and the association result of the setting portion, and executes the input to the information processing apparatus depending on the identified hand motion.12-11-2008
20100117957Remote control apparatus for vehicle - A remote control apparatus mountable on a vehicle includes an operation input device operable to point a position in a predetermined operation area, a lighting device for illuminating the operation input device, and a controller for controlling a brightness of the lighting device. The remote control apparatus has a self-diagnostic mode for diagnosing a fault in the operation input device. The controller controls the brightness of the lighting device such that the lighting device produces an illumination pattern corresponding to the position pointed by the operation input device, when the self-diagnostic mode is set.05-13-2010
20100277411USER TRACKING FEEDBACK - Technology is presented for providing feedback to a user on an ability of an executing application to track user action for control of the executing application on a computer system. A capture system detects a user in a capture area. Factors in the capture area and the user's actions can adversely affect the ability of the application to determine if a user movement is a gesture which is a control or instruction to the application. One example of such factors is a user being out of the field of view of the capture system. Some other factor examples include lighting conditions and obstructions in the capture area. Responsive to a user tracking criteria not being satisfied, feedback is output to the user. In some embodiments, the feedback is provided within the context of an executing application.11-04-2010
20080266250METHOD AND APPARATUS FOR DYNAMICALLY ADJUSTING GAME OR OTHER SIMULATION DIFFICULTY - A method for use with a simulation includes running the simulation, receiving information from a control interface used by a user to interact with the simulation, analyzing the received information, forming at least an indication of the user's level of skill based on the analysis of the received information, and adjusting a difficulty level of the simulation based on the indication of the user's level of skill. A storage medium storing a computer program executable by a processor based system and an apparatus for use with a simulation are also disclosed.10-30-2008
20080266248COORDINATE INFORMATION PROVIDING METHOD AND VIDEO APPARATUS THEREOF - A method for providing coordinate information to an external device and a video apparatus thereof. The coordinate information providing method includes receiving coordinate information input by a user from an input device; and transmitting a coordinate information delivery message containing the coordinate information input through the input device, to an external device connected according to a High Definition Multimedia Interface (HDMI) Consumer Electronics Control (CEC) specification. Accordingly, the video apparatus can control the external device by transferring the coordinate information to the external device.10-30-2008
20080266249Medical overlay mirror - Medical overlay mirror methods and related systems.10-30-2008
20080266246TRAVERSING GRAPHICAL LAYERS USING A SCROLLING MECHANISM IN A PHYSICAL DESIGN ENVIRONMENT - Z-axis display navigation in the design automation process of physical design, development and manufacturing of integrated circuits includes pre-selecting, using a computer mouse connected to a computer workstation processor, viewable graphical layout layers desired by a design layout debugging operator to view during an integrated circuit (IC) design debugging process. After selecting the desired viewable graphic layout layers, the layout operator uses the mouse to traverse the pre-selected viewable graphical layout layers displayed, by changing the input scrolling pattern of the mouse to scroll forward, backward, diagonally and from side to side in a plurality of directions, where a cursor on the layout screen will correspondingly move in the plurality of directions in the pre-selected viewable graphical layout layers on the layout screen corresponding to the movement of the mouse by the operator.10-30-2008
20120229375DISPLAY DEVICE, DISPLAY METHOD AND RECORDING MEDIUM - For a book data display device of the present invention, the user performs an operation similar to an action performed on a real book when searching for a supplement held between pages of the book. At this time, in the book data display device of the present invention, a movement detection section detects the movement of the display device, and a display control section displays the image of a supplement in a display section on the basis of the detection result of the movement detection section.09-13-2012
20100117953Hand-worn interface device - A Hand-Worn Ambidextrous Interface Device for use with interfacing with a computer or similar device includes in some preferred embodiments a Housing, an Angled Face, a plurality of Switches and a Removable Three-Axis Joystick. In some preferred embodiments the Removable Three-Axis Joystick may control the cursor on the monitor or screen of the computer or related device to which it is communicating.05-13-2010
20100123657INFORMATION PROCESSING APPARATUS, PROCESSING METHOD THEREOF, AND COMPUTER-READABLE STORAGE MEDIUM - An information processing apparatus executes a variety of processing operations in accordance with a user's operation detected by an operation detection device. The apparatus generates a data set from a data group in accordance with a predetermined condition. The apparatus determines the content of processing corresponding to the motion detected by the operation detection device. The apparatus adjusts the data set by increasing or decreasing data included in the data set generated, based on the determined processing.05-20-2010
20100127968MULTI-PROCESS INTERACTIVE SYSTEMS AND METHODS - A multi-process interactive system is described. The system includes numerous processes running on a processing device. The processes include separable program execution contexts of application programs, such that each application program comprises at least one process. The system translates events of each process into data capsules. A data capsule includes an application-independent representation of event data of an event and state information of the process originating the content of the data capsule. The system transfers the data capsules into pools or repositories. Each process operates as a recognizing process, where the recognizing process recognizes in the pools data capsules comprising content that corresponds to an interactive function of the recognizing process and/or an identification of the recognizing process. The recognizing process retrieves recognized data capsules from the pools and executes processing appropriate to contents of the recognized data capsules.05-27-2010
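A toy model of the pool-and-capsule pattern described above (purely illustrative; the class, field names, and matching rule are assumptions, not the application's actual design) might look like:

```python
from collections import deque

class Pool:
    """Shared repository into which processes deposit application-
    independent event 'capsules' and from which recognizing processes
    retrieve the capsules that match their interactive functions."""

    def __init__(self):
        self._capsules = deque()

    def deposit(self, event_type, data, origin):
        """Translate an event into a capsule carrying event data plus
        the identity of the originating process."""
        self._capsules.append({"type": event_type, "data": data,
                               "origin": origin})

    def withdraw(self, wanted_types):
        """A recognizing process retrieves capsules whose content
        matches one of its functions, leaving the rest in the pool."""
        taken = [c for c in self._capsules if c["type"] in wanted_types]
        for c in taken:
            self._capsules.remove(c)
        return taken
```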
20100127970Information processing system and information processing method - An information processing system includes: a sensor unit that detects the three-dimensional position of an object according to variations of electrostatic capacitances; and a control unit that performs display dependent on the detected three-dimensional position at a position on a display unit determined with positions in directions orthogonal to a direction of separation in which the object and the sensor unit are located at a distance from each other.05-27-2010
20090213066ONE BUTTON REMOTE CONTROL WITH HAPTIC FEEDBACK - An input system for a TV remote control or other system has a single touch surface with a deformable haptic assembly below the touch surface such that a user placing a finger on the touch surface can feel deformation of the haptic assembly. A pressure sensing assembly is below the haptic assembly and sensing motion of a finger on the touch surface, with a processor receiving input from the pressure sensing assembly and providing output to the haptic assembly in response. Also, a display receives input sent by the processor in response to input from the pressure sensing assembly to cause the display to present a changing image of a keypad as a user moves a finger on the touch surface.08-27-2009
20100079370Apparatus and method for providing interactive user interface that varies according to strength of blowing - An apparatus includes an interactive user interface that varies according to strength of blowing. A sensor detects blowing from a user and outputs a strength value of the detected blowing. A memory stores interactive image data capable of giving an effect of responding to user's blowing strength in real time. A controller loads interactive image data corresponding to the strength value of the blowing from the memory, and configures an interactive image that varies according to the strength value of the blowing, using the loaded interactive image data. A display displays the configured interactive image under control of the controller.04-01-2010
20100085301Bendable electronic interface external control system and method - A system includes, but is not limited to: one or more information obtaining modules configured to direct obtaining first information regarding one or more positions of one or more portions of one or more regions of a bendable electronic interface, and one or more application information sending modules configured to direct sending one or more application related information portions to the bendable electronic interface based upon the obtaining of the first information. In addition to the foregoing, other related method/system aspects are described in the claims, drawings, and text forming a part of the present disclosure.04-08-2010
20100085300Bendable electronic interface external control system and method - A method includes, but is not limited to: obtaining first information regarding one or more positions of one or more portions of one or more regions of a bendable electronic interface and sending one or more application related information portions to the bendable electronic interface based upon the obtaining of the first information. In addition to the foregoing, other related method/system aspects are described in the claims, drawings, and text forming a part of the present disclosure.04-08-2010
20090207132DISPLAY DEVICE, IMAGE FEEDING DEVICE, DISPLAY SYSTEM, PROGRAM, AND INFORMATION STORAGE MEDIUM - A display device includes: a mute deciding unit that decides whether mute designation for an image has been changed; a displaying side communication unit that receives image information from an image feeding device and that if the mute designation has been changed from invalidation to validation, transmits suspension-of-feed instructing information to the image feeding device; and a displaying side display unit that displays an image on the basis of the image information and that if the mute designation has been changed from invalidation to validation, ceases display of the image.08-20-2009
20100079369Using Physical Objects in Conjunction with an Interactive Surface - An interaction management module (IMM) is described for allowing users to engage an interactive surface in a collaborative environment using various input devices, such as keyboard-type devices and mouse-type devices. The IMM displays digital objects on the interactive surface that are associated with the devices in various ways. The digital objects can include input display interfaces, cursors, soft-key input mechanisms, and so on. Further, the IMM provides a mechanism for establishing a frame of reference for governing the placement of each cursor on the interactive surface. Further, the IMM provides a mechanism for allowing users to make a digital copy of a physical article placed on the interactive surface. The IMM also provides a mechanism which duplicates actions taken on the digital copy with respect to the physical article, and vice versa.04-01-2010
20100079371TERMINAL APPARATUS, DISPLAY CONTROL METHOD, AND DISPLAY CONTROL PROGRAM - A terminal apparatus includes a display displaying a plurality of display elements representing options on a display screen, an image-capturing section capturing an image of an operator who is viewing the display screen, a face position detector detecting the position of a facial image of the operator in a captured image, and a controller controlling the display to move the plurality of display elements in a predetermined direction on the display screen and to sequentially update and display the display elements when it is detected that the facial image of the operator in the captured image is outside of a predetermined range, and to stop the movement of the plurality of display elements when it is detected that the facial image falls within the predetermined range.04-01-2010
20100090946System and Method for Gesture Based Control System - The system provides a gestural interface to various visually presented elements, presented on a display screen or screens. A gestural vocabulary includes ‘instantaneous’ commands, in which forming one or both hands into the appropriate ‘pose’ results in an immediate, one-time action; and ‘spatial’ commands, in which the operator either refers directly to elements on the screen by way of literal ‘pointing’ gestures or performs navigational maneuvers by way of relative or “offset” gestures. The system contemplates the ability to identify the user's hands in the form of a glove or gloves with certain indicia provided thereon, or any suitable means for providing recognizable indicia on a user's hands or body parts. A system of cameras can detect the position, orientation, and movement of the user's hands and translate that information into executable commands.04-15-2010
20130021238SYSTEMS AND METHODS FOR ELECTRONIC DISCOVERY - A system for electronic discovery may comprise an electronic discovery platform; a two-handed controller configured to produce a plurality of signals; and a memory configured to store at least one set of controller signal relationships, the controller signal relationships being associated with the two-handed controller. A system for electronic discovery may further comprise an interface application communicatively coupled to the memory, wherein the interface application uses the at least one set of controller signal relationships to associate at least one of the plurality of signals from the two-handed controller with at least one of a plurality of discovery commands associated with the discovery of electronic data, and wherein the interface application communicates a determined discovery command such that the discovery command is executed by the electronic discovery platform.01-24-2013
20110267259RESHAPABLE CONNECTOR WITH VARIABLE RIGIDITY - An accessory is disclosed for use in a human-computer interface gaming or other application. The accessory can be held or otherwise interacted with by a user, where the accessory is sensed and displayed as a virtual object on a display. The virtual representation of the accessory may be an accurate recreation of the accessory, or it may be displayed as a virtual object appropriate to the gaming or other application. The real world accessory is flexible and may be morphed into a variety of forms to better suit a wide array of uses. This accessory may also serve as a platform upon which other accessories can be mounted to enhance user experience.11-03-2011
20100123656METHOD AND DEVICE FOR INPUTTING FORCE INTENSITY AND ROTATION INTENSITY BASED ON MOTION SENSING - Provided is an input device for operating in a three-dimensional space and inputting user instructions. The input device includes a first operation unit that calculates a first rotation angle in a coordinate system independent of the attitude of the device based on the output value of a first sensor, a second operation unit that calculates a second rotation angle in the coordinate system based on the output value of a second sensor, an attitude angle measuring unit that calculates the attitude angle of the input device by combining the first rotation angle and the second rotation angle, and an intensity calculation unit that calculates force intensity in the coordinate system using acceleration of the input device and the attitude angle of the input device obtained in the attitude measuring unit.05-20-2010
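The "combine two rotation angles into an attitude angle" step above resembles a standard complementary filter; the sketch below is a generic illustration of that technique, not the application's actual combination method, and the blend factor `alpha` is an assumed tuning parameter:

```python
def complementary_filter(prev_angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Combine a gyro-integrated rotation angle (first sensor) with an
    accelerometer-derived rotation angle (second sensor) into a single
    attitude angle in a device-independent coordinate system.

    prev_angle:  last attitude estimate (degrees)
    gyro_rate:   angular rate from the gyro (degrees/second)
    accel_angle: angle inferred from gravity via the accelerometer
    dt:          sample interval (seconds)
    """
    # Trust the gyro short-term, the accelerometer long-term.
    return alpha * (prev_angle + gyro_rate * dt) + (1 - alpha) * accel_angle
```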
20130021236Orientation Based Application Launch System - An electronic device may include multiple faces, an application launch input element, a memory that stores multiple applications, and a processor that accesses the memory. In response to a detected trigger of the application launch input element, the processor determines the orientation of the device. For example, the processor may determine which face of the electronic device is pointed in a predetermined direction. Based on the determined orientation of the device, the processor selects and activates a specific application from the multiple available applications.01-24-2013
20090244002Method, Device and Program for Controlling Display, and Printing Device - A display control method for displaying a screen which receives a user-selected option from a predetermined option group through predetermined option presentation, the method includes: detecting a face image area which at least includes a user face in an image area; and determining an initial position of the predetermined option presentation in accordance with the face image area.10-01-2009
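One plausible (hypothetical) reading of "determining an initial position of the option presentation in accordance with the face image area" is placing the option column beside the detected face box, as in this sketch; the placement policy and parameter names are assumptions:

```python
def initial_option_position(face_box, screen_w, menu_w):
    """Place the option presentation next to the detected face area:
    to the right of the face when there is room, otherwise to the left.

    face_box: (x, y, w, h) of the detected face image area, in screen
    coordinates. Returns the x coordinate where the options should start.
    """
    x, y, w, h = face_box
    if x + w + menu_w <= screen_w:
        return x + w            # room on the right of the face
    return max(0, x - menu_w)   # fall back to the left side
```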
20090278792Identifying User by Measuring Pressure of Button Presses on User Input Device - In one embodiment, a method comprises receiving, by a user identifier circuit, a button pressure signature specifying a sequence of button pressure values sampled while a corresponding identified button of a user input device is pressed by a user; the user identifier circuit identifying the user of the user input device based on the button pressure signature; and the user identifier circuit outputting a message identifying the identified button and the identified user.11-12-2009
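A minimal sketch of signature-based identification, assuming equal-length pressure sequences and a nearest-match rule (the application does not specify the matching metric; mean squared error here is an illustrative choice):

```python
def identify_user(signature, enrolled):
    """Identify the user from a button pressure signature.

    signature: sequence of pressure values sampled while the button
    was pressed.
    enrolled:  {user_name: reference_signature} with sequences of the
    same length as `signature`.
    Returns the enrolled user whose reference signature is closest in
    mean squared error.
    """
    def mse(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

    return min(enrolled, key=lambda user: mse(signature, enrolled[user]))
```

A real system would also reject signatures whose best match is still too distant, rather than always returning the nearest enrolled user.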
20090201246Motion Compensation for Screens - A method for compensating for motion on screens is provided. In one embodiment, the method includes varying the display of a screen on a device using motion data. In this embodiment, a display adjustment amount also may be determined using screen properties and motion limits. In another embodiment, the method includes varying the location of an input region on a touch screen using touch data. In yet another embodiment, the method includes scaling selectable images and apportioning the display using motion data. Various additional methods, machine-readable media, and systems for motion compensation of a screen are also provided.08-13-2009
20110169728ROTATABLE DISPLAY DEVICE AND IMAGE DISPLAYING METHOD THEREOF - A rotatable display device and an image displaying method thereof are provided. In the present method, a plurality of display signal formats is defined in the rotatable display device, wherein each of the display signal formats corresponds to a display direction of the rotatable display device. Then, one of the display signal formats is provided to a host according to the display direction of the rotatable display device. Finally, the rotatable display device displays an image signal which is directly outputted according to the display signal format received by the host.07-14-2011
20110169727INTERACTIVE INPUT SYSTEM AND ILLUMINATION SYSTEM THEREFOR - An interactive input system includes at least one illumination source emitting radiation into a region of interest; at least one imaging assembly capturing image frames of the region of interest, the at least one illumination source being in the field of view of the at least one imaging assembly; and a controller communicating with the at least one illumination source, the controller controlling the intensity of radiation emitted by the at least one illumination source during image frame capture.07-14-2011
20090284463INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, INFORMATION PROCESSING PROGRAM, AND MOBILE TERMINAL - An information processing apparatus includes a tap operation detecting unit configured to detect the number of tap operations for tapping a housing and a tapped position of the housing, a storage unit storing a plurality of application programs, an activated application table storing an application program to be activated in association with the tapped position of the housing and the number of tap operations, and a control unit configured to detect an application program corresponding to the tapped position and the number of tap operations with reference to the activated application table on the basis of the tapped position and the number of tap operations detected by the tap operation detecting unit, to read out the detected application program from the storage unit, and to activate the detected application program.11-19-2009
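The "activated application table" above amounts to a lookup keyed on tapped position and tap count; a minimal sketch follows, with position and application names made up for the example.

```python
# Hypothetical sketch of the activated-application table:
# (tapped position, number of taps) -> application to launch.

ACTIVATED_APP_TABLE = {
    ("top", 1): "camera",
    ("top", 2): "gallery",
    ("side", 1): "music",
    ("side", 2): "voice_recorder",
}

def app_for_taps(position, count, table=ACTIVATED_APP_TABLE):
    """Look up the application registered for this tap position and count."""
    return table.get((position, count))  # None if no app is registered
```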
20080211767SYSTEM FOR PRINTING CODED DATA PROVIDING INTERACTION WITH COMPUTER SOFTWARE - A system for enabling user interaction with computer software. The system includes a printer for receiving print data, printing a form, using the print data, with information related to an interactive element coincident with coded data indicative of the interactive element, receiving indicating data from a sensing device which is generated by the sensing device sensing the coincident coded data so as to be indicative of the interactive element, and transferring the indicating data to a computer system to allow the interaction to be interpreted. The coded data is indicative of an identity. The computer system determines, using the indicating data, the identity, determines, using the identity, a page description, and identifies, using the page description, the interactive element.09-04-2008
20090085866Image display apparatus - An image display apparatus includes an input unit which has a display on which an image is displayed, a flexible sheet-type substrate, and a bending detection section which is arranged on a surface of the substrate to detect a bending deformation of the substrate, and a display control section which changes an image to be displayed on the display based on the bending deformation of the substrate detected by the bending detection section. Accordingly, it is possible to provide an image display apparatus capable of easily changing an image to be displayed on the display even when a user is not skilled at operating equipment.04-02-2009
20110215998PHYSICAL ACTION LANGUAGES FOR DISTRIBUTED TANGIBLE USER INTERFACE SYSTEMS - A system and a method are disclosed for a software configuration for use with distributed tangible user interfaces, in which the software is manipulated via a set of individual actions on individual objects, and in which such individual actions across one or more objects may be combined, simultaneously and/or over time, resulting in compound actions that manipulate the software. These actions and compound actions may be interpreted and acted upon by the software differently depending on its design, configuration, and internal state.09-08-2011
20110169726EVOLVING UNIVERSAL GESTURE SETS - In a gesture-based system, gestures may control aspects of a computing environment or application, where the gestures may be derived from a user's position or movement in a physical space. Gesture recognition data, used to recognize gestures from captured data representative of a user's input gestures, may be evolved based on captured data from a plurality of users. A common set or default set of gesture recognition data may be evolved by selecting a plurality of users for tracking. Captured data of the plurality of users may be processed to identify input gesture data for the plurality of users, and the gesture recognition data may be evolved based on features of the input gesture data that is common to multiple users. The evolved gesture recognition data may be implemented not only for the users tracked, but for users not tracked. An identifier may identify when the evolved gesture recognition data applies and implement the evolved gesture recognition data when the identifier is present.07-14-2011
20110169725FUNCTION MEASURING DEVICE - [PROBLEM TO BE SOLVED] To provide a new function measurement apparatus capable of measuring a function of a person while the test subject moves his or her whole body.07-14-2011
20090289892SIMULATION OF WRITING ON GAME CONSOLES THROUGH THE USE OF MOTION-SENSING TECHNOLOGY - A method and system of utilizing a game console with motion sensing technology is provided. The present invention, in various implementations, provides for a method for generating one or more symbols in response to one or more gestures using an input device of a gaming system. The method comprises providing the input device, which is capable of generating one or more gesture signals in response to one or more gestures and is operable to select a mode from one or more operational states. The method also provides for generating one or more gesture signals corresponding to the one or more gestures, respectively; mapping the one or more generated gesture signals in relation to one or more symbols, respectively; and transmitting the one or more symbols corresponding to the respective one or more gesture signals to an output.11-26-2009
20090273561DEVICE CONTROL SYSTEM - A device control system allows a user-owned terminal to control a control providing device. A control requester in the terminal sends a terminal message including coordinate information of a display screen and an identifier of the terminal. In response, a control request response processor in the control providing device assigns user-controllable events to relative positions of coordinates recognized from the coordinate information, thus sending event detail information to the terminal. A display controller in the terminal displays on the display screen events in a display mode based on the event detail information. A control request processor in the terminal sends a control request message to the control providing device in response to a user operation of the events displayed on the display screen. A function executing unit in the control providing device receives the control request message and executes a function corresponding thereto.11-05-2009
20110199291GESTURE DETECTION BASED ON JOINT SKIPPING - A system is disclosed for detecting or confirming gestures performed by a user by identifying a vector formed by non-adjacent joints and identifying the angle the vector forms with a reference point. Thus, the system skips one or more intermediate joints between an end joint and a proximal joint closer to the body core of a user. Skipping one or more intermediate joints results in a more reliable indication of the position or movement performed by the user, and consequently a more reliable indication of a given gesture.08-18-2011
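The joint-skipping idea can be sketched as forming a vector directly from a proximal joint (e.g. shoulder) to an end joint (e.g. hand), bypassing the elbow, and measuring its angle against a reference direction. The code below uses 2-D coordinates for brevity; the described system is 3-D, and all names are illustrative.

```python
import math

# Hypothetical sketch of "joint skipping": vector from a proximal joint
# straight to an end joint, ignoring intermediate joints, then the angle
# of that vector relative to a reference direction.

def skip_vector(proximal, end):
    """Vector from the proximal joint to the end joint."""
    return (end[0] - proximal[0], end[1] - proximal[1])

def angle_to_reference(vec, reference=(1.0, 0.0)):
    """Unsigned angle in degrees between `vec` and the reference direction."""
    dot = vec[0] * reference[0] + vec[1] * reference[1]
    norm = math.hypot(*vec) * math.hypot(*reference)
    return math.degrees(math.acos(dot / norm))
```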
20110199295Human Interface Input Acceleration System - A method and system for transmitting data to and from a hand-held host device are disclosed. An accessory device for interfacing with a host device includes a communication channel designed to establish a bidirectional data link between the accessory device and the host device. The accessory device also includes a storage unit communicatively coupled to the communication channel. The storage unit is designed to store various data. In addition, at least a first data is selectively transmitted from the stored data of the accessory device to the host device through the established bidirectional data link.08-18-2011
20090278794Interactive Input System With Controlled Lighting - An interactive input system comprises at least one imaging device capturing images of a region of interest, a plurality of radiation sources, each providing illumination to the region of interest and a controller coordinating the operation of the radiation sources and the at least one imaging device to allow separate image frames based on contributions from different radiation sources to be generated.11-12-2009
20090278795Interactive Input System And Illumination Assembly Therefor - An illumination assembly for an interactive input system comprises at least two proximate radiation sources directing radiation into a region of interest, each of the radiation sources having a different emission angle.11-12-2009
20090278791MOTION TRACKING SYSTEM - A motion tracking system for tracking an object composed of object parts in a three-dimensional space. The system comprises a number of magnetic field transmitters; a number of field receivers for receiving the magnetic fields of the field transmitters; a number of inertial measurement units for recording a linear acceleration; a number of angular velocity transducers for recording angular velocities. The system further comprises a processor for controlling the transmitters and receiving signals coming from the field receivers and the inertial measurement unit; which processor contains a module for deriving orientation and/or position information of the constituent object parts of the object on the basis of the received signals. The processor is configured for intermittently controlling the transmitters to transmit at a predetermined frequency, wherein the position and/or orientation information is derived by periodically calibrating the motion information coming from the inertial measurement unit with the motion information coming from the magnetic field receivers.11-12-2009
20090284464PROJECTION IMAGE DISPLAY APPARATUS - The projection image display apparatus includes an image light generator and a projection optics. The projection optics includes a reflection mirror. The projection image display apparatus includes an image capture device configured to capture an image of a user facing the projection surface, a first acquisition unit configured to acquire captured image data from the image capture device, a second acquisition unit configured to acquire sample data independently of the captured image data, and an image controller configured to control an image to be displayed on the projection surface on the basis of the captured image data and the sample data.11-19-2009
20090289894ELECTRONIC APPARATUS AND THREE-DIMENSIONAL INPUT DEVICE THEREOF - An electronic apparatus and a three-dimensional input device thereof are disclosed. The electronic apparatus includes a display and the three-dimensional input device. The three-dimensional input device includes an infrared ray emitting/receiving unit, a cursor control unit and a radio frequency receiving unit. The infrared ray emitting/receiving unit is electrically connected to the display and includes an infrared ray emitter and an infrared ray receiver. The cursor control unit includes a reflective material and a radio frequency emitter. The radio frequency emitter is electrically connected to the display.11-26-2009
20110169730SIGHT LINE INPUT USER INTERFACE UNIT, USER INTERFACE METHOD, USER INTERFACE PROGRAM, AND RECORDING MEDIUM WITH USER INTERFACE PROGRAM RECORDED - A sight line input user interface unit, a sight line input user interface method, a sight line input user interface program, and a recording medium with the program recorded in which the user's intention can be properly recognized to prevent a false judgment are provided. The sight line input user interface unit includes information display elements (…).07-14-2011
20090262070Method and System for Reducing Effects of Undesired Signals in an Infrared Imaging System - Effects of undesired infrared light are reduced in an imaging system using an infrared light source. The desired infrared light source is activated and a first set of imaging data is captured during a first image capture interval. The desired infrared light source is then deactivated, and a second set of image data is captured during a second image capture interval. A composite set of image data is then generated by subtracting from first values in the first set of image data corresponding second values in the second set of image data. The composite set of image data thus includes a set of image data in which all infrared signals are collected, including both signals resulting from the IR source and other IR signals, from which is subtracted image data in which no signals result from the IR source, leaving image data including signals resulting only from the IR source.10-22-2009
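The lit/unlit subtraction described above can be sketched directly: capture one frame with the IR source on and one with it off, then subtract pixel-wise (clamping at zero) so only light contributed by the controlled source remains. Frame representation here is a plain nested list for illustration.

```python
# Sketch of the composite-image step: per-pixel (lit - unlit), clamped to 0,
# for two same-sized grayscale frames.

def subtract_frames(lit, unlit):
    """Return a composite frame containing only controlled-source light."""
    return [[max(p - q, 0) for p, q in zip(row_l, row_u)]
            for row_l, row_u in zip(lit, unlit)]
```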
20090262069GESTURE SIGNATURES - Apparatus, systems, and methods may operate to present viewable content to a viewer on a display screen, receive a transmitted signature from a user interface device (UID) associated with the display screen (wherein the signature results from at least one gesture initiated by the viewer and detected by the UID), and compare the transmitted signature to a stored signature associated with a known individual to determine whether an identity associated with the viewer matches an identity associated with the known individual. Additional apparatus, systems, and methods are disclosed.10-22-2009
20090262068SPACE EFFICIENT SORTABLE TABLE - A sortable and space-efficient graphical user interface and a system for the efficient display of sortable data are disclosed herein. The graphical user interface may include at least one column, at least one row and a data cell defined by the intersection of at least one column and at least one row. First and second data may be displayed in the data cell. A first header is associated with the first column and identifies the first data. A second header is associated with the first column and identifies the second data. In the system for displaying sortable data, a graphical user interface is displayed upon a graphical display. A table is displayed as at least a portion of the graphical user interface, the table having a column with a plurality of rows, each row displaying first and second data, with a first header associated with the first data and a second header associated with the second data.10-22-2009
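One way to read the dual-header column: each cell carries two values, and clicking either header sorts the rows by that value. The sketch below uses invented field names and is only an interpretation of the abstract.

```python
# Hypothetical sketch: one column holds two data values per cell
# ("size" and "date"); the clicked header selects the sort key.

rows = [
    {"name": "fileB", "size": 10, "date": "2009-02-01"},
    {"name": "fileA", "size": 30, "date": "2009-01-01"},
]

def sort_rows(rows, header):
    """Sort by whichever of the cell's two values the clicked header names."""
    return sorted(rows, key=lambda r: r[header])
```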
20090295712PORTABLE PROJECTOR AND METHOD OF OPERATING A PORTABLE PROJECTOR - A portable projector comprising a projection unit for projecting an image onto a projection surface. The portable projector comprises a stabilization unit for stabilizing the projecting of the image. The portable projector further comprises a light sensing unit for scanning a region surrounding the portable projector.12-03-2009
20090295714POWER CONSERVING SYSTEM FOR HAND-HELD CONTROLLERS - A manual controller operates through a wireless communication link with a computing device to manipulate images or symbols on a display associated with the computing device. An electrical power conserving system allows such a wireless controller to conserve electrical power as the controller operates with electrical power supplied by replaceable batteries or rechargeable battery packs. In preferred embodiments, electronic manual-contact sensing circuitry enables more rapid turnoff of the controller during periods of game play inactivity. This eliminates a long timeout period and allows electrical current drain only when the controller is actually being held by a user. Preferred embodiments of the electronic manual-contact sensing circuitry detect electrical resistance of a user's hands and thereby enable delivery of different amounts of electrical power as required.12-03-2009
20090179854DYNAMIC INPUT GRAPHIC DISPLAY - An input device for providing dynamic displays is disclosed. The input device can modify the appearance and/or location of graphics associated with an input area of a device. For example, the input device can have a button layout that shifts based on the orientation of the electronic device relative to the user, such that the button layout is consistently presented to the user in an upright orientation. The input device can rotate and/or rename a button input area region depending on the context of an application running on the electronic device. The input device can display dynamic graphic content in an input area which is distinct from a display screen of the electronic device.07-16-2009
20110221667APPARATUS AND METHOD FOR SWITCHING SCREEN IN MOBILE TERMINAL - An apparatus and a method for switching an output screen direction of a mobile terminal from a vertical direction to a horizontal direction, or from the horizontal direction to the vertical direction when needed are provided. The apparatus includes a sensor unit and a rotation determining unit. The sensor unit detects a direction of a magnetic field. The rotation determining unit determines a movement of the mobile terminal that rotates on a plane through the direction of the magnetic field detected by the sensor unit.09-15-2011
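A plausible sketch of detecting in-plane rotation from the magnetic field direction: derive a heading from the magnetometer's x/y components and report a rotation when the heading change since a reference exceeds a threshold. The threshold and function names are assumptions, not from the application.

```python
import math

# Hypothetical sketch: heading from magnetometer x/y, rotation flagged
# when the heading deviates from a reference by more than a threshold.

def heading_deg(mx, my):
    """In-plane heading in degrees, 0-360, from field components."""
    return math.degrees(math.atan2(my, mx)) % 360

def rotated(ref_heading, mx, my, threshold=60.0):
    """True if the current heading differs from the reference enough."""
    delta = abs(heading_deg(mx, my) - ref_heading) % 360
    delta = min(delta, 360 - delta)  # shortest angular distance
    return delta >= threshold
```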
20120293408TRACKING BIMANUAL MOVEMENTS - Hands may be tracked before, during, and after occlusion, and a gesture may be recognized. Movement of two occluded hands may be tracked as a unit during an occlusion period. A type of synchronization characterizing the two occluded hands during the occlusion period may be determined based on the tracked movement of the occluded hands. Based on the determined type of synchronization, it may be determined whether directions of travel for each of the two occluded hands change during the occlusion period. Implementations may determine that a first hand and a second hand are occluded during an occlusion period, the first hand having come from a first direction and the second hand having come from a second direction. The first hand may be distinguished from the second hand after the occlusion period based on a determined type of synchronization characterizing the two hands, and a behavior of the two hands.11-22-2012
20120086632ELECTRIC POWER INFORMATION USER INTERFACE, DISPLAY METHOD AND APPARATUS THEREOF - Disclosed herein are an electric power information user interface, an information display method, and an apparatus thereof. The interface is used to display the electricity consumption to be detected. The apparatus may display diverse power information through several functional combo keys. The interface is particularly implemented by software. The interface includes a value-display area, a parameter-type area with several changeable symbols or texts, a parameter-unit area indicative of the unit of the value, a statistics-mode area for providing inquiry into the electric power information at different period zones, and a functional-key area. The functional-key area has several multi-functional keys and an energy-count key that is used to generate a reset and a stop command by repeated triggering. The energy-count key particularly provides the electric power information within a certain period.04-12-2012
20110205151Methods and Systems for Position Detection - A computing device, such as a desktop, laptop, tablet computer, a mobile device, or a computing device integrated into another device (e.g., an entertainment device for gaming, a television, an appliance, kiosk, vehicle, tool, etc.) is configured to determine user input commands from the location and/or movement of one or more objects in a space. The object(s) can be imaged using one or more optical sensors and the resulting position data can be interpreted in any number of ways to determine a command, including 2-dimensional and 3-dimensional movements with or without touch.08-25-2011
20130100008Haptic Response Module - Embodiments provide an apparatus that includes a tracking sensor to track movement of a hand behind a display, such that a virtual object may be output via a display, and a haptic response module to output a stream of gas based on a determination that the virtual object has interacted with a portion of the image.04-25-2013
20130100009MULTI-USER INTERACTION WITH HANDHELD PROJECTORS - A device-mounted infrared (IR) camera and a hybrid visible/IR light projector may be used to project visible animation sequences along with invisible (to the naked eye) tracking signals (e.g., a near-IR fiducial marker). A software (or firmware) platform (either on the handheld device or to which the device communicates) may track multiple, independent projected images in relation to one another using the tracking signals when projected in the near-IR spectrum. The resulting system allows a broad range of new interaction scenarios for multiple handheld projectors being used in any environment with adequate lighting conditions without requiring the environment to be instrumented in any way.04-25-2013
20120268363IMAGE PROCESSING APPARATUS, IMAGE PROCESSING SYSTEM, IMAGE PROCESSING METHOD, AND COMPUTER READABLE MEDIUM - An image processing apparatus includes an imaged image acquiring unit, an identifying unit, an acquiring unit, and a display control unit. The imaged image acquiring unit acquires an imaged image. The identifying unit identifies an object imaged in the imaged image. The acquiring unit acquires image information on the object stored in association with the object identified by the identifying unit. The display control unit displays at least part of an image area of an object image based on the image information on the object.10-25-2012
20120268364FAST FINGERTIP DETECTION FOR INITIALIZING A VISION-BASED HAND TRACKER - Systems and methods for initializing real-time, vision-based hand tracking systems are described. The systems and methods for initializing the vision-based hand tracking systems image a body and receive gesture data that is absolute three-space data of an instantaneous state of the body at a point in time and space, and at least one of determine an orientation of the body using an appendage of the body and track the body using at least one of the orientation and the gesture data.10-25-2012
20120268362LASER DIODE MODES - Laser diode mode techniques are described. In one or more implementations, one or more laser diodes of a computing device are caused to operate below a lasing threshold to illuminate at least part of a physical surroundings of the computing device. One or more images of the illuminated physical surroundings are captured by a camera of the computing device and one or more inputs are recognized from the captured one or more images for interaction with a user interface displayed by the computing device.10-25-2012
20120293407HEAD MOUNTED DISPLAY DEVICE AND IMAGE DISPLAY CONTROL METHOD THEREFOR - A Head Mounted Display (HMD) device and an image display control method are disclosed. The device includes a display unit including a left display and a right display for a left eye and a right eye for displaying images for the left eye and the right eye, a vital reaction sensor unit including a first vital reaction sensor for the left eye and a second vital reaction sensor for the right eye, detecting vital reaction changes of a user viewing the left display and the right display, and generating, when a vital reaction change is detected, an interruption signal including coordinates of a position at which the vital reaction change is detected, and a control unit for outputting images for the left eye and the right eye to the display unit.11-22-2012
20120293409MOBILE DEVICE AND DISPLAY CONTROL METHOD - According to an aspect, a mobile device includes an input unit, a display unit, a storage unit, and a control unit. The input unit detects operation input. The display unit displays a standby screen. The storage unit stores a plurality of objects to be displayed on the standby screen. The objects are associated with at least one of shortcut information or character information and are allocated group information for classifying the objects. The control unit sets, on the standby screen, a plurality of divided regions divided on a group-by-group basis and causes the plurality of objects to be displayed in the divided regions corresponding to the respective groups to which the objects belong.11-22-2012
20120293406METHOD AND APPARATUS FOR PROCESSING INPUT IN MOBILE TERMINAL - A method for processing an input in a mobile terminal that recognizes inputs such as a visual input and/or voice input is provided. In the method, a position on a screen corresponding to a viewing direction of a user's eyes is determined. An indicator corresponding to the determined position is displayed on the screen. When a relevant (i.e. actuating) signal is detected, an indicator is arranged on the desired object for selection.11-22-2012
20090085865DEVICE FOR UNDERWATER USE AND METHOD OF CONTROLLING SAME - A method of controlling a device for underwater use includes detecting a user-interaction with the device based on signals received from an accelerometer, and performing an operation as a result of the user-interaction.04-02-2009
20110205147Interacting With An Omni-Directionally Projected Display - Concepts and technologies are described herein for interacting with an omni-directionally projected display. The omni-directionally projected display includes, in some embodiments, visual information projected on a display surface by way of an omni-directional projector. A user is able to interact with the projected visual information using gestures in free space, voice commands, and/or other tools, structures, and commands. The visual information can be projected omni-directionally, to provide a user with an immersive interactive experience with the projected display. The concepts and technologies disclosed herein can support more than one interacting user. Thus, the concepts and technologies disclosed herein may be employed to provide a number of users with immersive interactions with projected visual information.08-25-2011
20110199294DEVICES, SYSTEMS AND METHODS OF CAPTURING AND DISPLAYING APPEARANCES - Some demonstrative embodiments of the invention include systems, devices and/or methods enabling appearance comparison. The system, according to some demonstrative embodiments, may include at least one interactive imaging and display station. The station may include, for example, a mirror-display device capable of selectably operating in either or both a mirror mode or a display mode; an imaging device to capture one or more appearances appearing in a field of view in front of the mirror-display device; and/or an image control unit to select the mode of operation of the mirror-display device according to a user command. Other embodiments are described and claimed.08-18-2011
20110199292Wrist-Mounted Gesture Device - A wrist-mounted gesture device, system, and method is disclosed. The wrist-mounted gesture device includes at least one accelerometer adapted to detect acceleration caused by one or more gestures of the user. The accelerometer provides data to a microcontroller which is adapted to interpret the gesture data and match it with corresponding predefined gestures. The device includes wireless connection circuitry which allows the device to be wirelessly interfaced with an electronic device. The electronic device may be a device within the living space or environment of the user. A highly effective gesture system is ideally utilized in order to produce accurately recognizable gestures, in either one, two, or three dimensions. In certain embodiments, movements corresponding to movements toward and away from numbers in a standard keypad arrangement can be used, and vertical and horizontal movements corresponding to affirmative and negative gestures are used. In other embodiments, the device can be used in a mapping system of a room or interior space to assist users in finding objects or locations in such a room. In still other embodiments, the device can be used to assess tremors in a patient.08-18-2011
20110199290DIGITAL SIGNS - A method for pairing a control device with a digital sign is provided. The method includes receiving control device geometric attributes and digital sign geometric attributes, determining a digital sign identification based on the control device geometric attributes and the digital sign geometric attributes, and transmitting the digital sign identification to the control device. The control device geometric attributes may define geometric attributes of the control device. The digital sign geometric attributes may define geometric attributes of the digital sign that the control device is attempting to control. The digital sign identification may define the digital sign that the control device is attempting to control.08-18-2011
20100103095INPUT APPARATUS, CONTROL APPARATUS, CONTROL SYSTEM, AND CONTROL METHOD - An input apparatus, a control apparatus, a control system, and a control method that are capable of making a movement of a pointer on a screen a natural movement that matches an intuition of a user are provided. An input apparatus includes a casing, an acceleration sensor, and an angular velocity sensor. The acceleration sensor detects an acceleration value of the casing in a first direction. The angular velocity sensor detects an angular velocity about an axis in a second direction different from the first direction. Instead of calculating a velocity value of the casing by simply integrating the detected acceleration value, the velocity value of the casing in the first direction is calculated based on the acceleration value and the angular velocity value that have been detected. As a result, a highly-accurate calculation of the velocity value of the casing becomes possible, and a movement of a pointer on a screen becomes a natural movement that matches a sense of a user based on a displacement corresponding to the velocity value.04-29-2010
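One plausible reading of combining the two sensors, rather than integrating acceleration directly: if the casing swings on an arc of radius R, tangential acceleration a = R·α (α the angular acceleration) and velocity v = R·ω, so v = a·ω/α. This avoids integration drift. The formula is an illustrative interpretation, not the application's exact algorithm.

```python
# Hedged sketch: estimate arc radius from acceleration and angular
# acceleration, then velocity as radius times angular velocity.

def velocity_from_arc(accel, omega, omega_prev, dt):
    """Velocity estimate from one accel sample and two gyro samples."""
    alpha = (omega - omega_prev) / dt      # angular acceleration
    if abs(alpha) < 1e-9:
        return 0.0                          # degenerate: no angular change
    radius = accel / alpha                  # a = R * alpha  =>  R = a / alpha
    return radius * omega                   # v = R * omega
```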
20100103094Input Device, Input Control Method, Information Recording Medium, and Program - In an input device (…).04-29-2010
20100103093INFORMATION INPUTTING DEVICE, INFORMATION OUTPUTTING DEVICE AND METHOD - Two bar support portions are provided on opposite sides of a floor mat sensor and a horizontal bar is fixed between these portions to define spaces below into which feet are to be inserted. Then, using the horizontal bar as a reference, an input area, such as “A” or “B”, can be accurately stepped on. A signal receiver reads and stores a first signal, and a signal determination unit determines whether the signal that was read was generated by stepping on area A or area B. Thereafter, when a signal is received indicating a data type was received first, the signal receiver reads the next input signal and the signal determination unit determines whether a signal indicating the data type was received.04-29-2010
20100103092Video-based handwritten character input apparatus and method thereof - A video-based character input apparatus includes an image capturing unit, an image processing unit, a one-dimensional feature coding unit, a character database, a character recognizing unit, a display unit, and a stroke feature database. The image capturing unit captures an image. The image processing unit filters a moving track of a fingertip in the image by detecting a graphic difference, then detecting a skin color and picking out the moving track most closely corresponding to a point of the object. The one-dimensional feature coding unit takes a stroke with respect to the moving track and converts the stroke into a coding sequence in a one-dimensional string according to a time sequence. The character recognizing unit proceeds with character comparison between the coding sequence in a one-dimensional string and the character database to find out the character having the greatest similarity for display on the display unit.04-29-2010
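The one-dimensional coding step can be sketched as quantizing each segment of the fingertip track into one of eight compass directions and emitting the codes in time order; the resulting string is then matched against the stroke database. The encoding below (digits 0-7, 45-degree bins) is an illustrative choice.

```python
import math

# Hypothetical sketch: convert a fingertip track [(x, y), ...] into a
# direction-code string; each successive point pair is quantized to one of
# 8 directions (0 = east, counting counter-clockwise in 45-degree steps).

def code_track(points):
    codes = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        angle = math.atan2(y1 - y0, x1 - x0) % (2 * math.pi)
        codes.append(str(int((angle + math.pi / 8) / (math.pi / 4)) % 8))
    return "".join(codes)
```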
20100060570Control System for Navigating a Principal Dimension of a Data Space - Systems and methods are described for navigating through a data space. The navigating comprises detecting a gesture of a body from gesture data received via a detector. The gesture data is absolute three-space location data of an instantaneous state of the body at a point in time and physical space. The detecting comprises identifying the gesture using the gesture data. The navigating comprises translating the gesture to a gesture signal, and navigating through the data space in response to the gesture signal. The data space is a data-representational space comprising a dataset represented in the physical space.03-11-2010
20100060568CURVED SURFACE INPUT DEVICE WITH NORMALIZED CAPACITIVE SENSING - A curved surface input device with normalized capacitive sensing is disclosed. The input device can normalize capacitive sensing through an overlay having a varying thickness, such as an overlay with a curved surface. The capacitive sensing normalization can be implemented in software, hardware or a combination of software and hardware. A software implementation for normalizing capacitive sensing can comprise adjusting the sensitivity of a sensing operation associated with different sensor elements of the input device. A hardware implementation for normalizing capacitive sensing can comprise adjusting a hardware configuration of the input device associated with one or more physical parameters that can influence the capacitive sensitivity of the sensor elements, such as an area of the sensor elements, a distance between the sensor elements and other conductive input device elements (such as a ground plane), and a dielectric constant associated with the overlay.03-11-2010
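The software normalization described above can be sketched as a per-sensor gain derived from the overlay thickness over each element. A linear-in-thickness gain is an illustrative assumption; a real device would calibrate the gain table empirically:

```python
def normalized_readings(raw, thickness_mm, ref_thickness_mm=1.0):
    """Scale each raw capacitive reading by the local overlay thickness so
    that a touch produces a comparable value everywhere on the curved
    surface (thicker overlay -> weaker raw signal -> larger gain)."""
    return [r * (t / ref_thickness_mm) for r, t in zip(raw, thickness_mm)]
```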
20100060567CONTROLLING DEVICE OPERATION RELATIVE TO A SURFACE - Architecture for automatic switching between multiple modes in a handheld device such as a mouse based on the presence of specular light, or lack thereof. When applied to a presenter mouse, the architecture facilitates the automatic switching between mouse mode and presenter mode without manual intervention by the user. An optical approach is well suited since most optical systems include a light source, lenses, and light sensors to detect reflected light from the source (or lack thereof). The approach leverages the existing light source and lenses in a mouse to minimize incremental cost, yet provide a robust technique for detecting lift from the tracking surface thereby automatically switching between modes as the user moves the mouse on and off the tracking surface. A delay circuit and/or image comparison can also be provided that eliminates undesirable triggering to a different mode by preventing unintended switching between the multiple modes.03-11-2010
20110267263CHANGING INPUT TOLERANCES BASED ON DEVICE MOVEMENT - Movement of a device is detected using at least one sensor. In response to the detected movement, at least one value is altered to make it easier for a user to select an object on a display.11-03-2011
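One way the altered value could work is to enlarge the hit radius around selectable objects in proportion to the sensed motion; the gain and cap below are illustrative:

```python
def hit_radius(base_radius, motion_magnitude, gain=2.0, max_radius=60.0):
    """Enlarge the selection radius around on-screen targets while the
    device is moving, so objects stay easy to select."""
    return min(base_radius + gain * motion_magnitude, max_radius)

def is_selected(target_xy, touch_xy, motion_magnitude, base_radius=20.0):
    """Accept a touch as selecting the target when it lands within the
    (possibly enlarged) hit radius."""
    dx = target_xy[0] - touch_xy[0]
    dy = target_xy[1] - touch_xy[1]
    return (dx * dx + dy * dy) ** 0.5 <= hit_radius(base_radius, motion_magnitude)
```

A touch 25 px off-target misses while the device is still but hits once movement widens the tolerance.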
20110267261INPUT DEVICE AND CONTROL METHOD OF THE SAME - Disclosed are an input device, a method of controlling the input device and a system including the input device and a display device. The input device may include: a communication unit operable to communicate with a display device; a sensor operable to sense a first signal containing noise and a scan signal generated from the display device, and a second signal containing the noise; a noise eliminator operable to compare the first signal and the second signal; and a controller operable to control the communication unit to output position information of the input device, wherein the position information is based on the comparing of the first signal and the second signal.11-03-2011
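The comparison of the first signal (scan plus noise) with the second signal (noise alone) can be sketched as an element-wise subtraction that recovers the scan component; real hardware would first align and scale the two captures:

```python
def recover_scan_signal(first, second):
    """first: samples containing the display's scan signal plus noise.
    second: samples containing the noise alone. Subtracting cancels the
    common-mode noise and leaves the scan signal used for positioning."""
    return [a - b for a, b in zip(first, second)]
```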
20110267258IMAGE BASED MOTION GESTURE RECOGNITION METHOD AND SYSTEM THEREOF - An image based motion gesture recognition method and system thereof are disclosed. In an embodiment, hand posture detection is performed on the received image frames to obtain a first hand posture. It is then determined whether the first hand posture matches a predefined starting posture. If it does, movement tracking is performed according to hand locations in the image frames to obtain a motion gesture. During the movement tracking, hand posture detection is performed on the image frames to obtain a second hand posture, and it is determined whether the second hand posture matches a predefined ending posture. If it does, the movement tracking is stopped. The complexity of motion gesture recognition can thereby be reduced and the reliability of the interaction improved.11-03-2011
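The start/end posture gating described above amounts to a small state machine; the posture labels here are illustrative stand-ins for whatever the posture classifier actually emits:

```python
class GestureTracker:
    """Idle until the starting posture appears, then record hand locations
    until the ending posture appears, at which point the recorded motion
    gesture is returned."""
    def __init__(self, start="open_palm", end="fist"):
        self.start, self.end = start, end
        self.tracking = False
        self.path = []

    def feed(self, posture, location):
        if not self.tracking:
            if posture == self.start:
                self.tracking = True
                self.path = [location]
            return None
        if posture == self.end:
            self.tracking = False
            return self.path          # completed motion gesture
        self.path.append(location)
        return None
```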
20100201616SYSTEMS AND METHODS FOR CONTROLLING A DIGITAL IMAGE PROCESSING APPARATUS - A digital image processing apparatus includes a sensing unit configured to sense a user's gesture to perform a specific function and generate a signal representing the user's gesture. The digital image processing apparatus also includes a digital signal processing unit which receives the signal representing the user's gesture and recognizes a plurality of discontinuous gestures as one gesture when a temporal proximity threshold between a plurality of discontinuous gestures is met. The one gesture may represent an input command from the user.08-12-2010
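The temporal-proximity rule can be sketched as merging consecutive gesture intervals whose gap falls below a threshold; the 0.5 s value is illustrative:

```python
def merge_gestures(gestures, max_gap=0.5):
    """gestures: time-ordered list of (start_time, end_time) tuples.
    Consecutive gestures separated by no more than max_gap seconds are
    recognized as one gesture."""
    if not gestures:
        return []
    merged = [list(gestures[0])]
    for start, end in gestures[1:]:
        if start - merged[-1][1] <= max_gap:
            merged[-1][1] = end       # close enough: extend the previous gesture
        else:
            merged.append([start, end])
    return [tuple(g) for g in merged]
```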
20100201615Touch and Bump Input Control - A touch and motion sensitive input control configured to use a combination of touch sensor output and motion sensor output to determine if an input event has occurred at an input area. The touch and motion sensitive input control can detect a particular input event (e.g., a button press) when a touch sensor detects a touch at a particular input area at around the same time as a motion sensor detects a change in motion. Based on the amount and nature of the motion detected, this can indicate that a user intended to cause an input event other than one caused by a mere touching of the input area.08-12-2010
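The touch-plus-motion coincidence test can be sketched as requiring a motion spike within a short window of the touch; the window and acceleration threshold are illustrative:

```python
def is_button_press(touch_time, motion_events, window=0.05, min_accel=1.5):
    """Report a press only when a motion sample of at least min_accel
    (arbitrary units) occurs within `window` seconds of the touch.
    motion_events: list of (timestamp, acceleration_magnitude)."""
    return any(abs(t - touch_time) <= window and a >= min_accel
               for t, a in motion_events)
```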
20100201614Method and Apparatus for Computer Interface - A physical user interface is provided for a microprocessor device that runs an operating system. The interface has an array of sensors located below a workspace having a housing. The workspace is divided into regions that are visible to a user, each region signifying a command to or an action performed by the operating system. Counters are provided with stored information such that each counter can be interpreted by the physical user interface so as to indicate both the location of the counter and the stored information. The information may be directly or indirectly indicative of a request for a URL.08-12-2010
20090002316MOBILE COMMUNICATION DEVICE WITH GAME APPLICATION FOR USE IN CONJUNCTION WITH A REMOTE MOBILE COMMUNICATION DEVICE AND METHODS FOR USE THEREWITH - A mobile communication device includes a processing module that executes a gaming application based on gaming data and that generates display data in response thereto, wherein the gaming data includes first data and second data. A sensor generates the first data in response to the actions of a user. At least one transceiver receives the second data from a remote communication device and sends the display data to a display device in a gaming mode of operation.01-01-2009
20100127971Methods of rendering graphical images - Methods for defining the complexity and priority of graphics rendering in mobile devices based upon various physical states and factors related to the mobile system including those measured and sensed by the mobile device, such as position, pointing direction and vibration rate, are disclosed and described. In particular, a handheld computing system having an image type user interface includes graphical images generated in response to the instantaneous position and orientation of the handheld device to improve the value of the presented image and overall speed of the system.05-27-2010
20130181895AUTOSTEREOSCOPIC THREE-DIMENSIONAL IMAGE DISPLAY DEVICE USING TIME DIVISION - An autostereoscopic 3D image display device using time division is provided. The image display device includes a backlight, an image display panel, a controller, and a viewer position tracking system. The backlight includes a plurality of line sources which are disposed at certain intervals. The image display panel displays a 3D image. The controller controls the backlight and a viewing point image of the image display panel. The viewer position tracking system determines a pupil position of a viewer and transfers position information to the controller. The image display panel provides two or more viewing points. The line sources form two or more line source sets that can be driven separately. The two or more line source sets are sequentially driven.07-18-2013
20090167677Method and Apparatus for Providing Communicatons with Haptic Cues - A method and apparatus for generating haptic cues for pacing and monitoring are disclosed. After sensing an event via a component, a process for generating haptic cues generates an input in response to the event. The component, in one example, may be a sensor or a combination of a sensor and a haptic actuator. Upon receipt of the input, the process retrieves a haptic signal from a tactile library in response to the input. A haptic feedback in response to the haptic signal is subsequently generated.07-02-2009
20080231594Haptics Transmission Systems - A method of compensating for network latency in haptics transmission in which the position of a haptic effector is controlled by signals received from a network. The method comprises storing a series of locations of the haptic effector, determining from the series using Fourier Transformation or other means frequencies having a growing amplitude and creating a filter function to eliminate the growing frequencies from output signals directing the force and direction of the haptic effector.09-25-2008
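A rough sketch of the growing-frequency detection, assuming the stored effector positions are split into an older and a newer half whose magnitude spectra are compared; the growth factor is illustrative, and a real implementation would use a windowed FFT rather than this naive DFT:

```python
import cmath

def dft_mags(xs):
    """Magnitudes of the first n//2 + 1 DFT bins of a real sequence."""
    n = len(xs)
    return [abs(sum(x * cmath.exp(-2j * cmath.pi * k * i / n)
                    for i, x in enumerate(xs))) / n
            for k in range(n // 2 + 1)]

def growing_bins(history, growth=1.5):
    """Flag frequency bins whose amplitude grew from the older half of the
    stored positions to the newer half -- candidates for latency-induced
    oscillation that the filter function should eliminate."""
    half = len(history) // 2
    old = dft_mags(history[:half])
    new = dft_mags(history[half:])
    return [k for k, (o, m) in enumerate(zip(old, new))
            if m > growth * o + 1e-9]
```

The returned bin indices would parameterize the notch filter applied to the output signals driving the effector.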
20100271295Force feedback system including multi-tasking graphical host environment and interface device - A force feedback system provides components for use in a force feedback system including a host computer and a force feedback interface device. An architecture for a host computer allows multi-tasking application programs to interface with the force feedback device without conflicts. One embodiment of a force feedback device provides both relative position reporting and absolute position reporting to allow great flexibility. A different device embodiment provides a relative-position-reporting device, allowing maximum compatibility with existing software. Information such as ballistic parameters and screen size sent from the host to the force feedback device allow accurate mouse positions and cursor positions to be determined in the force feedback environment. Force feedback effects and structures are further described, such as events and enclosures.10-28-2010
20080278438USER INTERFACE FOR SELECTING A PHOTO TAG - There is disclosed a user interface for selecting a photo tag. In an embodiment, the user interface embodies a method of selecting a photo tag for a tagged photo, comprising: providing a tag entry field for entering a photo tag; in dependence upon a string entered by a user, displaying in a matching tag list any tags from one or more selected tag sources matching the entered string. The method may further comprise displaying a tag type for each tag appearing in the matching tag list. The method may further comprise allowing user selection of a tag in the matching tag list to complete the tag entry field.11-13-2008
20080278439Dual function operation input keys for portable device and operating method therefor - A dual-function operation input key module for portable devices, and an operating method therefor, provide a usage-status sensing switch that is selectively activated either actively or passively, and perform a corresponding operation according to a first or a second usage-status indication from that switch.11-13-2008
20080278437COPYING DOCUMENTS FROM ELECTRONIC DISPLAYS - A system including a Multi Function Printer/Product/Peripheral (MFP) and a portable computing device adapted to allow automatic copying of documents by the MFP from the portable computing device. The portable computing device includes a display, a plurality of sensors for detecting light, a light detection module and a page changing module. The sensors are positioned to detect light from the MFP and trigger a page change automatically. The MFP in accordance with the present invention includes a scanner with a platen, a page change detection module, an image processing module, a device identification module, and a document matching & retrieval module. The page change detection module is adapted to receive the images presented by the portable computing device on the platen of the scanner for copying. The page change detection module detects changes and causes the MFP to scan and output a copy. The MFP may also use image processing techniques to locate an original version of the document on a server and print that higher quality document, instead of the scan from the portable computing device.11-13-2008
20080278442Control apparatus and method for input screens - A control apparatus for input screens includes a display unit, a switch portion and a control unit including a microcomputer. If one of a menu switch of the switch portion and a plurality of dummy switches included in a screen displayed by the display unit is operated, the microcomputer causes the display unit to display a new screen including a plurality of dummy switches. The microcomputer estimates a time period required for the operator to watch a screen to operate the dummy switch, depending on the displayed screen (the number of dummy switches). If the sum of estimated time periods exceeds a reference time period, the microcomputer nullifies operation of the dummy switch to prevent the screen from being switched. After the lapse of a predetermined time period, the microcomputer cancels the nullification of the operation of the dummy switch.11-13-2008
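The watch-time accounting described above can be sketched as a running sum of per-screen estimates, with operations nullified once the sum exceeds the reference; the linear estimate and the 8 s limit are illustrative:

```python
def watch_time(num_switches, per_switch=0.5, base=1.0):
    """Estimated seconds an operator needs to scan a screen, grown with
    the number of dummy switches it displays (illustrative linear model)."""
    return base + per_switch * num_switches

def accept_operation(switch_counts, limit=8.0):
    """Sum the estimated watch times of the screens shown so far and
    nullify further switch operations once the sum exceeds the limit."""
    return sum(watch_time(n) for n in switch_counts) <= limit
```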
20080278440Limb image interactive apparatus for information products - A limb image interactive apparatus for information products aims to directly maneuver varying image interfaces on a display interface of the information products through movements of user's hands and feet such as dancing, exercises or data entry without relying on conventional input interfaces such as joysticks, stepping pads and screen keyboards. Image interactions on the screen can be directly and interactively maneuvered easily through the movements of user's limbs such as hands and feet to provide a more humanized operation and control interface.11-13-2008
20080284728Method And System For Providing Input Mechanisms On A Handheld Electronic Device - The present invention is related to a system and method for providing input mechanisms in a handheld electronic device. The handheld electronic device includes a casing having a first surface and a second surface, where the second surface faces away from a user of the device when the first surface faces toward the user. An input control is arranged on the second surface. An input mechanism is configured to detect a user input event via the input control. A display is arranged on the first surface and configured to present a visual indication identifying the input control responsive to the user input event.11-20-2008
20080284726System and Method for Sensory Based Media Control - An apparatus for sensory based media control is provided. A system that incorporates teachings of the present disclosure may include, for example, a media device having a controller element to receive from a media controller a first instruction to select an object in accordance with a physical handling of the media controller, and a second instruction to control the identified object or perform a search on the object in accordance with touchless finger movements. Additional embodiments are disclosed.11-20-2008
20080284727Input apparatus and information processing apparatus - An input apparatus connected to an information processing apparatus is provided, which includes an input unit to enter a key, a transmission unit to make itself recognized as a keyboard by an operating system (OS) on the information processing apparatus and transmit to the information processing apparatus a key code signal corresponding to the input from the input unit, a receiving unit to receive from the information processing apparatus a signal representing an electrically driven lighting state of the input apparatus, and a control unit to check the signal received from the information processing apparatus and representing the electrically driven lighting state and control transmission of the key code.11-20-2008
20080284725FOOT-OPERATED KEY PAD - A foot operated data entry pad has a plurality of foot-operated buttons. The foot buttons are used to enter data values—e.g., numbers or symbols separately or in combination. Each button is capable of entering different data values, preferably depending on the length of time that it is pressed or on the number of times that it is pressed in succession. A small controller may be included to allow the user to control the computer's pointer, allowing the user to switch between data entry fields, as with a mouse. A heel rest may serve as both a heel rest and a button/switch for sending an electric/electronic signal. An automated voice system, or other audible and/or visual indicator system, may also be included to help the user keep track of the data value as it changes and is entered. Various embodiments are capable of entering a variety of alphanumeric data rather than a simple binary-type data set, such as yes/no or on/off, or instructions, such as a joystick used with a flight simulator program. Multiple data entry pads may optionally be used in conjunction.11-20-2008
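The press-length and press-count decoding could look like the following; the 0.8 s hold threshold and the returned labels are illustrative (a real pad would map them to digits or symbols):

```python
def decode_press(duration_s, presses_in_window):
    """Map a foot-button press to a data value: a quick single press,
    a repeated press, and a long hold each enter different values."""
    if presses_in_window >= 2:
        return "double"
    return "hold" if duration_s >= 0.8 else "single"
```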
20080284724Remote control systems that can distinguish stray light sources - Remote control systems that can distinguish predetermined light sources from stray light sources, e.g., environmental light sources and/or reflections are provided. The predetermined light sources can be disposed in asymmetric substantially linear or two-dimensional patterns. The predetermined light sources also can be configured to exhibit signature characteristics. The predetermined light sources also can output light at different signature wavelengths. The predetermined light sources also can emit light polarized in one or more predetermined polarization axes. Remote control systems of the present invention also can include methods for adjusting an allocation of predetermined light sources and/or the technique used to distinguish the predetermined light sources from the stray light sources.11-20-2008
20080284723Physical User Interface - A physical user interface is provided as an adjunct to a graphical user interface to a device having an operating system. The physical interface has a work surface or workspace that is scanned by one or more sensors capable of determining the position of objects. The work surface or workspace is sub-divided into two or more regions. Each region is representative of a user-generated command. In some examples, the one or more sensors are adapted to determine the position and orientation of one or more counters. The sensors can distinguish which region a counter is located in and what orientation it is in. The sensors provide an output signal, based on the determination, to the device.11-20-2008
20080284729Three dimensional volumetric display input and output configurations - The present invention is a system that allows a number of 3D volumetric display or output configurations, such as dome, cubical and cylindrical volumetric displays, to interact with a number of different input configurations, such as a three-dimensional position sensing system having a volume sensing field, a planar position sensing system having a digitizing tablet, and a non-planar position sensing system having a sensing grid formed on a dome. The user interacts via the input configurations, such as by moving a digitizing stylus on the sensing grid formed on the dome enclosure surface. This interaction affects the content of the volumetric display by mapping positions and corresponding vectors of the stylus to a moving cursor within the 3D display space of the volumetric display that is offset from a tip of the stylus along the vector.11-20-2008
20080291162TRACK WHEEL WITH REDUCED SPACE REQUIREMENTS - An input generating device for use in a hand held electronic device having a housing includes a core that is planar and semicircular in shape, a peripheral edge extending around the core, and a flexible track slidably engaged with the peripheral edge. A curved portion of the peripheral edge extends outwardly from the housing, allowing access thereto by the user. A first input is generated by sliding movement of the flexible track relative to the core. A first input detection means, such as a turns encoder switch, detects the sliding movement of the track. The core is depressibly mounted within the housing, generating a second input when the core is depressed. A second input detection means, such as a tactile contact switch, detects depression of the core.11-27-2008
20080291160System and method for recognizing multi-axis gestures based on handheld controller accelerometer outputs - An example gesture recognition system and method recognizes a gesture made using a handheld control device comprising an accelerometer arrangement. The example system and method involve a database of example gesture inputs derived from accelerometer arrangement outputs generated by making respective gestures with the handheld control device. Corresponding components of a current gesture input and the example gesture inputs in the database are compared using root mean square calculations, and the current input gesture is recognized or rejected based on the results of the comparison.11-27-2008
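The root-mean-square comparison can be sketched as scoring the current trace against each stored example and accepting the closest one under a threshold; equal-length traces are assumed here (a real system would resample or align first), and the threshold is illustrative:

```python
def rms_distance(a, b):
    """Root mean square of the component-wise differences."""
    return (sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)) ** 0.5

def recognize(current, examples, threshold=0.5):
    """Compare the current accelerometer trace against each stored example
    trace; return the best-matching label, or None if nothing is close."""
    best_label, best_d = None, threshold
    for label, example in examples.items():
        d = rms_distance(current, example)
        if d < best_d:
            best_label, best_d = label, d
    return best_label
```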
200802911633D Pointing Devices with Orientation Compensation and Improved Usability - Systems and methods according to the present invention describe 3D pointing devices which enhance usability by transforming sensed motion data from a first frame of reference (e.g., the body of the 3D pointing device) into a second frame of reference (e.g., a user's frame of reference). One exemplary embodiment of the present invention removes effects associated with a tilt orientation in which the 3D pointing device is held by a user.11-27-2008
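The frame transformation for a single roll angle can be sketched as a 2-D rotation of the sensed motion back into the user's frame; the sign convention depends on the sensor axes, so treat this as illustrative:

```python
import math

def compensate_tilt(dx, dy, roll_rad):
    """Rotate motion sensed in the device's (tilted) frame into the user's
    frame, so cursor movement is independent of how the device is rolled
    about its pointing axis."""
    c, s = math.cos(roll_rad), math.sin(roll_rad)
    return (c * dx - s * dy, s * dx + c * dy)
```

With zero roll the motion passes through unchanged; with a quarter-turn roll, a rightward device motion maps to an upward user-frame motion.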
20080291158CHARACTER INPUT DEVICE USING BIO RADAR UNIT AND TILT SENSOR - Disclosed herein is a character input device for a mobile device or a wearable terminal. A bio radar unit senses the positions of the fingers of a user. A tilt sensor senses the tilt of the hands of the user. A microprocessor calculates the final input information of the user by processing signals received from the bio radar unit and the tilt sensor. A wireless communication module transmits the final input information to the mobile device or the wearable terminal of the user. A speaker device outputs a feedback sound corresponding to the final input information of the user. The character input device is wearable on a wrist of the user. The bio radar unit transmits a signal, measures the distance between the character input device and a finger by measuring the strength of a reflected wave reflected from the finger with which the signal collides, and measures the angle of the finger related to activation.11-27-2008
20080291157Multi-direction input device - A multi-direction input device for computers includes a directional wheel, a sliding member and a plurality of electrodes. The directional wheel has one rotational degree of freedom and two translational degrees of freedom: it can be moved downwards, rolled forward, rolled backward and moved sideways, thereby connecting corresponding electrodes to generate corresponding signals. Under regular conditions the sliding member cooperates with a movable contact to support the directional wheel. When the directional wheel is moved downwards, the sliding member drives a corresponding electrode to generate a click signal.11-27-2008
20080303785DISPLAY APPARATUS AND METHOD FOR NOTIFYING USER OF STATE OF EXTERNAL DEVICE - A display apparatus for displaying the state of an external device and a method thereof are provided. According to the present invention, messages indicating the connection state, the power state and the sleep mode of the external device are displayed on a screen of the display apparatus to which a USB is applied. Therefore, the state of an external device which inputs a video signal to the display apparatus may be shown on the screen, and thus a user can easily know the state of the external device.12-11-2008
20080303784Information processing apparatus and computer-readable storage medium - An information processing apparatus includes a control portion and an IF portion. Haptic sense presentation devices are connected to the IF portion. The control portion calculates an area of an image object based on features of the image object and determines the calculated area of the image object as a virtual mass of the image object. The control portion calculates an acceleration of the image object based on the current and previous features of the image object. The control portion calculates a force to be presented to the haptic sense presentation device connected to IF portion based on the virtual mass and the acceleration of the image object and outputs a signal indicative of the calculated force to the haptic sense presentation device.12-11-2008
20080303783Touchless detection display - A touchless detection display includes a transparent display, a display face, a light detector and light control material. The transparent display displays information to a viewer. The viewer views the information through the display face. The light detector detects incoming light that travels into the touchless detection display through the display face. The light control material receives incoming light that travels into the touchless detection display through the display face before the incoming light reaches the light detector. The light control material prevents portions of the incoming light that are not traveling substantially perpendicular to the display face from reaching the light detector. Locations of objects close to but not touching the display face are detected by the touchless detection display based on the incoming light detected by the light detector.12-11-2008
20120293405DISPLAY DEVICE AND CONTROLLING METHOD - An image display device, and a controlling method, capable of optimizing the state of the image display device for a user at a desired position are provided. The display device includes: an imaging section that captures a dynamic image over a predetermined range in the image display direction; an image analyzing section that analyzes the dynamic image taken by the imaging section and calculates the position of a user; a system optimization processing section that calculates system control information for optimizing the system based on the position of the user calculated by the image analyzing section; and a system controlling section that optimizes the system based on the system control information calculated by the system optimization processing section.11-22-2012
20120293404Low Cost Embedded Touchless Gesture Sensor - An array of independently addressable optical emitters and an array of independently addressable detectors are energized according to an optimized sequence to sense a performed gesture, generating feature vector frames that are compressed by a projection matrix and processed by a trained model to perform touchless gesture recognition.11-22-2012
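The projection-matrix compression of a feature-vector frame is an ordinary matrix-vector product; the dimensions here are illustrative:

```python
def project(frame, matrix):
    """Compress one feature-vector frame with a fixed projection matrix;
    each row of the matrix yields one output dimension."""
    return [sum(w * x for w, x in zip(row, frame)) for row in matrix]
```

The compressed frames would then be fed, per gesture, to the trained classifier the abstract mentions.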
20120293403Method for Controlling Display Device by Using Double Buttons - A method for controlling a display device by using double buttons is provided. The method comprises the following steps: pressing a power button of the double buttons to turn on the display device when the display device is turned off; pressing a menu button of the double buttons to start an On Screen Display (OSD) menu on a screen of the display device when the display device is turned on; after entering the OSD menu, pressing the power button or the menu button and navigating the OSD menu;11-22-2012
20120293402MONITORING INTERACTIONS BETWEEN TWO OR MORE OBJECTS WITHIN AN ENVIRONMENT - One or more techniques and/or systems are provided for monitoring interactions by an input object with an interactive interface projected onto an interface object. That is, an input object (e.g., a finger) and an interface object (e.g., a wall, a hand, a notepad, etc.) may be identified and tracked in real-time using depth data (e.g., depth data extracted from images captured by a depth camera). An interactive interface (e.g., a calculator, an email program, a keyboard, etc.) may be projected onto the interface object, such that the input object may be used to interact with the interactive interface. For example, the input object may be tracked to determine whether the input object is touching or hovering above the interface object and/or a projected portion of the interactive interface. If the input object is in a touch state, then a corresponding event associated with the interactive interface may be invoked.11-22-2012
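The touching-versus-hovering determination from depth data can be sketched as thresholds on the gap between the fingertip depth and the interface-surface depth; the millimeter thresholds are illustrative:

```python
def classify_contact(finger_depth, surface_depth, touch_mm=5.0, hover_mm=40.0):
    """Classify the input object's state from depth-camera readings:
    'touch' when the fingertip is within touch_mm of the interface
    surface, 'hover' within hover_mm, otherwise 'away'."""
    gap = abs(surface_depth - finger_depth)
    if gap <= touch_mm:
        return "touch"
    if gap <= hover_mm:
        return "hover"
    return "away"
```

A touch state would then invoke the corresponding event on the projected interactive interface.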
20080309614USER INTERFACE WITH SOFTWARE LENSING FOR VERY LONG LISTS OF CONTENT - A user interface with software lensing may be described. An apparatus may include a user interface module to display an index list, a software lens list, and an aperture box. The index list may represent a list of available options. The software lens list may display a sub-set of the list of available options that coincides with a position of the aperture box on the index list. The apparatus may also include a media lensing module to increase a size of an option in the software lens list when a pointer approaches or coincides with the option. Other embodiments are described and claimed.12-18-2008
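The software-lens sub-set selection can be sketched as a window over the index list centered on the aperture box; the window span is an illustrative parameter:

```python
def lens_window(options, aperture_index, span=2):
    """Return the sub-set of the index list that coincides with the
    aperture box: the focused option plus `span` neighbours each side,
    clipped at the ends of the list."""
    lo = max(0, aperture_index - span)
    return options[lo:aperture_index + span + 1]
```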
20080204401Control apparatus - The present invention provides a control apparatus, which includes a casing with two sides each comprising a holding portion extending outwardly and towards a bottom thereof, for being held by a user while viewing a front of the casing; the front is inclined progressively from the lateral edge of the front away from the user, through the bottom, to the opposite lateral edge of the front near the user. The apparatus further includes a direction input portion disposed on the front of the casing near one of the holding portions; an operation input portion disposed on the front of the casing near the other holding portion; and a sensing input portion disposed on the front of the casing between the direction input portion and the operation input portion and proximate to the user. Thus, a user can easily press or push any input portion while holding the two holding portions by hand.08-28-2008
20090015549Accepting User Input - The invention is directed to an improvement in mechanisms and techniques for accepting user input. An apparatus for accepting a user input is described comprising a display, and a plate and a control knob positioned over the display. A portion of the display is visible through the plate, and a portion of the control knob is optically transparent such that information displayed by the display is visible through the control knob. In some examples, the control knob functions as both a rotary input and as a push button input. The control knob may function as a push button input through a transfer of force through the plate to a pressure sensing switch associated with the plate. The control knob may function as a rotary control through drive gears, belts, or interaction with a light emitter and detector.01-15-2009
20110205149MULTI-MODAL INPUT SYSTEM FOR A VOICE-BASED MENU AND CONTENT NAVIGATION SERVICE - A system and method for providing voice prompts that identify task selections from a list of task selections in a vehicle, where the user employs an input device, such as a scroll wheel, to activate a particular task and where the speed of the voice prompt increases and decreases depending on how fast the user rotates the scroll wheel.08-25-2011
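The filing above ties the speed of voice prompts to how fast the user rotates a scroll wheel. A minimal sketch of such a mapping in Python; the linear gain, base rate, and upper clamp are illustrative assumptions, not details from the application:

```python
def prompt_rate(wheel_speed, base_rate=1.0, gain=0.05, max_rate=2.5):
    """Map scroll-wheel speed (detents/sec) to a voice-prompt playback rate.

    Hypothetical linear mapping: faster scrolling -> faster prompts,
    clamped at max_rate so prompts stay intelligible.
    """
    return min(max_rate, base_rate + gain * abs(wheel_speed))
```

A real system would feed `prompt_rate` into a text-to-speech engine's rate parameter each time a new wheel-speed sample arrives.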
20080303786DISPLAY DEVICE - An object of the present invention is to achieve an advanced input operation without complicating image processing. A display device of the present invention includes a display unit, an optical input unit, and an image processor. The display unit displays an image on a display screen. The optical input unit captures an image of an object approaching the display screen. The image processor detects that the object comes into contact with the display screen on the basis of a captured image captured by the optical input unit, and then performs image processing to obtain the position coordinates of the object. In the display device, the image processor divides the captured image into a plurality of regions, and performs the image processing on each of the divided regions.12-11-2008
20090051648GESTURE-BASED MOBILE INTERACTION - In gesture-based mobile interaction, motion of a device is sensed using image data, and a gesture corresponding to the sensed motion of the device is recognized. Functionality of the device corresponding to the recognized gesture is determined and invoked.02-26-2009
20080316169METHOD CIRCUIT AND SYSTEM FOR INTERFACING WITH AN ELECTRONIC DEVICE - According to some embodiments of the present invention, there is provided an interface apparatus for a multi-application electronic device, including a human interface surface having integrated presentation and sensing elements, such that the device has substantially full functionality for substantially all applications without the use of other human interfaces.12-25-2008
20110001694OPERATION CONTROL APPARATUS, OPERATION CONTROL METHOD, AND COMPUTER PROGRAM - An operation control apparatus is provided which includes a detection unit for detecting contact of an operation tool with a display surface of a display unit, a contact determination unit for determining a contact state of the operation tool with the display surface based on the detection result by the detection unit, a contact area recognition unit for recognizing, in the case where it is determined by the contact determination unit that the operation tool is in contact with the display surface, a contact area where the operation tool is in contact with the display surface, and an operation determination unit for determining, from a plurality of operation processing associated with an act of the operation tool in contact with the display surface, an operation processing to be executed, based on a size of the contact area recognized by the contact area recognition unit.01-06-2011
20080266247WIRELESS CONTROL OF MULTIPLE COMPUTERS - A system comprises a plurality of computers, a first wireless input device adapted to control any of the computers via wireless communication, and selection logic coupled to the first wireless input device. The selection logic enables a user to select one of the computers to be controlled by the first input device.10-30-2008
20110006978IMAGE MANIPULATION BASED ON TRACKED EYE MOVEMENT - The disclosure relates to controlling and manipulating an image of an object on a display device based on tracked eye movements of an observer. When the object is displayed according to an initial view, the observer's eye movement is tracked and processed in order to determine the focus of the observer's attention or gaze on the image. Thereafter, the displayed image is modified to provide a better view of the part of the object in which the observer is most interested. This is accomplished by modifying at least one of the spatial positioning of the object within the viewing area, the angle of view of the object, and the viewing direction of the object.01-13-2011
20110006979SYSTEM FOR CONTROLLING BRIGHTNESS FLICKER OF PARALLAX BARRIER LCD THAT HAS WIDE VIEWING ANGLE AND METHOD THEREOF - A system for controlling brightness flicker of a parallax barrier LCD having a wide viewing angle, capable of minimizing brightness flicker by adjusting the time-dependent permittivity curve into a predetermined waveform when split barriers are turned on/off by movement of a viewer's viewing angle, and a method thereof. A method of controlling brightness flicker of a parallax barrier LCD having a wide viewing angle includes controlling brightness of a display providing a stereoscopic image by acquiring a real-time image of a viewer, recognizing an image of the viewer and extracting locations and coordinates of the eyes of the viewer, and controlling turn-on/off of split barrier electrodes.01-13-2011
20110006977SYSTEM AND METHOD FOR CONVERTING GESTURES INTO DIGITAL GRAFFITI - The subject disclosure provides a device, computer readable storage medium, and method for converting gestures undergone by a device into digital graffiti. The disclosure includes ascertaining an orientation of the device and a path traversed by the device. Gestures undergone by the device are identified as a function of the orientation and the path. Digital graffiti corresponding to the gestures are then superimposed onto a digital canvas.01-13-2011
20130215017METHOD AND DEVICE FOR DETECTING GESTURE INPUTS - A method is provided for detecting gesture inputs in response to a consecutive reciprocating movement before a detecting device, wherein the consecutive reciprocating movement is made of a first type of gesture and a second type of gesture, each capable of being recognized by the detecting device to output a different control signal. The method comprises the steps of receiving the consecutive reciprocating movement starting with the first type of gesture among the two types, wherein the first type of gesture and the second type of gesture occur alternately, and outputting control signals corresponding to the first type of gesture a number of times equal to the number of occurrences of the first type of gesture contained within the consecutive reciprocating movement.08-22-2013
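The method above emits one control signal per occurrence of the first type of gesture in a strictly alternating reciprocating sequence, suppressing the returning gesture. A hedged sketch; the string gesture labels and list representation are assumptions for illustration:

```python
def control_signals(sequence, first_type):
    """Given an alternating gesture sequence starting with `first_type`,
    emit one control signal per occurrence of the first type of gesture;
    the second (returning) gesture produces no signal.
    """
    if not sequence or sequence[0] != first_type:
        return []
    # verify strict alternation between exactly two gesture types
    for prev, cur in zip(sequence, sequence[1:]):
        if prev == cur:
            raise ValueError("gestures must alternate")
    count = sum(1 for g in sequence if g == first_type)
    return [first_type] * count
```

So a right-left-right sweep starting with "swipe_right" yields two "swipe_right" signals rather than a right, a left, and a right.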
20130215012UNDERWATER IMAGE PROJECTION DISPLAY SYSTEM, LIGHTING CONTROL SYSTEM AND DEVICE AND METHOD OF OPERATING SAME - An underwater image projection system submerged in a body of water and projecting an image within the body of water is provided, having an enclosure with a lens assembly. A projection element has a light source projecting an image within the body of water, with at least one light source steering the image. A system controller is coupled to and controls the projected light source, the projected light source steering device and the further image steering device. A user inputs image data to the controller through the user input device; the controller interprets the image data into a set of image control variables, executes control of the projected light source, image source and further image steering device in coordination, and projects the image through the projection element to project from underwater a static or animated image on an underwater surface of the body of water.08-22-2013
20130215014CAMERA BASED SENSING IN HANDHELD, MOBILE, GAMING, OR OTHER DEVICES - Method and apparatus are disclosed to enable rapid TV camera and computer based sensing in many practical applications, including, but not limited to, handheld devices, cars, and video games. Several unique forms of social video games are disclosed.08-22-2013
20130215016DYNAMIC IMAGE DISTRIBUTION SYSTEM, DYNAMIC IMAGE DISTRIBUTION METHOD AND DYNAMIC IMAGE DISTRIBUTION PROGRAM - Provided are a dynamic image distribution system, dynamic image distribution method, and dynamic image distribution program enabling the arbitrary setting of a viewing range for compression-encoded dynamic images and the interactive modification of the viewing range.08-22-2013
20130215007PORTABLE ELECTRONIC DEVICE AND CONTROL METHOD THEREOF - A portable electronic device including a control unit, an ambient light sensor, a proximity sensor and a panel is provided. The ambient light sensor is electrically connected to the control unit. The proximity sensor is electrically connected to the control unit. The panel is electrically connected to the control unit. When the portable electronic device is in a normal operation state and the panel is in an enabled state, if the ambient light sensor detects that the ambient light is dark and the proximity sensor detects that an object is closer than a predetermined distance, the control unit switches the portable electronic device into one of a sleep mode, a hibernation mode or a shutdown mode; otherwise, the portable electronic device remains in the normal operation state.08-22-2013
20130215006METHODS AND APPARATUS FOR AUTOMATIC TV ON/OFF DETECTION - Methods and apparatus are disclosed for automatic TV ON/OFF detection. An example method includes detecting a power state of an information presentation device. The method includes comparing, using a processor, a measurement indicative of an amount of power drawn by the information presentation device to a first threshold. The method also includes comparing the measurement to a second threshold. The method also includes storing an indication that the information presentation device is in an indeterminate state if the measurement is greater than the first threshold and less than the second threshold.08-22-2013
20110205152LOCATION-BASED INFORMATION - In response to a positional relationship (distance, or a combination of distance and angle) between an information output unit and a user who uses information displayed in the display unit, a control unit changes the amount of information to be displayed in the display unit based on an information level, the number of pieces of information, a scrolling speed or a cycle. In some aspects, the control unit controls the information output unit to increase the amount of information to be output when the user is close to the display unit.08-25-2011
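The control described above increases the amount of output information as the user gets closer to the display. A toy distance-to-item-count mapping; the 3 m cutoff, 12-item maximum, and linear falloff are assumptions, not values from the filing:

```python
def info_amount(distance_m, max_items=12, far_m=3.0):
    """Choose how many pieces of information to display based on the
    user's distance from the display: a closer user sees more items.
    """
    if distance_m >= far_m:
        return 1  # far away: show only a headline-level item
    frac = 1.0 - distance_m / far_m  # 1.0 when touching, 0.0 at far_m
    return max(1, round(max_items * frac))
```

The same scheme could instead modulate scrolling speed or refresh cycle, which the abstract lists as alternative controls.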
20110205148Facial Tracking Electronic Reader - Facial actuations, such as eye actuations, may be used to detect user inputs to control the display of text. For example, in connection with an electronic book reader, facial actuations and, particularly, eye actuations, can be interpreted to indicate when to turn a page, when to provide a pronunciation of a word, when to provide a definition of a word, and when to mark a spot in the text, as examples.08-25-2011
20090160762User input device with expanded functionality - The present invention can include systems and methods for expanding the functionality of user input devices. In particular, the present invention can expand the functionality of user input devices by changing the functions assigned to hardwired user input mechanisms responsive to user actuation of a function-change user input mechanism and/or responsive to automatic detection of an application change. Each hardwired user input mechanism can have an associated function indicator that visually indicates the function assigned to the hardwired user input mechanism. The present invention also can change the function indicated by the function indicators when there is a change in the functions assigned to the hardwired user input mechanisms.06-25-2009
20130120239INFORMATION PROCESSING DEVICE, CONTROL METHOD, AND PROGRAM - There is provided an information processing device including a display screen having flexibility; a deflection detection unit configured to detect deflection of the display screen; and a control unit configured to recognize a change in the deflection detected by the deflection detection unit as an on operation input and output a corresponding process command.05-16-2013
20130120240APPARATUS AND METHOD FOR CONTROLLING IMAGE DISPLAY DEPENDING ON MOVEMENT OF TERMINAL - A method for controlling image display based on movement of a terminal is provided. The method includes displaying any one image in an image list, when an image view feature runs; detecting movement of the terminal; and sequentially replacing and outputting images in the image list based on at least one of a moving direction and a moving angle of the terminal, upon detecting the movement of the terminal.05-16-2013
20130120241CONTROL DEVICE FOR PROVIDING A RECONFIGURABLE OPERATOR INTERFACE - A control device for providing a reconfigurable operator interface for a medical apparatus is provided. The control device comprises a control surface comprising at least one control actuator arranged on the control surface. The control device further comprises a plurality of modules arranged side-by-side and attached to each other via adjacent abutting side surfaces in a liquid-tight manner, wherein the plurality of modules comprise a first terminal module having a closing end surface and an abutting side surface opposite the closing end surface, and a second terminal module having an abutting side surface facing towards the abutting side surface of the first terminal module, and an opposite closing end surface. Operation of the control device is enabled when all the abutting side surfaces of the plurality of modules are in an attached state.05-16-2013
20130120242OPTICAL POINTING DEVICE AND ELECTRONIC DEVICE INCLUDING SAME - An optical pointing device is configured so that: an optical unit and a lens unit are integrally molded and form a portion of an optical cover; each of a light transmitting resin layer and the optical cover is made of a resin containing thermosetting resin as a main component; and a light shielding resin layer is made of a resin containing, as a main component, thermosetting resin and/or thermoplastic resin that has heat resistance. Thereby, the present invention provides an optical pointing device whose optical characteristics, reliability and heat resistance are excellent and whose number of component members is reduced.05-16-2013
20130120243DISPLAY APPARATUS AND CONTROL METHOD THEREOF - Disclosed are a display apparatus and a control method thereof, the display apparatus including: an image acquirer which acquires an image of a plurality of users; a display which displays the image acquired by the image acquirer; and a controller which selects a user making a predetermined gesture among the plurality of users in the image and controls the display apparatus to perform an operation corresponding to the selected user out of operations which are capable of being performed by the display apparatus when the image of the plurality of users is acquired through the image acquirer and the predetermined gesture is recognized from the acquired image.05-16-2013
20130120244Hand-Location Post-Process Refinement In A Tracking System - A tracking system having a depth camera tracks a user's body in a physical space and derives a model of the body, including an initial estimate of a hand position. Temporal smoothing is performed in which some latency is imposed when the initial estimate moves by less than a threshold level from frame to frame, while little or no latency is imposed when the movement is more than the threshold. The smoothed estimate is used to define a local volume for searching for a hand extremity to define a new hand position. Another process generates stabilized upper body points that can be used as reliable reference positions, such as by detecting and accounting for occlusions. The upper body points and a prior estimated hand position are used to define an arm vector. A search is made along the vector to detect a hand extremity to define a new hand position.05-16-2013
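The temporal-smoothing step above imposes latency only when the hand estimate moves by less than a threshold from frame to frame, and passes large moves through almost unsmoothed. A one-dimensional sketch; the threshold and damping factor are assumed values:

```python
def smooth_hand(prev, new, threshold=0.05, slow_alpha=0.2):
    """Temporal smoothing of a hand-position estimate: small frame-to-frame
    moves are heavily damped (jitter suppression at the cost of latency),
    while moves at or above the threshold are accepted immediately.
    """
    delta = new - prev
    alpha = slow_alpha if abs(delta) < threshold else 1.0
    return prev + alpha * delta
```

In a full tracker this would run per axis on the initial estimate before the local-volume search for the hand extremity.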
20130120245User Interface Devices - A method and apparatus of user interface (“UI”) having multiple motion dots capable of detecting user inputs are disclosed. In one embodiment, a digital processing system includes a first motion dot, a second motion dot, and a device. The first motion dot can be attached to a first location of a user's body and the second motion dot may be attached to a second location of the user's body. The first motion dot, for example, includes accelerometers able to identify a physical location of the first motion dot, and the second motion dot also includes accelerometers capable of detecting an input generated based on relative physical position between the first motion dot and the second motion dot. The device, which is logically coupled to the second motion dot via a wireless connection, is configured to store the input in a local storage.05-16-2013
20130120246METHOD AND APPARATUS FOR USING BIOPOTENTIALS FOR SIMULTANEOUS MULTIPLE CONTROL FUNCTIONS IN COMPUTER SYSTEMS - A biosignal-computer-interface apparatus and method. The apparatus includes one or more devices for generating biosignals based on at least one physiological parameter of an individual, and a computer-interface device capable of performing multiple tasks, including converting the biosignals into at least one input signal, establishing a scale encompassing different levels of the input signal, multiplying the input signal into parallel control channels, dividing the scale into multiple zones for each of the parallel control channels, assigning computer commands to each individual zone of the multiple zones, and generating the computer command assigned to at least one of the individual zones if the level of the input signal is within the at least one individual zone. The individual zones can be the same or different among the parallel control channels.05-16-2013
20130120247THREE DIMENSIONAL DISPLAY DEVICE AND THREE DIMENSIONAL DISPLAY METHOD - Provided is a three dimensional display device that includes: three dimensional image display unit (05-16-2013
20090128484INFORMATION PROCESSING APPARATUS, SCROLL CONTROL APPARATUS, SCROLL CONTROL METHOD, AND COMPUTER PROGRAM PRODUCT - According to one embodiment, an information processing apparatus including a display unit and an inclination detection unit detecting an inclination of its main body includes a direction instruction unit, a reference inclination storage unit, an inclination difference output unit, and a scroll unit. The direction instruction unit instructs a scroll direction in which a display range of the display unit is to be moved. The reference inclination storage unit stores, as a reference inclination, an inclination in the scroll direction among inclinations detected by the inclination detection unit. The inclination difference output unit outputs a difference between the reference inclination and an inclination in the scroll direction. The scroll unit moves the display range on the display unit according to the difference outputted from the inclination difference output unit.05-21-2009
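The scroll control above stores the inclination at the moment a scroll direction is instructed as a reference, then moves the display range in proportion to the difference from that reference. A minimal sketch; the pixels-per-degree gain is an assumed constant:

```python
class TiltScroller:
    """Sketch of inclination-difference scrolling: capture a reference
    inclination when scrolling starts, then scroll by the difference
    between the current inclination and the reference."""

    def __init__(self, gain=100.0):
        self.gain = gain        # assumed pixels per degree of tilt
        self.reference = None   # reference inclination, set on start

    def start_scroll(self, inclination_deg):
        self.reference = inclination_deg

    def scroll_pixels(self, inclination_deg):
        if self.reference is None:
            return 0.0  # no scroll direction instructed yet
        return self.gain * (inclination_deg - self.reference)
```

Holding the device at the reference angle thus stops scrolling, and tilting further past it speeds scrolling up, matching the difference-output behavior the abstract describes.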
20090128480Device for connecting an electronic unit to screens of different types without distinction, and a corresponding screen - The invention relates to a device for connecting an electronic unit (05-21-2009
20110267262Laser Scanning Projector Device for Interactive Screen Applications - One embodiment of the device comprises: (i) a laser scanning projector that projects light on a diffusing surface; (ii) at least one detector that detects, as a function of time, the light scattered by the diffusing surface and by at least one object entering the area illuminated by the scanning projector; and (iii) an electronic device capable of (a) reconstructing, from the detector signal, an image of the object and of the diffusing surface and (b) determining variation of the distance between the object and the diffusing surface.11-03-2011
20090160764Remote Control System - A remote control system, comprising a hand-operated remote control device and a control menu present in a display unit. Picking up the remote control device by hand activates a motion or position or push-button controlled user interface in the display unit. The remote control device is provided with an identification feature for its pick-up by hand, which activates the control menu and/or the remote control device. The motional and/or positional handling of the remote control device enables controlling the display unit's menus included in the user interface.06-25-2009
20090160765INPUTTING UNIT, INPUTTING METHOD, AND INFORMATION PROCESSING EQUIPMENT - According to one embodiment, an inputting unit for inputting a first control instruction to change a display status of a screen, includes: an acceleration acquiring section configured to acquire an acceleration applied to the inputting unit; an acceleration determining section configured to output a first signal when the acceleration exceeds a certain numerical value; a user's instruction acquiring section configured to allow a user to input a second signal at the inputting unit at any point of time; and a focus change determining section configured to output a second control instruction to move a focus in the screen when the first signal and the second signal are input.06-25-2009
20090160763Haptic Response Apparatus for an Electronic Device - A user input for an electronic device includes a haptic feedback layer (06-25-2009
20110205153DATA TRANSMISSION DEVICE, DATA TRANSMISSION METHOD, DATA COMMUNICATION SYSTEM, PROGRAM, AND RECORDING MEDIUM - A mobile phone (08-25-2011
20090128482Approach for offset motion-based control of a computer - A system for controlling a computing device. The system includes a plurality of sensed locations corresponding to a sensed object, a sensing apparatus to sense a position of the sensed locations relative to the sensing apparatus, and a motion control engine executable on the computing device. In response to receiving position data indicative of the position of the sensed locations from the sensing apparatus, the motion control engine generates an adjusted position based on the position data, wherein the adjusted position is offset from the position of the sensed locations and is fixed relative to the position of the sensed locations.05-21-2009
20090128481INTEGRATED SYSTEM WITH COMPUTING AND IMAGING CAPABILITIES - An integrated system with both imaging and computing capabilities comprises a light valve and a CPU, as well as other functional members for performing computing and imaging.05-21-2009
20100033426Haptic Enabled Gaming Peripheral for a Musical Game - A haptic enabled gaming peripheral that simulates a musical instrument includes a body, a first sensing element and a first actuator. A processor, located within the body of the gaming peripheral, communicates with a host computer running a software program corresponding to a musical game. The first sensing element, disposed within the body and coupled to the processor, senses an input from the user. The sensed input is communicated to the host computer. The first actuator, disposed within the body and coupled to the processor, outputs a haptic effect in response to receiving an activating signal based on an event that occurs in the software program. In some implementations, the first sensing element is disposed proximate to the first actuator so that the user perceives the haptic effect in response to providing the input.02-11-2010
20110025596STORAGE MEDIUM HAVING GAME PROGRAM STORED THEREIN, GAME APPARATUS, AND TILT ANGLE CORRECTION METHOD - An information processing device performs a game process based on a tilt angle of an input device that can be rotated to any tilt about a predetermined axis. First, a game apparatus calculates a tilt angle representing the tilt of the input device. Then, the game apparatus determines whether the calculated tilt angle has transitioned across the boundary between the upper limit value and the lower limit value of the tilt angle. If the tilt angle has transitioned across the boundary, the tilt angle to be used in a predetermined information process is corrected to a predetermined value that is on one side of the boundary on which the tilt angle was before crossing the boundary.02-03-2011
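The correction above detects a tilt reading that has crossed the boundary between the upper and lower limit values and clamps it to a value on the side it occupied before crossing. A sketch for a wrapping ±180° scale; the range and the half-span wrap test are assumptions, not values from the filing:

```python
def correct_tilt(prev_deg, new_deg, lower=-180.0, upper=180.0):
    """If the tilt reading jumps across the upper/lower-limit boundary
    (e.g. +179 -> -179 on a wrapping scale), clamp it to the limit on
    the side it was on before crossing; otherwise accept it as-is.
    """
    span = upper - lower
    if abs(new_deg - prev_deg) > span / 2:  # crossed the wrap boundary
        return upper if prev_deg > (lower + upper) / 2 else lower
    return new_deg
```

This keeps a game process that consumes the tilt angle from seeing a spurious 358° jump when the device rotates smoothly through the boundary.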
20120139832Head Pose Assessment Methods And Systems - Improvements are provided to effectively assess a user's face and head pose such that a computer or like device can track the user's attention towards a display device(s). Then the region of the display or graphical user interface that the user is turned towards can be automatically selected without requiring the user to provide further inputs. A frontal face detector is applied to detect the user's frontal face and then key facial points such as left/right eye center, left/right mouth corner, nose tip, etc., are detected by component detectors. The system then tracks the user's head by an image tracker and determines yaw, tilt and roll angle and other pose information of the user's head through a coarse-to-fine process according to key facial points and/or confidence outputs by the pose estimator.06-07-2012
20100277412Camera Based Sensing in Handheld, Mobile, Gaming, or Other Devices - Method and apparatus are disclosed to enable rapid TV camera and computer based sensing in many practical applications, including, but not limited to, handheld devices, cars, and video games. Several unique forms of social video games are disclosed.11-04-2010
20120068922Remote Control Functionality Including Information From Motion Sensors - According to some aspects, the invention provides methods and apparatuses for incorporating motion sensors into a full-function remote control. In addition to using movement for cursor location on an associated display, the invention can use the motion sensor information in many new and useful ways. As one example, information about movement along the ±Z axis can be used to activate a “zoom in” function when the remote is pointed toward the screen and a “zoom out” function when it is pulled back. As another example, a remote control incorporating the invention can include controls on two opposite sides, and the motion sensors can be used to activate controls on one side of the device and deactivate controls on the other side based on its orientation.03-22-2012
20120068920METHOD AND INTERFACE OF RECOGNIZING USER'S DYNAMIC ORGAN GESTURE AND ELECTRIC-USING APPARATUS USING THE INTERFACE - A method of recognizing a user's dynamic organ for use in an electric-using apparatus includes scanning a target image inputted through an imaging element using a window; generating a HOG descriptor of a region of the target image that is scanned when it is judged that the scanned region includes a dynamic organ; measuring a resemblance value between the HOG descriptor of the scanned region and a HOG descriptor of a query template for a gesture of the dynamic organ; and judging that the scanned region includes the gesture of the dynamic organ when the resemblance value meets a predetermined condition.03-22-2012
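The recognition method above measures a resemblance value between the HOG descriptor of the scanned region and that of a query template, and accepts the gesture when the value meets a predetermined condition. The filing does not name the measure; cosine similarity and the 0.8 threshold below are assumptions for illustration:

```python
import math

def hog_resemblance(desc_a, desc_b):
    """Resemblance between two HOG descriptors as cosine similarity,
    in [0, 1] for non-negative histograms (measure is an assumption)."""
    dot = sum(a * b for a, b in zip(desc_a, desc_b))
    na = math.sqrt(sum(a * a for a in desc_a))
    nb = math.sqrt(sum(b * b for b in desc_b))
    if na == 0 or nb == 0:
        return 0.0
    return dot / (na * nb)

def matches_gesture(desc, template, threshold=0.8):
    # judge that the scanned region includes the gesture when the
    # resemblance meets the predetermined condition (threshold assumed)
    return hog_resemblance(desc, template) >= threshold
```

In practice the descriptors would come from a HOG implementation such as OpenCV's `HOGDescriptor`, computed over the window-scanned region and the query template.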
20120068919SENSOR - A magnetic attachment mechanism and method is described. The magnetic attachment mechanism can be used to releasably attach at least two objects together in a preferred configuration without fasteners and without external intervention. The magnetic attachment mechanism can be used to releasably attach an accessory device to an electronic device. The accessory device can be used to augment the functionality or usefulness of the electronic device.03-22-2012
20120068917SYSTEM AND METHOD FOR DYNAMIC GESTURE RECOGNITION USING GEOMETRIC CLASSIFICATION - A gesture recognition system and method that inputs videos of a moving hand and outputs the recognized gesture states for the input sequence. In each image, the hand area is segmented from the background and used to estimate parameters of all five fingers. The system further classifies the hand image as one of the postures in the pre-defined database and applies a geometric classification algorithm to recognize the gesture. The system combines a skin color model with motion information to achieve real-time hand segmentation performance, and considers each dynamic gesture as a multi-dimensional volume and uses a geometric algorithm to classify each volume.03-22-2012
20090184921Input Through Sensing of User-Applied Forces - Methods and devices for providing a user input to a device through sensing of user-applied forces are described. A user applies forces to a rigid body as if to deform it and these applied forces are detected by force sensors in or on the rigid body. The resultant force on the rigid body is determined from the sensor data and this resultant force is used to identify a user input. In an embodiment, the user input may be a user input to a software program running on the device. In an embodiment the rigid body is the rigid case of a computing device which includes a display and which is running the software program.07-23-2009
20090140977Common User Interface Structure - A common user interface structure is described. In embodiment(s), a common user interface structure includes proportional geometry variables that can be adjusted such that the common user interface structure is scaled for display on media devices that each have different sized display screens. The common user interface structure includes a dimension control variable from which the proportional geometry variables are derived to scale the common user interface structure for display. The common user interface structure can also include menu item regions that include selectable content links to initiate rendering media content, and the menu item regions are scaled for display in the common user interface structure when the proportional geometry variables are adjusted.06-04-2009
20090015547Electronic Device with Physical Alert - An electronic device (01-15-2009
20110221672HAND-WORN CONTROL DEVICE IN AN AUGMENTED REALITY EYEPIECE - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content. A control device worn on a hand of a user of the eyepiece includes a control component that when actuated by a digit of the hand of the user, provides a control command from the actuation of the control component to the processor as a command instruction.09-15-2011
20110221666Methods and Apparatus For Gesture Recognition Mode Control - Computing devices can comprise a processor and an imaging device. The processor can be configured to support both a mode where gestures are recognized and one or more other modes during which the computing device operates but does not recognize some or all available gestures. The processor can determine whether a gesture recognition mode is activated, use image data from the imaging device to identify a pattern of movement of an object in the space, and execute a command corresponding to the identified pattern of movement if the gesture recognition mode is activated. The processor can also be configured to enter or exit the gesture recognition mode based on various input events.09-15-2011
20090251406System and Method for Selective Activation and Deactivation of an Information Handling System Input Output Device - An information handling system has a chassis and a lid, the lid rotating between a closed position and a tablet position. An indicator coupled to the lid, such as a magnet, aligns with a first detector coupled to a chassis of the information handling system, such as a Hall effect detector, in the tablet position so that the detector signals to a position detector module to disable a keyboard of the information handling system. The indicator aligns with a second detector coupled to the chassis in a closed configuration so that the second detector signals to the position detector module to enter a power down state.10-08-2009
20090066639VISUAL RESPONSES TO A PHYSICAL INPUT IN A MEDIA APPLICATION - A media application for providing outputs (e.g., audio outputs) in response to inputs received from an input device is provided. The media application may connect input mechanisms of an input device with parameters of channel strips (e.g., which may define output sounds) using an intermediate screen object. The media application may first assign an input mechanism to a screen object, and separately map a screen object to a channel strip parameter. The media application may map a screen object to several channel strips simultaneously such that, based on the value of the screen object, the volume of each of the several channel strips changes. The media application may provide a graphical representation of available channel strips using layers. As the media application accesses a channel strip, the appearance of the portion of the layer associated with the channel strip may change. The media application may also allow the patches, which may include several channel strips, to survive after a new patch is selected instead.03-12-2009
20090066637HANDHELD ELECTRONIC DEVICE WITH MOTION-CONTROLLED DISPLAY - A handheld electronic device includes a display, a memory configured to store a map, and a motion sensor configured to monitor the movement of the handheld electronic device. A controller is coupled to the display, the memory, and the motion sensor. The controller is configured to generate an image on the display representative of a portion of the map, the image having a field of view (FOV). The controller is also configured to adjust the FOV of the image based upon the movement of the handheld electronic device as detected by the motion sensor.03-12-2009
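The abstract above describes panning a map's field of view (FOV) in response to device movement. A minimal sketch of that idea, clamping the FOV center to the stored map's bounds; the `scale` factor and coordinate model are illustrative assumptions, not taken from the patent:

```python
def pan_fov(fov_center, map_size, device_delta, scale=1.0):
    """Shift the field-of-view center over a stored map by the device's
    detected movement, clamped to the map bounds (scale is assumed)."""
    cx, cy = fov_center
    dx, dy = device_delta
    w, h = map_size
    nx = min(max(cx + scale * dx, 0), w)
    ny = min(max(cy + scale * dy, 0), h)
    return (nx, ny)
```

In practice `device_delta` would come from integrating the motion sensor's readings between display updates.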
20090051647PORTABLE ELECTRONIC DEVICE WITH MOTION SENSING MODULE - A portable electronic device includes a display module for displaying a menu, a motion sensing module, and a controlling module. The menu has a plurality of menu options. The motion sensing module detects a motion of the portable electronic device imparted by a user and generates a trigger signal associated with the motion of the portable electronic device. The controlling module controls choosing and/or executing a menu option in response to the trigger signal.02-26-2009
20120169587ELECTRONIC DEVICE AND CONTROL METHOD FOR THE SAME - An electronic device of the present invention includes an input acceptance section that accepts an input of a command that causes any one of a plurality of operation states to be selected; a function section (07-05-2012
20120068921WIRELESS VIDEO HEADSET WITH SPREAD SPECTRUM OVERLAY - Enhanced Bluetooth and/or cellular frequency hopping radios are integrated into a hands-free wireless mobile computing and video display headset. Forms of these enhanced headsets incorporating the enhanced frequency hopping spread spectrum radio technology are of interest to military, police, fire fighters, first responders and certain commercial companies, such as utility companies seeking private cellular systems with enhanced communication privacy.03-22-2012
20120068918METHOD AND APPARATUS FOR ELECTRONIC READER OPERATION - Methods and apparatus are provided for operation of an electronic reader. In one embodiment, a method includes detecting a user command to initiate playback of the digital text, detecting a playback setting for the digital text based on the user command, displaying a first portion of the digital text by the electronic reader, and updating the display of the digital text by the electronic reader, wherein a second portion of the digital text is automatically displayed based on the playback setting for the digital text.03-22-2012
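The playback behavior above (automatically advancing the displayed portion of the text according to a playback setting) can be sketched as a simple paging rule; expressing the setting as `seconds_per_page` is an assumption, since the patent does not specify how the setting is encoded:

```python
def next_page(current_page, elapsed_s, seconds_per_page, total_pages):
    """Compute which portion (page) of the digital text to display,
    given elapsed playback time and a playback setting expressed as
    seconds per page (an illustrative assumption)."""
    advance = int(elapsed_s // seconds_per_page)
    # Never advance past the last page of the text.
    return min(current_page + advance, total_pages - 1)
```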
20110141009IMAGE RECOGNITION APPARATUS, AND OPERATION DETERMINATION METHOD AND PROGRAM THEREFOR - An image for an operator is extracted, and an operation determination unit employs a relationship, relative to a marker of an operation input system, for the operator, who is standing behind the marker when viewed by a video camera. When a body part of the operator comes to the front of an operation plane, as viewed by the video camera, the operation determination unit determines that an action for an input operation has been started, and examines the shape and the movement of each individual body part (an open hand, two fingers that are being held up, etc.), and determines whether the shape or the movement is correlated with one of operations that are assumed in advance.06-16-2011
20110141010GAZE TARGET DETERMINATION DEVICE AND GAZE TARGET DETERMINATION METHOD - The object at which a user is gazing is accurately determined from among objects displayed on a screen. A gaze target determination device (06-16-2011
20110141008MULTIMEDIA PLAYING DEVICE AND OPERATION METHOD THEREOF - A multimedia playing device and an operation method thereof are provided. The multimedia playing device stores an image having a plurality of multimedia options. The operation method sequentially includes the following steps. A portion of the image is defined as a projecting area. The portion of the image corresponding to the projecting area is projected. A movement of the multimedia playing device is sensed. According to the movement of the multimedia playing device, the projecting area is moved in the image. Thereby, the multimedia options are presented in a projecting mode, and a user can browse the multimedia options in a dynamic sensing mode, which facilitates search and selection.06-16-2011
20110141006DETECTING DOCKING STATUS OF A PORTABLE DEVICE USING MOTION SENSOR DATA - Methods for operating a portable media device are provided. The method includes determining an orientation angle of the portable media device and a frequency spectrum associated with a motion of the portable media player. Based on the orientation angle and the frequency spectrum, the portable media player can determine a motion status and select a mode of operation based on the motion status. In addition, the method also includes determining whether the portable media player is in a dock, resting on a surface, or being handled by a person. The method further includes determining whether the portable media player is located in a moving vehicle, in a stationary vehicle, being held by a moving person, or being held by a stationary person.06-16-2011
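The classification above combines an orientation angle with the frequency spectrum of the motion. A hedged sketch of such a classifier follows; every threshold (the energy floor, the upright-angle window, the frequency cutoff) is an illustrative assumption, as the patent does not give values:

```python
def classify_motion_status(orientation_deg, dominant_freq_hz, motion_energy):
    """Classify a portable media player's state from motion-sensor data.

    All thresholds below are illustrative assumptions.
    """
    if motion_energy < 0.05:
        # Essentially no motion: docked (held upright) or resting flat.
        return "docked" if 60 <= orientation_deg <= 120 else "resting"
    if dominant_freq_hz < 0.5:
        # Low-frequency sway suggests riding in a vehicle.
        return "in_vehicle"
    # Higher-frequency motion (e.g. a walking gait) suggests handling by a person.
    return "handled"
```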
20090096746Method and Apparatus for Wearable Remote Interface Device - A method and apparatus of using a wearable remote interface device capable of detecting inputs from movements are disclosed. The wearable remote interface device, which could be attached to a finger, a hand, or any other part of a body, includes a sensor, a filter, an input identifier, and a transmitter. The sensor, in one embodiment, is capable of sensing the movement of the finger or other body part to which the wearable remote interface device is attached. Upon detecting the various movements associated with the finger, the filter subsequently removes any extraneous gestures from the detected movements. The input identifier, which could be a part of the filter, identifies one or more user inputs from the filtered movements. The transmitter transmits the input(s) to a processing device via a wireless communications network.04-16-2009
20090096747Method of rolling picture using input device - A method of rolling a picture by using an input device is disclosed. The input device has a housing and a component rotatable relative to the housing; rotating the component produces an instruction signal for rolling the picture. The method includes a step of setting picture rolling, wherein the input device is set to have at least a first mode and a second mode for rolling the picture, each mode having a picture rolling displacement that corresponds to the picture rolling driven by a single instruction signal, with different modes having different displacements; and a step of deciding the mode, wherein a standard value is provided and compared with the number of instruction signals generated per unit time, and based on this comparison the instruction signal is decided to enter the first mode or the second mode.04-16-2009
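The mode-selection rule above (compare the rate of instruction signals per unit time against a standard value, and apply a different per-signal displacement in each mode) can be sketched as follows; the standard value and the two step sizes are illustrative assumptions:

```python
def pick_scroll_mode(signals_per_second, standard_value=5.0,
                     slow_step=1, fast_step=10):
    """Decide the picture-rolling mode by comparing the rate of rotation
    'ticks' (instruction signals per unit time) against a standard value.

    Returns the displacement applied per instruction signal. The
    standard_value and step sizes are illustrative assumptions.
    """
    if signals_per_second >= standard_value:
        return fast_step   # second mode: large displacement per tick
    return slow_step       # first mode: fine displacement per tick
```

Fast wheel spins thus scroll coarsely while slow turns scroll line by line, which is the usual motivation for such two-mode schemes.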
20120194415DISPLAYING AN IMAGE - Devices, methods, and systems for displaying an image are described herein. One or more device embodiments include a user interface configured to display an image, a motion sensor configured to sense movement of the device, and a processor configured to convert the movement of the device to a corresponding movement of the display of the image.08-02-2012
20110227821ELECTRONIC BOOK WITH BUILT-IN CARD SCANNER - An electronic book includes a housing having a first body section and a second body section pivotable with respect to each other, the first and second body sections together defining a pair of internal faces and a pair of external faces; a spine member coupling the first and second body sections, the spine member defining a longitudinal cavity therethrough and a longitudinal slot providing entry into the longitudinal cavity; a card slot provided on one of the external faces; a flexible display screen extending across both internal faces; microprocessor circuitry positioned in the housing between the card slot and the display screen; and a card scanner mounted in the housing between the microprocessor circuitry and the card slot, the card scanner facing away from the display screen and configured to scan a card inserted into the card slot and to convert a two-dimensional pattern on the card into data signals. In a pivoted position where the pair of internal faces oppose each other, a middle portion of the flexible display screen extends into the longitudinal cavity of the spine member via the longitudinal slot to form a loop.09-22-2011
20110227819INTERACTIVE THREE-DIMENSIONAL DISPLAY SYSTEM AND METHOD OF CALCULATING DISTANCE - An interactive three-dimensional display system includes a three-dimensional display panel which has an optical sensor array, an interactive device which includes a projection light source and a shadow mask, and an image recognizing unit. The shadow mask has a pattern to define an image projected by the interactive device. The image is captured by the optical sensor array. The pattern includes two strip patterns which cross each other. The image includes two strip images which cross each other. The image recognizing unit is electrically connected with the optical sensor array and calculates relative positions of the interactive device and the three-dimensional display panel according to the image. A method of calculating the relative positions includes calculating according to the lengths of one of the strip patterns and one of the strip images, and a divergent angle and tilt angle of the projection light source.09-22-2011
20110227820LOCK VIRTUAL KEYBOARD POSITION IN AN AUGMENTED REALITY EYEPIECE - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content. The displayed content is an interactive control element. An integrated camera facility of the eyepiece images the surrounding environment, and identifies a user hand gesture as an interactive control element location command, wherein the location of the interactive control element remains fixed with respect to an object in the surrounding environment, in response to the interactive control element location command, regardless of a change in the viewing direction of the user.09-22-2011
20090231272VIRTUAL HAND: A NEW 3-D HAPTIC INTERFACE AND SYSTEM FOR VIRTUAL ENVIRONMENTS - The present invention discloses the creation of a virtual hand, body part, or tool in a virtual environment, controlled by a new 3-D haptic interface for virtual environments. There is also the provision of a method and arrangement applicable to computer systems for creating a virtual environment, which facilitates a user to touch, feel, edit, and interact with data about the objects, surfaces, and textures in the environment. There is also a provision for multiple virtual hands operating in the same virtual world, so that users can work collaboratively in the virtual world; they can touch, feel and edit the data in the virtual world, including data about the other users' virtual hands.09-17-2009
20090231271Haptically Enabled User Interface - A device has a user interface that generates a haptic effect in response to user inputs or gestures. In one embodiment, the device receives an indication that the user is scrolling through a list of elements and an indication that an element is selected. The device determines the scroll rate and generates a haptic effect that has a magnitude that is based on the scroll rate.09-17-2009
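The abstract above says only that the haptic magnitude "is based on the scroll rate." One plausible realization is a clamped linear mapping; the base level, gain, and clamp below are illustrative assumptions:

```python
def haptic_magnitude(scroll_rate, base=0.2, gain=0.1, max_mag=1.0):
    """Map a list scroll rate (items/second) to a haptic-effect magnitude
    in [0, 1]. The linear base + gain mapping and the clamp are
    illustrative; the patent does not specify the function.
    """
    return min(max_mag, base + gain * scroll_rate)
```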
20090231270CONTROL DEVICE FOR INFORMATION DISPLAY, CORRESPONDING SYSTEM, METHOD AND PROGRAM PRODUCT - The invention concerns a control device for an information display including a video sensor for receiving image information and a processor for determining control information within said image information, wherein subsequent data transmission between said control device and said information display is dependent upon the origin of the received image information as determined by the control information. In addition, a system, a method and a program product are also targets of the present invention.09-17-2009
20090231269INPUT DEVICE, SIMULATED EXPERIENCE METHOD AND ENTERTAINMENT SYSTEM - A retroreflective sheet 09-17-2009
20110221673SYSTEM AND METHOD FOR MONITORING A MOBILE COMPUTING PRODUCT/ARRANGEMENT - Described is a system and method for monitoring a mobile computing arrangement. The arrangement may include a sensor and a processor. The sensor detects first data of an event including a directional orientation and a motion of the arrangement. The processor compares the first data to second data to determine if at least one predetermined procedure is to be executed. The second data may include a predetermined threshold range of changes in the directional orientation and the motion. If the predetermined procedure is to be executed, the processor selects the predetermined procedure which corresponds to the event as a function of the first data. Subsequently, the predetermined procedure is executed.09-15-2011
20110221669GESTURE CONTROL IN AN AUGMENTED REALITY EYEPIECE - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content. The eyepiece includes an integrated camera facility that images a gesture, wherein the integrated processor identifies and interprets the gesture as a command instruction.09-15-2011
20130215015SYSTEM AND METHODS FOR ENHANCED REMOTE CONTROL FUNCTIONALITY - A hand-held device having a touch sensitive surface uses a relative distance from an origin location to each of a plurality of touch zones of the touch sensitive surface activated by a user to select one of the plurality of touch zones as being intended for activation by the user.08-22-2013
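The abstract above disambiguates among several activated touch zones by their relative distance from an origin location. One natural reading is "pick the activated zone nearest the origin"; that selection rule, and the zone representation below, are assumptions on my part:

```python
import math

def select_touch_zone(origin, activated_zones):
    """Pick the intended touch zone as the activated zone whose center
    is closest to the origin location. The nearest-center rule is an
    assumed interpretation of the patent's relative-distance criterion.

    activated_zones: list of {"id": ..., "center": (x, y)} dicts.
    """
    return min(activated_zones,
               key=lambda z: math.dist(origin, z["center"]))["id"]
```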
20130215013MOBILE COMMUNICATION TERMINAL AND METHOD OF GENERATING CONTENT THEREOF - A mobile communication terminal and method thereof capable of generating content data according to a synchronization scheme suitable for a mobile environment are provided, which allow a user to simply create and share content. The method includes receiving a user input or selection instruction for a plurality of content; determining whether there is sound data among the plurality of content; and if there is sound data, generating content data by synchronizing first content to be displayed while the sound data is played among the plurality of content to first segment data.08-22-2013
20130215011ELECTRONIC DEVICE AND METHOD FOR CONTROLLING THE SAME - An electronic device and a method for controlling the same are disclosed. The electronic device has a flexible display screen, and the flexible display screen has a first display region. The method for controlling the electronic device includes: detecting whether the flexible display screen is bent to generate detection information; and dividing a first display region into a first display sub-region and a second display sub-region in the case that the detection information indicates that the flexible display screen is bent, where the first display sub-region is independent of the second display sub-region.08-22-2013
20130215008PORTABLE ELECTRONIC DEVICE AND CONTROL METHOD THEREOF - A portable electronic device including a base, a lid and a hinge is provided. The base has a control unit and a proximity sensor. The proximity sensor is electrically connected to the control unit. The lid has a panel electrically connected to the control unit. The hinge is used for connecting the base with the lid. When the proximity sensor detects that the angle between the lid and the base exceeds a predetermined angle, the control unit performs a predetermined operation.08-22-2013
20080291159Computer Input Device and Method for Operating the Same - A computer input device with context-awareness functionality generates a lighting pattern in response to a context of data inputting, audio, new message receiving, an arranged event, or system information in a computer system.11-27-2008
20080291156Sanitary User Interface - A user interface is configured to detect an attempt to touch a virtual button. The interface includes a first concave mirror facing a second concave mirror. The second concave mirror includes an aperture. A physical control button is arranged proximate to the first mirror and aligned such that an image of the control button in a form of a virtual button appears at the aperture. An attempt to touch the virtual button is detected, and feedback is generated at the virtual button in response to detecting the attempt to touch the virtual button by a user.11-27-2008
20090002314Tactile sense presentation device and tactile sense presentation method - A tactile sense presentation device is disclosed that drives a tactile sense unit to present a tactile sense to an operator. The device includes a location detection unit that detects the location of the tactile sense unit and a drive unit that drives the tactile sense unit with a thrust in accordance with the location detected by the location detection unit. The device controls the direction and the size of the thrust applied to the tactile sense unit in accordance with the location of the tactile sense unit detected by the location detection unit.01-01-2009
20090102785Six-Direction Button - The six-direction button includes a base, a rotation detector having a rotation pin disposed on the base, a button cover coupled with the rotation pin, and four pressure detectors disposed on the base under the button cover. The button cover and the rotation pin may be rotated clockwise or counterclockwise. The pressure detectors are arranged according to an up side, a down side, a left side, and a right side of the button. The button cover may touch one of the four pressure detectors when the button cover is pressed.04-23-2009
20090213067INTERACTING WITH A COMPUTER VIA INTERACTION WITH A PROJECTED IMAGE - Embodiments of the present invention address deficiencies of the art in respect to user interfaces and provide a novel and non-obvious system for interacting with a computer via a projected image. In one embodiment of the invention, the system includes a projector for generating a projected image onto a surface, wherein the projected image corresponds to a first image on a display of the computer. The system further includes a sensor for sensing a human interaction with the projected image and generating a first information representing the human interaction and a transmitter for transmitting the first information to the computer. The system further includes a program on the computer that receives the first information and translates it into a second information representing a human interaction with the first image.08-27-2009
20090213068Ergonomic Pointing Device - An input device for a computer is described that positions the user's hand in a more ergonomically desirable position, i.e., at an angle of about 45° to the work surface. In preferred embodiments, the input device accommodates either a user's left or right hand, and in either case, positions the hand in an ergonomically desirable position. In another embodiment, the length of the input device of the present invention is adjusted for the size of the user's hand. In further embodiments, the input device of the present invention provides a palm rest. Other desirable features included in preferred embodiments include lateral buttons that are positioned one above the other.08-27-2009
20110227822FLEXIBLE DEVICES AND RELATED METHODS OF USE - Disclosed are devices which include flexible display sheets or other flexible elements, whereby physically manipulating said flexible display sheets or elements may facilitate interaction with said device. Flexibility features may be employed to provide methods of interaction which include manipulating a flexible section. Some of said methods of interaction are disclosed. Further disclosed are units which can connect to flexible display sheets, for interacting with said flexible display sheets.09-22-2011
20090251407DEVICE INTERACTION WITH COMBINATION OF RINGS - The claimed subject matter provides a system and/or a method that facilitates interacting with a device and/or data associated with the device. A computing device can display a portion of data. A ring component can interact with the portion of data to control the device by detecting at least one of a movement, a gesture, an inductance, or a resistance related to a user wearing the ring component on at least one digit on at least one hand.10-08-2009
20090251408IMAGE DISPLAY DEVICE AND METHOD OF DISPLAYING IMAGE - Under an LCD panel (10-08-2009
20120139831METHOD AND APPARATUS FOR SWITCHING INPUT METHODS OF A MOBILE TERMINAL - The present invention provides a method and apparatus for switching input methods of a mobile terminal, the method comprises: Step A: receiving an operation instruction from a key to call out an input method selecting interface; Step B: reading a selection of an input method in the input method selecting interface; Step C: entering an editing environment of the selected input method; Step D: judging whether a plurality of input characters, which are not yet interpreted before switching the input method, are meaningful to the input method after switching; and Step E: if the plurality of input characters are meaningful, then interpreting the input character which is not yet interpreted by using the input method after switching. A user can input information into a mobile terminal more conveniently and rapidly through the present invention, and the input speed and user experience are improved.06-07-2012
20090256800VIRTUAL REALITY SIMULATOR HARNESS SYSTEMS - The inventions are directed to assemblies for interfacing three-dimensional movements of a person to a virtual environment or to a remote environment. The harness assemblies maintain the user in a desired location with respect to the virtual reality system, thereby allowing the virtual reality system to capture the movements of the user. The assemblies include a frame subsystem, a pivot subsystem, a cable management subsystem, a compliance subsystem, a vertical motion subsystem, a centering adjustment subsystem, a support arm subsystem, and a human restraint subsystem.10-15-2009
20090256801SYSTEM AND METHOD THAT GENERATES OUTPUTS - The present invention relates to a system and method that generates outputs based on the operating position of a sensor, which is determined by the biomechanical positions or gestures of individual operators. The system includes a garment on which one or more sensors are removably attached; the sensors provide a signal based on the biomechanical position, movement, action or gestures of the person wearing the garment, and a transmitter receives signals from the sensors and sends signals to a computer that is calibrated to recognise the signals as representing particular positions that are assigned selected outputs. Suitably the outputs are audio outputs of an instrument, such as a guitar, and the outputs simulate the sound of a guitar that would be played when the biomechanical motion, action, gesture or position of the operator resembles those that would occur when an actual instrument is played.10-15-2009
20100149090GESTURES, INTERACTIONS, AND COMMON GROUND IN A SURFACE COMPUTING ENVIRONMENT - Aspects relate to detecting gestures that relate to a desired action, wherein the detected gestures are common across users and/or devices within a surface computing environment. Inferred intentions and goals based on context, history, affordances, and objects are employed to interpret gestures. Where there is uncertainty in intention of the gestures for a single device or across multiple devices, independent or coordinated communication of uncertainty or engagement of users through signaling and/or information gathering can occur.06-17-2010
20090243999DATA PROCESSING DEVICE - A hard disk drive (HDD) stores a program for causing a CPU to execute driver seat processing as processing corresponding to presence coordinates (x(k), y(k)) of an object when the presence direction of the object, determined on the basis of detection values of the proximity sensors, is the direction to the driver seat, and passenger seat processing as processing corresponding to the presence coordinates (x(k), y(k)) of the object when the presence direction of the object is the direction to the passenger seat. The HDD also stores driver seat operation item data, passenger seat operation item data, a driver seat operation table, a passenger seat operation table, and the like.10-01-2009
20120194424SYSTEM AND METHOD FOR NAVIGATING A MOBILE DEVICE USER INTERFACE WITH A DIRECTIONAL SENSING DEVICE - An electronic mobile device includes a display for displaying a graphical element. A tilt sensor is configured to sense first and second tilt angles of the mobile device. A processor is coupled to the display and the tilt sensor and configured to move the graphical element relative to the display in a first direction based on the first tilt angle, and to move the graphical element relative to the display in a second direction based on the second tilt angle.08-02-2012
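The tilt-navigation scheme above (move a graphical element in one direction per tilt axis) can be sketched in a few lines; the sensitivity, the dead zone that suppresses hand jitter, and the axis-to-direction mapping are illustrative assumptions:

```python
def move_element(pos, tilt_x_deg, tilt_y_deg, sensitivity=2.0, dead_zone=3.0):
    """Translate a graphical element based on two tilt angles (degrees).

    Tilt about one axis moves the element horizontally, tilt about the
    other moves it vertically. sensitivity and dead_zone are assumed
    values, not specified in the patent.
    """
    x, y = pos
    if abs(tilt_x_deg) > dead_zone:
        x += sensitivity * tilt_x_deg
    if abs(tilt_y_deg) > dead_zone:
        y += sensitivity * tilt_y_deg
    return (x, y)
```

A dead zone of a few degrees keeps the element still while the device is held roughly level.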
20120194423MULTI COLOURS DEVICE ILLUMINATION - A portable device having within it a multicolour illumination arrangement comprising: a surface; a plurality of light sources, at least one of the plurality of light sources being capable of generating two or more emission colours; and drive means for causing the emission colour of the at least one light source to vary; whereby the illumination arrangement can produce a varying illumination through at least part of the surface.08-02-2012
20120194422METHOD AND SYSTEM FOR VISION-BASED INTERACTION IN A VIRTUAL ENVIRONMENT - Method, computer program and system for tracking movement of a subject. The method includes receiving data from a distributed network of camera sensors employing one or more emitted light sources associated with one or more of the one or more camera sensors to generate a volumetric three-dimensional representation of the subject, identifying a plurality of clusters within the volumetric three-dimensional representation that correspond to motion features indicative of movement of the motion features of the subject, presenting one or more objects on one or more three dimensional display screens, and using the plurality of fixed position sensors to track motion of the motion features of the subject and track manipulation of the motion features of the volumetric three-dimensional representation to determine interaction of one or more of the motion features of the subject and one or more of the one or more objects on the three dimensional display.08-02-2012
20120194421TERMINAL AND METHOD FOR PROVIDING USER INTERFACE LINKED TO TERMINAL POSTURE - A terminal and method for providing user interfaces linked to postures of a terminal, whereby a key input signal is received from any one of function keys located at both sides of the terminal, a posture of the terminal is determined in response to the received key input signal, a user interface screen corresponding to the determined posture of the terminal is provided, and settings of the terminal are changed according to the determined posture of the terminal, thereby increasing convenience of a user interface without use of a complicated sensor system.08-02-2012
20120194420AR GLASSES WITH EVENT TRIGGERED USER ACTION CONTROL OF AR EYEPIECE FACILITY - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content, wherein the eyepiece includes event triggered user action control.08-02-2012
20120194419AR GLASSES WITH EVENT AND USER ACTION CONTROL OF EXTERNAL APPLICATIONS - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content, wherein the eyepiece includes event and user action control of external applications.08-02-2012
20120194418AR GLASSES WITH USER ACTION CONTROL AND EVENT INPUT BASED CONTROL OF EYEPIECE APPLICATION - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content, wherein the eyepiece includes user action control and event input based control of an eyepiece application.08-02-2012
20120194416ELECTRONIC APPARATUS AND METHOD OF CONTROLLING ELECTRONIC APPARATUS - One embodiment provides an electronic apparatus, including: a direction detector disposed in a housing which accommodates electronic components therein and configured to detect a direction of the housing; and a power-saving-mode shift controller configured to control a shift to a power saving mode upon detection of a turned-down direction of the housing.08-02-2012
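A common way to detect the "turned-down direction" above is the z-axis accelerometer reading: near -1 g, gravity points out of the display face, i.e. the housing is face down. The threshold and axis convention below are assumptions, not taken from the patent:

```python
def should_enter_power_save(accel_z_g, threshold=-0.8):
    """Detect a face-down ('turned-down') housing from the z-axis
    accelerometer reading in g. threshold is an illustrative assumption:
    readings near -1 g indicate the display faces the surface below.
    """
    return accel_z_g <= threshold
```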
20120105316Display Apparatus - The display apparatus has a displaying unit which displays an image; a first specifying unit which specifies the direction of the person in the image captured by a plurality of imaging apparatuses arranged at different positions; and a second specifying unit which specifies one of the plurality of imaging apparatuses for capturing and displaying an image, in accordance with the direction instructed by the user and the direction specified by the first specifying unit.05-03-2012
20120105315Virtual Controller For Visual Displays - Virtual controllers for visual displays are described. In one implementation, a camera captures an image of hands against a background. The image is segmented into hand areas and background areas. Various hand and finger gestures isolate parts of the background into independent areas, which are then assigned control parameters for manipulating the visual display. Multiple control parameters can be associated with attributes of multiple independent areas formed by two hands, for advanced control including simultaneous functions of clicking, selecting, executing, horizontal movement, vertical movement, scrolling, dragging, rotational movement, zooming, maximizing, minimizing, executing file functions, and executing menu choices.05-03-2012
20120105314METHOD AND SYSTEM FOR INPUTTING INFORMATION USING ULTRASONIC SIGNALS - Provided are an information input system and an information input method using an ultrasonic signal. In the system or method, a signal generation apparatus (which generates a reference signal and an ultrasonic signal) or an information input apparatus (which generates a reference signal) receives a reference signal and an ultrasonic signal generated by a different signal generation apparatus or information input apparatus inputting information in an adjacent area. It then determines whether there is a possibility of interference from the signals generated by that other apparatus and, if so, adaptively changes the effective section where its own reference signal and ultrasonic signal are generated to an idle section in the same period or the next period, so that signal interference is prevented. Therefore, even in an environment where a plurality of the information input systems are present, information can be input stably.05-03-2012
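The ranging principle behind ultrasonic input systems like the one above is time-of-flight: the reference signal (typically RF or infrared) arrives almost instantly, while the ultrasonic signal travels at the speed of sound, so the arrival gap encodes distance. A minimal sketch, with assumed names and parameters not taken from the patent:

```python
# Illustrative time-of-flight ranging sketch. The constant and function
# names are assumptions for illustration, not the patent's actual design.

SPEED_OF_SOUND = 343.0  # metres per second in air at roughly 20 C

def distance_from_arrival_gap(t_reference: float, t_ultrasonic: float) -> float:
    """Distance (m) to the emitter, from the reference-signal arrival time
    and the later ultrasonic arrival time (both in seconds)."""
    gap = t_ultrasonic - t_reference
    if gap < 0:
        raise ValueError("ultrasonic signal cannot arrive before the reference")
    return SPEED_OF_SOUND * gap

# A 10 ms gap corresponds to roughly 3.43 m:
print(round(distance_from_arrival_gap(0.0, 0.010), 2))
```

With distances from two or more receivers at known positions, the emitter's coordinates can then be recovered by trilateration.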
20100013759KVM SWITCH WITH SEPARATE ON-SCREEN DISPLAY AND CONTROL CHANNELS - The present invention relates to a KVM switch with separate on-screen display and control channels, in particular a KVM switch connected to a plurality of computers, monitors and operation devices. The KVM switch comprises a switch circuit connected with the computers, a video switch circuit connected with the switch circuit for sending a signal to the monitors, a separate control system connected to the video switch circuit for receiving function commands from the operation devices, and a video control and display unit connected to the separate control system for producing and sending a video signal and a synchronous signal to the monitors through the video switch circuit. Because the video control and display unit can transmit the video signal containing OSD (on-screen display) functions and display contents to the video switch circuit through a separate channel, and the separate control system also sends a switching signal to the video switch circuit to shut off the video signal from the computers, an OSD image displayed on the monitors has fixed settings and is easy for a user to view and operate.01-21-2010
20100013757IMAGE PROCESSING DEVICE AND IMAGE PROCESSING METHOD - The operator of an image processing apparatus can easily search for desired image parts in a series of images and carry out editing operations. The image processing apparatus includes an image generating means for generating, from image data, display video data for a plurality of images to be displayed in a plurality of image display sections on a display screen; a display type determining means for determining display types, indicating display modes of the images, on a picture-by-picture or GOP-by-GOP basis according to variations expressing the extent of change of the image data; a parameter altering means for altering the display parameters or reproduction parameters corresponding to the display video data according to the type information expressing the display types determined by the display type determining means; and an image processing means for displaying the images as moving images on the display screen with time lags in the display sequence, using the display parameters or reproduction parameters altered by the parameter altering means.01-21-2010
20090315825 INFRARED VIRTUAL, INVISIBLE COMPUTER KEYBOARD AND MOUSE - A new design for an infrared virtual, invisible computer keyboard and mouse for a mouse-less and keyboard-less computer is presented. The invention uses the fact that the infrared spectrum of human fingers changes when they are irradiated with low-power diodes. Using this fact, the present invention provides a method in which a human finger's infrared spectrum, irradiated by an array of infrared diodes, is picked up by infrared sensors installed either stand-alone, on top of the computer display, or directly in the computer display. The computer then uses the finger infrared spectra picked up by the infrared sensors to create a virtual, invisible mouse and keyboard, with which words can be drawn in the air and mouse commands given by moving the finger in different directions in the air.12-24-2009
20100265170INPUT DEVICE, INFORMATION TERMINAL PROVIDED WITH THE SAME AND INPUT METHOD - For mobile apparatuses in which portability is considered important, or in which a display unit such as a display is considered important, the input device is configured so that input can be carried out without requiring skill from the operator, even if the input unit of the apparatus is made small. The input device is provided with an input unit including a detecting unit that, when a part of a living body in contact with the input device is pushed, detects the force transmitted through the living body and outputs detection data, and an input information specifying module that, on receiving the detection data, refers to stored data in a database, specifies the position where the living body is pushed, and outputs the data allotted to that position as input information of electronic data.10-21-2010
20100149094Snow Globe Interface for Electronic Weather Report - A portable computing device for displaying weather information. The device includes a transceiver configured to send and receive weather information, a display controller configured to generate a weather scene display including the weather information based on a shaking input provided to the portable computing device, and a display configured to present the generated weather scene display. The display controller is configured to display flitter that obscures the generated weather scene display while the weather information is being received and/or the generated weather scene display is being updated.06-17-2010
20110128219INTERACTIVE INPUT SYSTEM AND BEZEL THEREFOR - An interactive input system comprises at least one imaging device having a field of view looking into a region of interest. At least one radiation source emits radiation into the region of interest. A pliable bezel at least partially surrounds the region of interest. The pliable bezel has a reflective surface in the field of view of said at least one imaging device.06-02-2011
20090122007INPUT DEVICE, CONTROL METHOD OF INPUT DEVICE, AND PROGRAM - Disclosed herein is an input device including a target creating section, a performing section, and a height information generating section, wherein the target creating section generates information on the target to which the height information is added, and the performing section performs predetermined processing on the basis of the height information added to the information on the target.05-14-2009
20100156782Hand Control Image For Replication - A system and method for a display in the gaze-forward position of a vehicle operator, showing hand position relative to various control inputs or hand positions relative to a control surface. The hand replication system includes a control input device, a display displaying an image of the control input device, and an optical input device directed at capturing an actual image of the control input device and providing the actual image output to a display or a processor. The method of providing an image of an operator's hand relative to a control panel within the line of sight of the vehicle operator includes the steps of providing a display, providing an optical input device, obtaining an actual image of the control panel with the optical input device, and providing the actual image of the control panel to the display.06-24-2010
20100156781EYE GAZE CONTROL DURING AVATAR-BASED COMMUNICATION - An avatar image on a device display, such as a cell phone or laptop computer, maintains natural and realistic eye contact with a user (a human being) while the user is communicating with the other human being, whose avatar is displayed on the device. Thus, the user and the avatar have natural eye contact during the communication session (e.g., phone call). Modules within the device ensure that the avatar eyes do not maintain a fixed gaze or stare at the user constantly and that the avatar looks away and changes head and eye angles in a natural manner. An imager in the device captures images of the user and tracks the user's eyes. This data is inputted to an avatar display control module on the device which processes the data, factors in randomness variables, and creates control signals that are sent to the avatar image display component on the device. The control signals instruct how the avatar eyes should be positioned.06-24-2010
20120139830APPARATUS AND METHOD FOR CONTROLLING AVATAR USING EXPRESSION CONTROL POINT - An apparatus and method for controlling an avatar using expression control points are provided. The apparatus may track positions of feature points and a head movement from an expression of a user, remove the head movement from the positions of the feature points, and generate expression control points. Accordingly, the expression of the user may be reflected to a face of the avatar.06-07-2012
20100182231INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND INFORMATION PROCESSING PROGRAM - An information processing apparatus includes a sensor portion, a judgment portion, and an output portion. The sensor portion detects three-dimensional coordinates designated by a spatially-apart detection target object. The judgment portion determines an area designated in advance, that includes the three-dimensional coordinates detected by the sensor portion. The output portion outputs, based on a result of the judgment by the judgment portion, audio corresponding to audio information from a position corresponding to at least two-dimensional coordinates out of the three-dimensional coordinates.07-22-2010
20090079690METHOD AND APPARATUS FOR ENHANCING ENTERTAINMENT SOFTWARE THROUGH HAPTIC INSERTION - A method for enhancing entertainment through haptic insertion includes monitoring signal(s) during the execution of entertainment software, recognizing that the monitored signal(s) satisfy predetermined criteria, and generating a haptic control signal in response to enhance an entertainment experience. Monitored signals may include, for example, audio signals, video signals, data signals, control signals, and the like. Entertainment software may include, for example, a video game, an audio-visual work, an audio work, and the like. A device for enhancing entertainment software through haptic insertion includes at least one processor and an output unit coupled to the processor(s) and including a haptic control output. The processor(s) are configured to monitor at least one signal during the execution of entertainment software, to recognize that the monitored signal(s) satisfy a predetermined criterion, to generate a haptic control signal in response to such recognition, and to output the generated haptic control signal through the haptic control output of the output unit.03-26-2009
20090289893Finger appliance for data entry in electronic devices - The appliance is adapted to be mounted on a finger having a finger tip and a finger pad. It includes a finger engaging portion having first and second side members including oppositely oriented, spaced arcuate members and a force application member extending between the first and second side members of the finger engaging portion and abutting the finger tip. The force application member has a blunt surface adapted to contact a key of an electronic device. The blunt surface may have one or more ribs to provide a high friction surface. The appliance defines an opening aligned with the finger pad, so as not to obstruct the finger pad when mounted on the finger.11-26-2009
20120032878INPUT APPARATUS USING A CONDUCTIVE RUBBER MEMBER - A data input apparatus using a conductive rubber member is provided. The apparatus includes a conductive rubber member, to one end of which a voltage is input and through the other end of which a voltage, reduced in proportion to the member's internal resistance and length, is output; a voltage output member which is brought into contact with the conductive rubber member to output the voltage value of the conductive rubber member at the contact point; and a control unit which recognizes the contact point based on the voltage value input from the voltage output member, extracts data corresponding to the contact point from a memory unit, and inputs the extracted data.02-09-2012
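The abstract above describes a potentiometer-style arrangement: the voltage measured at the contact point is proportional to the distance along a uniform resistive strip, so position and the data assigned to it can be recovered from one voltage reading. A minimal sketch under those assumptions (the key map and zone count are illustrative, not the patent's actual table):

```python
# Illustrative voltage-divider decoding. Assumes a uniform strip driven at
# v_in at one end with the far end grounded; names are assumptions.

def contact_position(v_in: float, v_out: float, strip_length: float) -> float:
    """Estimate the contact point's distance from the grounded end, given
    the voltage measured at the contact point."""
    if not 0.0 <= v_out <= v_in:
        raise ValueError("measured voltage must lie between 0 and v_in")
    return strip_length * (v_out / v_in)

# "Data allotted to the position": quantize the strip into zones and look
# up the zone's assigned value (an illustrative stand-in for the memory unit).
KEY_MAP = {0: "A", 1: "B", 2: "C", 3: "D"}

def key_for_position(pos: float, strip_length: float = 100.0) -> str:
    zone = min(int(pos / strip_length * len(KEY_MAP)), len(KEY_MAP) - 1)
    return KEY_MAP[zone]

pos = contact_position(5.0, 2.5, 100.0)  # half the input voltage -> midpoint
print(pos, key_for_position(pos))
```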
20100188327ELECTRONIC DEVICE WITH HAPTIC FEEDBACK - Haptic feedback may be provided to a user of an electronic device, such as an electronic book reader device, to confirm receipt of user input or otherwise convey information to the user. The haptic feedback may be provided more quickly than a display update time of a display of the electronic device. Different patterns, durations, and/or intensities of haptic feedback may be used in response to different events.07-29-2010
20100188330INPUT DEVICE - The image sensor 07-29-2010
20100194683MULTIPLE SCREEN DISPLAY DEVICE AND METHOD - An image browsing method and display device having a body with a plurality of display faces oriented in different planes; a plurality of display screens able to simultaneously display different digital images, the screens lying on different display faces of the body; image selection means for selecting a plurality of digital images from an image collection to be displayed on the screens; and motion sensors connected to the image selection means to trigger a display change, the display change comprising the replacement of at least one image on at least one of the display screens by another image from the collection, as a function of the device's motion.08-05-2010
20100271296USER INTERFACE POWERED VIA AN INDUCTIVE COUPLING - An operating machine, such as a medical machine or a dialysis machine, includes a housing for operating components of the machine and a moveable user interface or display for viewing and entering information concerning operation of the machine. Signals concerning operating information are wirelessly transmitted between the machine and the display using one of several techniques. Power is also transmitted wirelessly from the operating machine to the screen, or from a separate power source to the display. The wireless signals may be transmitted via induction, radio, infrared or optical means.10-28-2010
20120032875Scanned Image Projection System Employing Beam Folding Apparatus - An imaging system (02-09-2012
20100177034PORTABLE STORAGE DEVICE HAVING USER INTERFACE AND METHOD OF CONTROLLING THE USER INTERFACE - A portable storage device and a method of controlling a user interface (UI) using the same. The method includes obtaining first UI information using a UI when the portable storage device is connected to a host, transmitting the obtained first UI information to an application of the host, and displaying second UI information provided by the application of the host using the UI.07-15-2010
20090295711MOTION CAPTURE SYSTEM AND METHOD FOR THREE-DIMENSIONAL RECONFIGURING OF CHARACTERISTIC POINT IN MOTION CAPTURE SYSTEM - In an optical motion capture system, it is possible to measure spatially high-dense data by increasing the number of measuring points. In the motion capture system using a mesh marker, intersections of lines for the mesh marker are called nodes and the lines connecting the nodes are called edges. The system includes a plurality of cameras for capturing a two-dimensional image of the mesh marker by imaging a subject having the mesh marker, a node/edge detecting section for detecting node/edge information on the mesh marker from the two-dimensional image captured by the respective cameras, and a three-dimensional reconstructing section for acquiring three-dimensional position information of the nodes by using the node/edge information detected from the plurality of two-dimensional images captured by different cameras.12-03-2009
20100171692Input device and display device - An input device and a display device are provided. The input device may receive object information, associated with an object displayed on the display device, from the display device, sense at least one motion of a user, analyze the sensed motion based on the object information, and transmit the analysis result to the display device; the display device may receive the analysis result and control the object accordingly.07-08-2010
20100182228DISPLAY CONTROLLING PROGRAM AND DISPLAY CONTROLLING APPARATUS - An information processing apparatus includes a computer. The computer successively images a user, evaluates the first image data representing the images so obtained, and displays the continually updated evaluation result on an LCD.07-22-2010
20100182230TERMINAL, FUNCTION STARTING-UP METHOD AND PROGRAM FOR TERMINAL - A terminal includes a display unit and a character conversion unit that recognizes a function related to entered characters in a character acceptable state, converts the entered characters to a symbol to be displayed on the display unit for starting the recognized function, and outputs the symbol. The terminal further includes a control unit that starts the function corresponding to the symbol displayed on the display unit.07-22-2010
20090122006ENHANCED PROTOCOL AND ARCHITECTURE FOR LOW BANDWIDTH FORCE FEEDBACK GAME CONTROLLER - Haptic features are stored in a haptic device by preloading or otherwise downloading them, e.g., wirelessly, into the haptic device at the time of manufacture, immediately prior to game play, during game play, and/or at any other time. Haptic features may be activated, deactivated, modified or replaced at any time. All or a subset of the haptic features may be selected as an active play list, which may be modified as necessary. A host may manage some or all device memory and the haptic features stored therein. Haptic features stored in haptic devices and control information provided by the host are used by the haptic device to execute haptic effects. The haptic device may sustain haptic effects between control messages from the host. New communication messages may be added to an underlying communication protocol to support haptic effects. New messages may use header portions of communication packets as payload portions.05-14-2009
20100188328ENVIRONMENTAL GESTURE RECOGNITION - A data-holding subsystem. The data-holding subsystem includes instructions stored thereon that when executed by a logic subsystem in communication with the data-holding subsystem: receive one or more signals, determine a sensor type for each signal of the one or more signals, identify a sensor type specific pattern corresponding to a motion gesture in at least one of the signals, and generate a gesture message based on the motion gesture. The gesture message may be usable by an operating system of a computing device that includes the data-holding subsystem to provide a system-wide function usable by one or more application programs of the computing device to provide an application specific function.07-29-2010
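The gesture pipeline described above has three steps: determine each signal's sensor type, match a pattern specific to that type, and wrap a match in a gesture message for the operating system. A hedged sketch of that dispatch, in which the sensor types, thresholds, and gesture names are all illustrative assumptions:

```python
# Illustrative sensor-type-specific gesture dispatch. The matchers below are
# deliberately crude stand-ins for real pattern recognition.

def detect_accel_shake(samples):
    # Accelerometer-specific pattern: several high-magnitude spikes.
    return sum(1 for s in samples if abs(s) > 2.0) >= 3

def detect_gyro_twist(samples):
    # Gyroscope-specific pattern: sustained angular rate of one sign.
    return len(samples) >= 3 and all(s > 1.0 for s in samples)

MATCHERS = {
    "accelerometer": ("shake", detect_accel_shake),
    "gyroscope": ("twist", detect_gyro_twist),
}

def gesture_message(sensor_type, samples):
    """Route a signal to its sensor-type-specific matcher; on a match,
    return a gesture message usable by higher layers, else None."""
    name, matcher = MATCHERS[sensor_type]
    if matcher(samples):
        return {"gesture": name, "source": sensor_type}
    return None

print(gesture_message("accelerometer", [0.1, 2.5, -3.0, 2.2]))
```

In the patent's framing, the returned message is what lets the operating system expose one system-wide gesture to many applications without each one reimplementing sensor handling.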
20100188329PUSH BUTTON SWITCH DEVICE WITH AN OLED DISPLAY - A push button switch device with an OLED display includes a support casing for a switch, an actuating member movably assembled in the casing so as to actuate the switch, and a display assembled at an operating end of the actuating member viewable from the outside. The display is programmable so as to display images and is formed by an assembly of organic light emitting diodes, or OLEDs. The device furthermore includes a local control unit with a memory storing predetermined images and/or portions of video. The local control unit includes a field-programmable gate array, or FPGA, connected to the memory and to the display so as to control the selective switching on/off of the OLEDs and to thus display in the display at least part of the images and/or portions of video according to a certain sequence.07-29-2010
20100188325IMAGE DISPLAY DEVICE AND IMAGE DISPLAY METHOD - An image display device adjusts display luminance in accordance with a surrounding illuminance. Since the display luminance considered to be appropriate differs depending on a user, the image display device includes a display unit operating according to a display luminance calculated based on the surrounding illuminance. For example, in the image display device, a user can set a range of the display luminance as a predetermined rule. The image display device provides a user interface enabling a user to easily input the predetermined rule or perform the setting input without being aware of it. That is, when the surrounding illuminance is above a predetermined illuminance, an input bar indicates only the range where the maximum value can be set as an input screen and when the surrounding illuminance is below the predetermined illuminance, the input bar indicates only the range where the minimum value can be set.07-29-2010
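The rule in the abstract above amounts to mapping ambient illuminance to a display luminance and then clamping the result to the user's configured range. A minimal sketch, with the linear mapping and full-scale value as assumptions (the patent does not specify the mapping):

```python
# Illustrative luminance rule: follow ambient light, but never leave the
# user's configured [user_min, user_max] band. Names are assumptions.

def display_luminance(illuminance_lux, user_min, user_max, full_scale_lux=1000.0):
    """Map ambient illuminance linearly to a luminance fraction in [0, 1],
    then clamp it to the user-set range."""
    raw = min(illuminance_lux / full_scale_lux, 1.0)
    return max(user_min, min(user_max, raw))

# Bright surroundings hit the user's maximum; darkness hits the minimum.
print(display_luminance(2000, 0.2, 0.9))  # -> 0.9
print(display_luminance(0, 0.2, 0.9))     # -> 0.2
```

This also mirrors the abstract's input-bar behavior: above a given illuminance only the maximum end of the range is relevant, below it only the minimum end.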
20100001950POSITION DETERMINATION UTILIZING A CORDLESS DEVICE - A system for generating position information includes a reflector, an image collection system, and a processor. The image collection system is configured to collect at least two sets of image data, where one set of image data includes a stronger indication of the reflector than the other set of image data. The two sets of image data can be collected in many different ways and may include using a retroreflector as the reflector. The two sets of image data are used to generate position information related to the reflector. In particular, position information related to the reflector is generated by taking the difference between the two set of image data. Because one set of image data includes a stronger indication of the reflector than the other set of image data, the difference between the two sets of image data gives a definitive indication of the reflector's position.01-07-2010
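The core of the scheme above is frame differencing: one image is captured with the retroreflector strongly illuminated, one without, and subtracting the two suppresses everything except the reflector. A minimal sketch using plain nested lists as stand-in grayscale frames (function names and the threshold are illustrative assumptions):

```python
# Illustrative frame differencing to localize a retroreflector.

def reflector_position(frame_lit, frame_unlit, threshold=50):
    """Return the (row, col) of the strongest pixel in the difference of the
    two frames, or None if no difference exceeds the threshold."""
    best, best_pos = threshold, None
    for r, (row_a, row_b) in enumerate(zip(frame_lit, frame_unlit)):
        for c, (a, b) in enumerate(zip(row_a, row_b)):
            diff = a - b
            if diff > best:
                best, best_pos = diff, (r, c)
    return best_pos

lit   = [[10, 10, 10], [10, 200, 12], [10, 10, 10]]
unlit = [[10, 10, 10], [10,  20, 10], [10, 10, 10]]
print(reflector_position(lit, unlit))  # the reflector stands out at (1, 1)
```

Because only the reflector responds strongly to the added illumination, the difference image gives a definitive position even against a cluttered background, which is the point of using two sets of image data.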
20100001949Spatially Aware Inference Logic - A method, system, and article to support a motion based input system. Movement data is acquired from a motion sensor. An orientation detector detects orientation towards gravity from a rest position, and a motion detector detects motion, including movement and rest. In addition, an inference state machine in communication with the orientation and motion detectors maintains a sequence of the detected motion conditions, and produces a profile description for the sequence of the detected motion conditions. An output event corresponding to the profile description is generated based upon the profile.01-07-2010
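The inference state machine described above accumulates detected motion conditions in order and maps a recognized sequence to a profile description, which then drives an output event. A hedged sketch of that idea; the condition names, profile table, and reset-on-match behavior are all assumptions for illustration:

```python
# Illustrative sequence-to-profile state machine. A production system would
# also prune or time out stale sequences; this sketch omits that.

PROFILES = {
    ("rest", "tilt", "rest"): "glance",
    ("rest", "move", "move", "rest"): "shake",
}

class InferenceStateMachine:
    def __init__(self):
        self.sequence = []

    def observe(self, condition):
        """Append a detected motion condition; emit an output event when the
        accumulated sequence matches a known profile, else None."""
        self.sequence.append(condition)
        profile = PROFILES.get(tuple(self.sequence))
        if profile is not None:
            self.sequence.clear()
            return {"event": profile}
        return None

sm = InferenceStateMachine()
for cond in ["rest", "move", "move"]:
    sm.observe(cond)
print(sm.observe("rest"))  # completes the "shake" profile
```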
20100001947Images Display Method and Apparatus of Digital Photo Frame - The invention describes a digital photo frame which has a motion sensor module. The digital photo frame further comprises a microprocessor, a memory unit, and a display module. User can change photo images displayed by simply shaking the digital photo frame. Any acceleration movements, acceleration changes, gravity changes, tilt, shake, and position changes of the digital photo frame, are detected by the motion sensor module, and the motion sensor module generates signals to the microprocessor for further calculation and interpretation. The microprocessor will then change the displayed photo image to the next image or to the previous image according to the calculation result of the signals.01-07-2010
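The shake-to-browse behavior above reduces to thresholding the accelerometer's magnitude and advancing the photo index when the threshold is exceeded. A minimal sketch, with the threshold and wrap-around behavior as assumptions (the abstract does not specify either):

```python
# Illustrative shake detection for photo browsing. The 2.5 g threshold sits
# well above the ~1 g reading a resting device reports due to gravity.

import math

SHAKE_THRESHOLD = 2.5  # g

def next_photo_index(current, photo_count, accel_xyz):
    """Advance to the next image (wrapping around) when the acceleration
    magnitude exceeds the shake threshold; otherwise stay put."""
    magnitude = math.sqrt(sum(a * a for a in accel_xyz))
    if magnitude > SHAKE_THRESHOLD:
        return (current + 1) % photo_count
    return current

print(next_photo_index(4, 5, (0.0, 3.0, 0.0)))  # shake wraps from 4 back to 0
```

A fuller implementation would distinguish shake direction to choose between next and previous image, as the abstract's microprocessor does.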
20100259471CONTROL DEVICE, HEAD-MOUNT DISPLAY DEVICE, PROGRAM, AND CONTROL METHOD - Provided is a technique for accurately performing an operation desired by a user. A head operation of the user is identified according to information detected by a head motion detection unit, and a process desired by the user is executed according to the angular velocity of the head motion. The technique uses a control unit which can accurately execute an operation driven by the user's head motion without reflecting the return motion of the user's head in the process; the control unit starts and ends each process corresponding to the detected angular velocity according to a predetermined threshold value.10-14-2010
20100259472INPUT DEVICE - An input device (10-14-2010
20100259473USER INTERFACE DEVICE, USER INTERFACE METHOD, AND RECORDING MEDIUM - A user interface device (10-14-2010
20100225578Method for Switching Multi-Functional Modes of Flexible Panel and Calibrating the Same - A method for switching multi-functional modes and calibrating an electronic device and the electronic device using the method are disclosed. The method for switching multi-functional modes comprises: detecting at least one sensing device to identify a specific shape of a flexible panel of the electronic device; and matching the specific shape with the multi-functional modes according to a corresponding table so as to execute one specific functional mode. The corresponding table comprises a corresponding relationship between the specific shapes of the flexible panel and the specific functional modes, and a specific functional mode executed by the electronic device corresponds to the specific shape according to the corresponding relationship.09-09-2010
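The corresponding-table mechanism above is a straight lookup from a sensed panel shape to a functional mode. A minimal sketch; the shapes, modes, and fallback behavior here are illustrative assumptions, not the patent's actual table:

```python
# Illustrative shape-to-mode corresponding table for a flexible panel.

CORRESPONDING_TABLE = {
    "flat": "tablet_mode",
    "folded": "phone_mode",
    "curved": "ereader_mode",
}

def switch_mode(sensed_shape):
    """Match the shape reported by the sensing devices against the table;
    fall back to a default mode for unrecognized shapes."""
    return CORRESPONDING_TABLE.get(sensed_shape, "default_mode")

print(switch_mode("folded"))
```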
20090046059FINGER POINTING APPARATUS - A finger pointing apparatus is disclosed. In one aspect the finger pointing apparatus includes at least one pressure sensor fixed on a hand for triggering a corresponding electromagnetic wave transmitter to transmit electromagnetic wave when pressure is produced by a finger contacting an external object. In another aspect, the finger pointing apparatus includes at least one electromagnetic wave transmitter connected with a corresponding pressure sensor and fixed on the hand for transmitting electromagnetic wave to all electromagnetic wave receivers when pressure is detected by the pressure sensor. In one aspect, the finger pointing apparatus includes at least two electromagnetic wave receivers arranged at fixed positions with respect to each other for receiving electromagnetic wave from the at least one electromagnetic wave transmitter and transmitting received electromagnetic wave to a microprocessor. In another aspect, the finger pointing apparatus includes a microprocessor for receiving electromagnetic wave from the electromagnetic wave receivers, calculating coordinate values of a position pressed by the finger from electromagnetic wave from different electromagnetic wave receivers and outputting the coordinate values.02-19-2009
20090046057IMAGE FORMING APPARATUS, DISPLAY PROCESSING APPARATUS, DISPLAY PROCESSING METHOD, AND COMPUTER PROGRAM PRODUCT - An MFP includes a selection receiving unit that receives selection by a user of a desired one of higher setting items displayed on an operation panel, and a display processing unit that displays, when the selection of the higher setting item is received, intermediate setting items corresponding to the selected higher setting item and overview information indicating overviews of lower setting items corresponding to the intermediate setting items, being associated with each other, on the operation panel.02-19-2009
20090046054Resistive Actuator With Dynamic Variations Of Frictional Forces - A system for generating haptic effects on a rotary knob includes an electrical coil and a core. A first level of voltage is applied to the coil to enable a first surface interface having a first coefficient of friction and to generate a first haptic effect by varying the voltage. A second level of voltage is applied to the coil to enable a second surface interface having a second coefficient of friction that is greater than the first coefficient of friction and to generate a second haptic effect by varying the voltage.02-19-2009
20130215005METHOD FOR ADAPTIVE INTERACTION WITH A LEGACY SOFTWARE APPLICATION - Methods are disclosed to support adaptive interaction with legacy software applications, without a need for rewriting those applications. The methods are for use with an interactive electronic system including a processor, a display, and an input device with user-manipulated controls. When the legacy application is executed, a supplemental software program, such as a plugin, is also executed and is utilized in order to identify currently relevant interactive features of the legacy application during execution. Functionality is dynamically assigned to the various user-manipulated controls based on the identified features. In one embodiment, detection of objects (particularly the user's hands) proximate to the input controls is also employed in determining the assignment of functionality and/or in displaying a visual representation to the user of the available interactive choices. In another embodiment, the user-manipulated input controls are dynamically and physically reconfigured under control of the processor based on the identified features.08-22-2013
20130215009INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND A COMPUTER PROGRAM PRODUCT - An information processing apparatus, method and computer program product determine an object range to be displayed on a display based on a detected user-related action. In the apparatus, a control unit determines content to be displayed within an object range on a map, and an action recognition processing unit detects a user-related action. The control unit determines the content to be displayed within the object range; the object range includes the current position of the information processing apparatus, and its coverage area is based on the user-related action detected by the action recognition processing unit.08-22-2013
20130215010PORTABLE ELECTRONIC EQUIPMENT AND METHOD OF VISUALIZING SOUND - A portable electronic equipment comprises an optical output device and a controller configured to receive a sound signal and visual environment data, the visual environment data representing an environment of the portable electronic equipment. The controller is configured to process the received sound signal to identify sound characteristics of the sound signal, to generate graphics based on both the received visual environment data and the identified sound characteristics, and to control the optical output device to output the generated graphics, wherein a location at which the generated graphics is output on the optical output device is controlled based on the received visual environment data.08-22-2013
20100177036 COVER FOR PORTABLE TERMINAL - A cover for a portable terminal having a fixing part for fixing the portable terminal and a folder rotating at the fixing part to open and close the portable terminal. The cover includes an input unit and/or a display unit, and a charging unit. The input unit inputs data to the portable terminal, and the display unit displays data from the portable terminal; both are constructed in the folder. The charging unit is constructed in at least one of the fixing part and the folder, and supplies power, generated using a solar cell, to the portable terminal.07-15-2010
20100177033BUTTON WITH EDGE MOUNTED PIVOT POINT - A device disclosed herein reduces inadvertent activation of buttons mounted on edges of an electronic device. The button moves around an axis of rotation proximate to, and parallel with, the adjacent edge of the device. Thus, inadvertent pressure on an edge of the button adjacent to the edge of the device will not inadvertently activate the button. Additionally, the presence of the axis of rotation proximate to the edge maintains a desired reveal.07-15-2010
20100177038PORTABLE DEVICE - According to the portable device of an aspect of the present invention, the first enclosure and the second enclosure are brought into movable linkage between the first position where the silhouettes of the first enclosure and the second enclosure are overlapped and the second position where the second enclosure is moved in parallel from the first position. In addition, the first enclosure and the second enclosure are rotatably and movably linked between the first position and the third position where the second enclosure is rotatably moved from the first position at a predetermined angle.07-15-2010
20100238110LOCOMOTION INTERFACE DEVICE FOR INVOLVING BIPEDAL MOVEMENT IN CONTROL OVER COMPUTER OR VIDEO MEDIA - A locomotion interface includes a first section for user contact with the lower extremities and a second section proximate to the first section. The second section includes a first action region for the user to contact and move a lower extremity over. The locomotion interface also includes a first plurality of sensors for detecting this motion in the vicinity of the first action region. The locomotion interface typically includes a second action region and a second plurality of sensors. During operation the user contacts and moves a lower extremity over the second action region. The second plurality of sensors is positioned to detect this motion in the vicinity of the second action region.09-23-2010
20100238109USER INTERFACE FOR SET TOP BOX - A method for control comprises a set top box receiving coordinates from a touch sensing screen. The coordinates are interpreted for controlling the set top box, and in accordance with the interpreted coordinates an action is performed. A further method for control comprises a set top box receiving a signal representative of displacement. A control function is determined from the displacement representative signal and the control function is activated. In accordance with the control function a signal is formed for communication.09-23-2010
20100225577Devices And Associated Hinge Mechanisms - A device comprising first and second housings arranged to swivel about a swivel axis, wherein the device is arranged such that relative turning of the first and second housings about the swivel axis in a first direction reveals a first device user operational area, and relative turning of the first and second housings in a second, opposing direction reveals a second device user operational area.09-09-2010
20100238108LIGHT-TACTILITY CONVERSION SYSTEM, AND METHOD FOR PROVIDING TACTILE FEEDBACK - A light-tactility conversion system is provided which includes a light emitting device including an illumination unit capable of emitting light at a same time to a plurality of illumination areas in different illumination patterns, and an illumination control unit for controlling the illumination unit and making the illumination unit project an image, and also for controlling the illumination patterns in units of pixels of the projected image and making the illumination unit emit light to specific illumination areas in specific illumination patterns, and a vibration device including an illumination pattern detection unit for detecting an illumination pattern of light received from the light emitting device, and a vibration control unit for generating a vibration pattern corresponding to the illumination pattern detected by the illumination pattern detection unit and vibrating an oscillator in the vibration pattern.09-23-2010
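The conversion chain in the light-tactility entry above (decode an illumination pattern from received light, then drive an oscillator with a matching vibration pattern) can be sketched in Python. The threshold decoder, pattern length, and lookup-table entries below are illustrative assumptions, not details from the application:

```python
# Minimal sketch of a light-to-tactility conversion, assuming a simple
# threshold decoder and a hypothetical pattern-to-vibration lookup table.

def decode_pattern(samples, threshold=0.5):
    """Turn raw light-intensity samples into a binary illumination pattern."""
    return tuple(1 if s >= threshold else 0 for s in samples)

# Hypothetical table: each illumination pattern selects a vibration pattern,
# given as (amplitude, duration_ms) segments for the oscillator.
PATTERN_TO_VIBRATION = {
    (1, 0, 1, 0): [(1.0, 50), (0.0, 50), (1.0, 50)],  # fast pulse train
    (1, 1, 0, 0): [(0.5, 200)],                       # single soft buzz
}

def vibration_for(samples):
    """Return the vibration segments for the received light, if recognized."""
    return PATTERN_TO_VIBRATION.get(decode_pattern(samples), [])
```

An unrecognized pattern maps to an empty vibration list, so stray ambient light produces no tactile feedback.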
20100238107Information Processing Apparatus, Information Processing Method, and Program - There is provided an information processing apparatus, including: a display panel for displaying at least one object in a selected state or an unselected state; a detection area setting unit for setting, per object on the display panel, a first detection area covering a display area of the object and a second detection area covering the first detection area and being larger than the first detection area; an operating tool detecting unit for detecting an operating tool which is in proximity to the display panel; and a state managing unit for changing the object into the selected state when the operating tool is detected within the first detection area of the object in the unselected state, and for changing the object into the unselected state when the operating tool is not detected within the second detection area of the object in the selected state.09-23-2010
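The two nested detection areas in the entry above implement hysteresis: an object becomes selected when the operating tool enters the smaller first area, and becomes unselected only when the tool leaves the larger second area, so an object near the boundary cannot flicker between states. A minimal sketch, assuming circular detection areas for simplicity (the application does not specify their shape):

```python
def update_selection(selected, tool_pos, center, r_select, r_deselect):
    """Hysteresis hit-testing: select inside the inner radius, deselect
    only outside the larger outer radius; otherwise keep the state."""
    dx, dy = tool_pos[0] - center[0], tool_pos[1] - center[1]
    d2 = dx * dx + dy * dy
    if not selected and d2 <= r_select ** 2:
        return True
    if selected and d2 > r_deselect ** 2:
        return False
    return selected
```

Because the deselect radius exceeds the select radius, small jitter in the detected tool position between the two radii leaves the selection unchanged.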
20100225576THREE-DIMENSIONAL INTERACTIVE SYSTEM AND METHOD - The present invention relates to a method for providing an intuitive interactive control object in stereoscope comprising the steps of: (a) providing a display capable of displaying in stereoscope; (b) providing a system capable of motion tracking; (c) tracking a visual signal motion performed by a user; (d) providing a stereoscopic image of a remote control, on said display in response to said signal performed by said user; (e) tracking user's motion aimed at interacting with said displayed stereoscopic image of said remote control; (f) analyzing said user's interactive motion; and (g) performing in accordance with said user's interactive motion.09-09-2010
20120194425INTERACTIVE SELECTION OF A REGION OF INTEREST IN AN IMAGE - A system for selecting a region of interest in an image is provided.08-02-2012
20100253617Portable Electronic Apparatus and Control Method of Portable Electronic Apparatus - A portable electronic apparatus and a control method of the portable electronic apparatus with excellent operability, capable of reliably reflecting a user's intended input operation, are provided. The portable electronic apparatus and the control method comprise a first sensor group.10-07-2010
20110057876POINTING DEVICE - A pointing device includes a first ground potential electrode; a second electrode for applying a voltage; a third electrode for measuring an electrical potential; a printed circuit board on which the first through the third electrodes are provided; a location pointing driving body provided on the printed circuit board and configured with a conductive part, which contacts the first and second electrodes, and a spherical part; a slide member that is located to cover a top part of the location pointing driving body and that is configured to drive the location pointing driving body by being slidable within a plane parallel to the printed circuit board; and a pressing force restriction member that is configured to restrict pressing force from the spherical part to the printed circuit board by receiving force from the slide member in the pressing direction.03-10-2011
20110057872PORTABLE ELECTRONIC DEVICE - A portable electronic device has a display part.03-10-2011
20110057875Display control apparatus, display control method, and display control program - A display control apparatus includes a recognizing unit configured to recognize a position of an operator and a position of a hand or the like of the operator, a calculating unit configured to regard a position of the operator in a screen coordinate system set on a screen as an origin of an operator coordinate system and multiply a position of the hand or the like with respect to the origin of the operator coordinate system by a predetermined function, thereby calculating a position of display information corresponding to the hand or the like in the screen coordinate system, and a control unit configured to cause the display information to be displayed at the position in the screen coordinate system calculated by the calculating unit.03-10-2011
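The coordinate transform in the display control entry above (treat the operator's on-screen position as the origin of an operator coordinate system, then multiply the hand offset by a predetermined function) can be sketched as follows. A linear gain stands in for the unspecified "predetermined function", and all names are illustrative:

```python
def hand_to_screen(operator_screen_pos, hand_offset, gain=2.0):
    """Map a hand position, given relative to the operator's on-screen
    origin, into screen coordinates by scaling with an assumed linear gain."""
    ox, oy = operator_screen_pos
    dx, dy = hand_offset
    return (ox + gain * dx, oy + gain * dy)
```

A gain above 1 lets small hand movements sweep the whole screen, which is the point of the multiplication step.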
20110057873FLEXIBLE ELECTRONIC DEVICE AND METHOD FOR THE CONTROL THEREOF - The present invention relates to a flexible electronic device.03-10-2011
20100253618DEVICE AND METHOD FOR DISPLAYING AN IMAGE - A projector producing an imaginary input plane with high operability is provided. A projector according to an embodiment projects a VUI screen picture onto a desk, and projects a main projection screen picture to a wall. The projector includes a light receiving element. The light receiving element is arranged in a position where light emitted toward the desk (VUI screen picture) and reflected (or scattered) by an object near the desk enters. The projector calculates a position of the object based on light sensing timing by the light receiving element and light scan positions at various points in time. The projector changes a projected screen picture when it determines that the object is simultaneously in contact with a plurality of portions of the VUI screen picture and at least one of contact positions moves.10-07-2010
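The position calculation in the projector entry above relies on a scanned beam: since the projector knows where its beam points at every instant, the time at which reflected light reaches the receiving element identifies the spot the object touched. A simplified sketch, using a nearest-sample lookup where real hardware would likely interpolate; the data layout and names are assumptions:

```python
def object_positions(detection_times, scan_log):
    """Given detection timestamps and a log of (time, x, y) beam scan
    positions, return the scan position nearest each detection time."""
    def nearest(t):
        time, x, y = min(scan_log, key=lambda entry: abs(entry[0] - t))
        return (x, y)
    return [nearest(t) for t in detection_times]
```

With several detections in one scan frame, the returned positions are the simultaneous contact points the abstract's multi-touch test operates on.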
20100141575VIDEO DISPLAY DEVICE - A video display device enables first and second viewers to view two different screens, and enables the first viewer to easily view the content displayed to the second viewer in a two-screen display mode.06-10-2010
20100141574METHOD AND APPARATUS FOR OPERATING MOBILE TERMINAL - A method for operating a mobile terminal is disclosed. The mobile terminal includes a pressure sensor and an orientation sensor. While a pressure event is detected by the pressure sensor, functions related to content classification, content storage, content display, and menu navigation can be executed in response to a direction event detected by the orientation sensor. Hence, the mobile terminal is capable of operating in a dynamic and a flexible manner.06-10-2010
20120194417INPUT DEVICE WITH SWING OPERATION - An input device with swing operation includes a supporting frame, a flexible printed circuit installed on the supporting frame for outputting a signal, a supporting base fixed on the supporting frame, a cap pivoted to the supporting base, and a hook respectively pivoted to the supporting base and the cap. An inclined angle is formed between the hook and the supporting frame when the cap is not pressed down. The hook and the cap pivot relative to the supporting base when the cap is pressed down. The input device further includes a resilient component disposed between the flexible printed circuit and the cap for being pressed by the cap to actuate the flexible printed circuit when the cap is pressed down.08-02-2012
20100013761Systems And Methods For Shifting Haptic Feedback Function Between Passive And Active Modes - Systems and methods for shifting haptic feedback function between passive and active modes are disclosed. For example, one disclosed method includes receiving a first signal from a sensor, the first signal associated with a mode of interaction with a graphical user interface; receiving a second signal associated with an interaction with the graphical user interface; determining a haptic feedback effect based at least in part on the mode of interaction with the graphical user interface and the interaction with the graphical user interface; and generating a haptic signal configured to output the haptic feedback effect.01-21-2010
20100220055MOBILE ANTENNA UNIT AND ACCOMPANYING COMMUNICATION APPARATUS - An antenna unit is provided with an inverted F-type antenna element provided with a feeding point and a ground connection point, and a non-feed antenna element configured so as to resonate with the inverted F-type antenna element through electrical coupling. In addition, the antenna unit may also be provided with a ground part which is grounded to the earth and connected to the ground connection point provided on one edge of the inverted F-type antenna element, and a resonance element, one edge of which is connected to the ground part, resonated by the non-feed antenna element through electrical coupling.09-02-2010
20130127706METHOD FOR UNLOCKING SCREEN - A method for unlocking the screen of an electronic apparatus is provided. The method includes, while the display function of the screen is turned off, determining whether a trigger instruction received by the electronic apparatus corresponds to a trigger condition. If the trigger instruction corresponds to the trigger condition, and still while the display function of the screen is turned off, determining whether an input instruction received by the electronic apparatus corresponds to an unlocking condition. If the input instruction corresponds to the unlocking condition, controlling the screen to turn on its display function.05-23-2013
20100220053INPUT APPARATUS FOR IN-VEHICLE DEVICES - An input apparatus for in-vehicle devices is easy to use and facilitates recognizing the position of fingertips. For this, the input apparatus includes a control unit including a recess allowing fingers to be inserted thereinto and having a control surface on an inner side wall thereof, a control switch disposed on the control surface, and a camera horizontally photographing the fingers inserted into the recess; and a display unit displaying an image of the fingers photographed by the camera to overlay on a control screen.09-02-2010
20090040175INPUT INTERFACE DEVICE WITH TRANSFORMABLE FORM FACTOR - Various implementations of an interface device, along with associated methods and systems, are described in which the interface device has a housing with a transformable form factor, and a transformation assembly that can change the form factor of the housing. At least one of the form factors of the housing has a shape that corresponds to data associated with the interface device.02-12-2009
20090160761METHOD AND HANDHELD ELECTRONIC DEVICE INCLUDING FIRST INPUT COMPONENT AND SECOND TOUCH SENSITIVE INPUT COMPONENT - A handheld electronic device includes a housing having a surface; a first input component having input members disposed external to the surface; a second touch sensitive input component disposed about the input members, the touch sensitive input component being separate and distinct from the input members and the first input component and being structured to provide one of: a contact point with respect to the surface responsive to actuation of a first number of the input members, and a number of responses responsive to actuation of a second number of the input members. A processor cooperates with the first input component and the touch sensitive input component to determine if a plurality of the input members are actuated contemporaneously and to output a representation of a single one of the input members based upon one of: the contact point, and the number of responses.06-25-2009
20090073113Presenter model - A presenter model includes a presenter having a wireless signal transmitting unit and an accommodating box having a wireless signal receiving unit. The accommodating box having the wireless signal receiving unit receives the wireless signals transmitted by the wireless signal transmitting unit of the presenter. Via a USB port connected with an electronic apparatus, the command received is transmitted to the electronic apparatus, thereby achieving the function of transmitting the signals of the presenter. The accommodating box can be provided thereon with at least one insertion slot for a USB interface and at least one insertion slot for an IEEE1394 interface, thereby serving as digital expansion slots. The accommodating box allows the presenter to be disposed therein and combined therewith, thereby increasing the convenience in carrying and storing the presenter.03-19-2009
20090073114Control of a scrollable context menu - Disclosed are a method, a system and a navigation device for generating and controlling an interaction object, which is preferably in the form of a context menu, on a display unit. In at least one embodiment, the method includes presentation of the interaction object by way of at least one presentation signal from the navigation device and selection of at least one functional element from the presented interaction object by way of at least one selection signal from the navigation device, wherein the selection can be made independently of a movement by the navigation device and wherein the at least one functional element to be selected and/or the selected at least one functional element is presented at a constant location on the display unit by moving within the interaction object or by moving the interaction object.03-19-2009
20090184923Haptic Stylus Utilizing An Electroactive Polymer - Haptic feedback interface devices using electroactive polymer (EAP) actuators to provide haptic sensations. A haptic feedback interface device is in communication with a host computer and includes a sensor device that detects the manipulation of the interface device by the user and an electroactive polymer actuator responsive to input signals and operative to output a force to the user caused by motion of the actuator. The output force provides a haptic sensation to the user. In an embodiment, a stylus including a body having a first end and a second end opposite from the first end, a moveable member coupled to the body and capable of being in contact with a user's hand; and an electroactive polymer actuator coupled to the moveable member, wherein the electroactive polymer moves the moveable member from a first position to a second position with respect to the body upon being activated.07-23-2009
20090184920Two element slider with guard sensor - A method for using a slider-based capacitive sensor to implement a user interface having discrete buttons. Button locations are designated on a slider-based capacitive sensor having at least two conductive traces such that a user input at any button location results in a capacitance change in the conductive traces. Locations of inputs are distinguishable by ratios between the capacitance changes of the conductive traces, which can be correlated to a particular button location. Ratio ranges corresponding to areas covered by each button are used to identify which button has received an input.07-23-2009
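The ratio decoding in the slider entry above can be sketched directly: a touch changes the capacitance of both traces, and the normalized ratio of the two changes locates the touch along the slider, so contiguous ratio bands act as discrete buttons. The band boundaries below are illustrative, not values from the application:

```python
# Assumed ratio bands: each button owns a contiguous range of c1/(c1+c2).
BUTTON_BANDS = {"BTN_A": (0.00, 0.33), "BTN_B": (0.33, 0.66), "BTN_C": (0.66, 1.01)}

def button_from_deltas(c1, c2, bands=BUTTON_BANDS):
    """Identify the pressed button from capacitance changes on two traces."""
    total = c1 + c2
    if total == 0:
        return None  # no touch: neither trace changed
    ratio = c1 / total
    for name, (lo, hi) in bands.items():
        if lo <= ratio < hi:
            return name
    return None
```

Decoding by ratio rather than by absolute deltas keeps the result largely insensitive to finger size and pressure, since both traces scale together.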
20120242569DISPLAY - A display capable of performing optimum stereoscopic display according to a view position is provided. A display includes: a display section including a plurality of first pixels to a plurality of nth pixels, where n is an integer of 4 or more, and displaying a plurality of perspective images assigned to the first to nth pixels; a detection section detecting a view position of a viewer; and a display control section varying the number of the plurality of perspective images assigned to the first to nth pixels and varying a correspondence relationship between the first to nth pixels and the perspective images, according to the view position of the viewer.09-27-2012
20120242571Data Manipulation Transmission Apparatus, Data Manipulation Transmission Method, and Data Manipulation Transmission Program - A data manipulation transmission apparatus including an object acquisition section for acquiring a display object generated on a display screen.09-27-2012
20120242572SYSTEM AND METHOD FOR TRANSACTION OF SENSORY INFORMATION - A system and method for transaction of sensory information are provided. A sensory effect extraction apparatus of the system includes a sensory effect extraction unit to extract a sensory effect from an image content in accordance with a sensory effect extraction signal, and a sensory information transmission unit to transmit sensory information based on the extracted sensory effect.09-27-2012
20120242570DEVICE, HEAD MOUNTED DISPLAY, CONTROL METHOD OF DEVICE AND CONTROL METHOD OF HEAD MOUNTED DISPLAY - A device includes a detection unit that detects states of eyelids of a user, and a control unit that performs operations in response to the states of the eyelids of the user detected by the detection unit.09-27-2012
20120242567HAND-HELD DISPLAYING DEVICE - The hand-held displaying device comprises a displaying panel to display at least one color and its brightness, a motion sensor to detect motions of the motion sensor and a controller to control contents of displays in the displaying panel according to motions detected by the motion sensor. The displaying panel may include a plurality of displaying regions, each being able to display at least one color and its brightness and respectively controlled by the controller. The displaying device may provide a sound generating element and a loudspeaker. The controller may provide a memory space to record a series of displays to be played back at a later time.09-27-2012
201202425683-Dimensional Displaying Apparatus And Driving Method Thereof - A 3-dimensional displaying apparatus includes an image displaying panel having a plurality of pixels and a backlight panel spaced apart from one surface of the image displaying panel. The backlight panel includes a first line source set having a plurality of line sources arranged at regular intervals and a second line source set having line sources arranged spaced apart from the respective line sources of the first line source set by a predetermined interval. The first line source set and the second line source set are driven alternately. Thus, in a case where a horizontal location of an observer varies, the change of brightness of image information and the crosstalk between adjacent visual fields are minimized, and pseudo-stereoscopic vision is prevented. Also, the irregularity of brightness distribution in a visual field may be solved.09-27-2012
20120242566Vision-Based User Interface and Related Method - A vision-based user interface includes an image input unit for capturing frame images, an image processor for recognizing a posture in at least one of the captured frame images, and generating a recognized gesture according to the posture, and a control unit for generating a control command corresponding to the recognized gesture.09-27-2012
20090160760MULTI-FUNCTIONAL PRESENTER - A multi-functional presenter includes a case, a control unit in the case, an operating unit on the case and connected to the control unit, a pointer unit connected to the control unit, a bidirectional wireless communication unit connected to the control unit and linked with an external electronic device, a voice receiving unit connected to the control unit for receiving voice signals, a reminding unit connected to the control unit for receiving a warning signal from the external electronic device and giving a reminder to a user, and a power supply unit connected to the control unit. With the voice receiving unit and the bidirectional wireless communication unit, the multi-functional presenter performs not only the functions of paging up/down, pointing images, giving the user a reminder, etc., but also enables on-site voice recording and storing the recorded voice signals on the external electronic device.06-25-2009
20110234486Switching device and switching methods of the same - A switching device that selectively changes the computer to be operated among multiple computers includes a control unit. The control unit detects a cursor position on the computer to be operated based on coordinate data and that computer's resolution, the coordinate data being generated by performing the same acceleration process as the computer to be operated on relative coordinate data acquired from a given pointing device, and the control unit selectively changes the computer to be operated according to the cursor position. It is thus possible to selectively change the computer to be operated without any dedicated software or a dedicated manipulation space.09-29-2011
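The switching rule in the entry above can be sketched as follows: mirror the target computer's pointer acceleration so the switch tracks the same cursor position the host computes, then change the active computer when the cursor crosses a screen edge. The acceleration curve and the right-edge switching policy below are illustrative assumptions:

```python
def accelerate(dx, dy, factor=2.0, threshold=4):
    """Apply the same (assumed) acceleration curve as the host computer."""
    if abs(dx) + abs(dy) > threshold:
        return dx * factor, dy * factor
    return dx, dy

def track_and_switch(cursor, rel, screen_width, active, n_computers):
    """Integrate accelerated relative motion; crossing the right edge
    switches to the next computer, placing the cursor on its left edge."""
    adx, ady = accelerate(*rel)
    x, y = cursor[0] + adx, cursor[1] + ady
    if x >= screen_width and active + 1 < n_computers:
        return (0.0, y), active + 1
    return (min(x, screen_width - 1), y), active
```

Reproducing the host's acceleration is the key step: without it, the switch's shadow cursor drifts away from the real one and edge crossings are detected at the wrong moments.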
20090115721Gesture Recognition Light and Video Image Projector - A system and method are provided for a gesture recognition interface system. The system comprises a projector configured to project colorless light and visible images onto a background surface. The projection of the colorless light can be interleaved with the projection of the visible images. The system also comprises at least one camera configured to receive a plurality of images based on a reflected light contrast difference between the background surface and a sensorless input object during projection of the colorless light. The system further comprises a controller configured to determine a given input gesture based on changes in relative locations of the sensorless input object in the plurality of images, and being further configured to initiate a device input associated with the given input gesture.05-07-2009
20110032181DISPLAY DEVICE AND CONTROL METHOD UTILIZING THE SAME - A display device controlled by an indicator device having a first light-spot is disclosed. The display device includes a first camera, a panel, and a processing unit. The first camera detects the first light-spot for generating a first detection signal. The panel displays a cursor. The processing unit controls the cursor according to the first detection signal. When the distance between the first camera and the first light-spot is a first length and the moving distance of the first light-spot is a first distance, the moving distance of the cursor is a second distance. When the distance between the first camera and the first light-spot is a second length and the moving distance of the first light-spot is the first distance, the moving distance of the cursor is the second distance.02-10-2011
20110032183METHOD, SYSTEM, AND STORAGE MEDIUM FOR A COMIC BOOK READER PLATFORM - The invention pertains to a method, system, and storage medium for a comic book reader platform that retains the unique and highly sought after look and flow of traditional printed comic books in a mobile hand-held device. The invention is capable of displaying the comic book in a full page or cell-by-cell configuration depending on the orientation and/or desire of the particular user and leverages advanced features provided by today's mobile hand-held devices such as motion sensitivity, orientation changing, touch screens, etc. The invention also provides the ability to store multiple comic books and allows the user to switch between them with ease.02-10-2011
20120032879Method, Apparatus, and Article for Force Feedback Based on Tension Control and Tracking Through Cables - A haptic device for human/computer interface includes a user interface tool coupled via cables to first, second, third, and fourth cable control units, each positioned at a vertex of a tetrahedron. Each of the cable control units includes a spool and an encoder configured to provide a signal corresponding to rotation of the respective spool. The cables are wound onto the spool of a respective one of the cable control units. The encoders provide signals corresponding to rotation of the respective spools to track the length of each cable. As the cables wind onto the spools, variations in spool diameter are compensated for. The absolute length of each cable is determined during initialization by retracting each cable in turn to a zero length position. A sensor array coupled to the tool detects rotation around one or more axes.02-09-2012
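The length tracking in the entry above can be sketched as follows: encoder counts give spool revolutions, and each wrap's circumference is summed at the spool's current effective radius, which grows as cable winds on. The single-layer growth model and all parameter names are assumptions, not details from the application:

```python
import math

def cable_length(encoder_counts, counts_per_rev, base_radius, radius_per_wrap):
    """Estimate wound cable length from encoder counts, compensating for
    the effective spool radius increasing with each completed wrap."""
    revs = encoder_counts / counts_per_rev
    full_wraps = int(revs)
    # Each completed wrap contributes its circumference at its own radius.
    length = sum(2 * math.pi * (base_radius + i * radius_per_wrap)
                 for i in range(full_wraps))
    # The partial final wrap uses the current effective radius.
    current_r = base_radius + full_wraps * radius_per_wrap
    return length + (revs - full_wraps) * 2 * math.pi * current_r
```

Retracting each cable to its zero-length stop, as the abstract describes, is what anchors the encoder count to an absolute zero during initialization.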
20090109174Method and Apparatus for User Interface in Electronic Devices With Visual Display Units - A system for a 3-D user interface comprises: one or more 3-D projectors configured to display an image of all or one or more parts of a first electronic device in a 3-D coordinate system; one or more sensors configured to sense user interaction with the image and to provide user interaction information; and a processor configured (i) to receive the user interaction information from the one or more sensors; (ii) to correlate the user interaction with the image; and (iii) to operate a second electronic device in a manner responsive to a correlation of the user interaction with the image, wherein the first electronic device has a visual display unit and the image comprises an image of all or one or more parts of the visual display unit of the first electronic device. A method for providing a 3-D user interface comprises: generating an image of all or one or more parts of a first electronic device in a 3-D coordinate system; sensing user interaction with the image; correlating the user interaction with the image; and operating a second electronic device in a manner responsive to a correlation of the user interaction with the image, wherein the first electronic device has a visual display unit and the image comprises an image of all or one or more parts of the visual display unit of the first electronic device. Computer readable program codes related to the system and the method of the present invention are also described herein.04-30-2009
20090109175METHOD AND APPARATUS FOR USER INTERFACE OF INPUT DEVICES - A system for a 3 dimensional (3-D) user interface comprises: one or more 3-D projectors configured to display an image at a first location in a 3-D coordinate system; one or more sensors configured to sense user interaction with the image and to provide user interaction information; and a processor configured (i) to receive the user interaction information from the one or more sensors; (ii) to correlate the user interaction with the image; and (iii) to provide one or more indications responsive to a correlation of the user interaction with the image, wherein the one or more indications comprise displaying the image at a second location in the 3-D coordinate system. A method for providing a 3-D user interface comprises: generating an image at a first location in a 3-D coordinate system; sensing user interaction with the image; correlating the user interaction with the image; and providing one or more indications responsive to a correlation of the user interaction with the image, wherein the one or more indications comprise displaying the image at a second location in the 3-D coordinate system. Computer readable program codes related to the system and the method of the present invention are also described herein.04-30-2009
20100295772Electronic Device with Sensing Assembly and Method for Detecting Gestures of Geometric Shapes - A method for detecting a gesture in a geometric shape and controlling an electronic device includes providing a sensing assembly including at least one photoreceiver and a plurality of phototransmitters, wherein each phototransmitter emits infrared light away from the electronic device about a corresponding central transmission axis, wherein each central transmission axis is oriented in a different direction with respect to the others; and controlling the emission of infrared light by each of the phototransmitters during each of a plurality of time periods during movement of an external object in a geometric shape relative to the electronic device. For each of the plurality of phototransmitters and for each of the plurality of sequential time periods, a corresponding measured signal is generated which is indicative of a respective amount of infrared light which originated from that phototransmitter during that time period and was reflected by the external object prior to being received by the photoreceiver. The measured signals are evaluated over time to identify the geometric shape; and the electronic device is controlled in response to the identification of the geometric shape.11-25-2010
20100295773ELECTRONIC DEVICE WITH SENSING ASSEMBLY AND METHOD FOR INTERPRETING OFFSET GESTURES - A method for controlling an electronic device includes providing as part of the electronic device a display screen for displaying content and a sensing assembly including at least one photoreceiver and a plurality of phototransmitters, wherein each phototransmitter is positioned to emit infrared light away from the electronic device about a corresponding central transmission axis, wherein each central transmission axis is oriented in a different direction with respect to the others. Emission of infrared light by each of the phototransmitters is controlled during each of a plurality of time periods as an external object moves in a first specified pattern of movement and then moves in a second specified pattern of movement which is offset from a generally centered position with respect to the sensing assembly, and measured signals are generated. The measured signals are evaluated to identify the first specified pattern of movement of the object, to detect a reference offset location corresponding to an end of the first specified pattern of movement of the object, and to determine, for each of a group of time periods when the object is moving in the second specified pattern of movement, a corresponding location of the object during that time period. A centering operation is performed in response to the identification of the first specified pattern of movement, wherein the centering operation moves an indicator to an initial predetermined reference location on the display screen, wherein the predetermined reference location is then associated with the reference offset location; and sequential locations of the indicator on the display screen are controlled in accordance with the corresponding determined locations of the object relative to the reference offset location.11-25-2010
20100295775INPUT SYSTEM AND METHOD FOR ELECTRONIC DEVICE - An input system and method for enhancing an input interface of an electronic device includes a stylus with a signal transmitting module that sends an interrupt signal to an interrupt module of the electronic device when triggered. The interrupt module identifies the interrupt signal and relays the interrupt signal accordingly in order to perform operations of the electronic device.11-25-2010
20100295774Method for Automatic Mapping of Eye Tracker Data to Hypermedia Content - A system for automatic mapping of eye-gaze data to hypermedia content utilizes high-level content-of-interest tags to identify regions of content-of-interest in hypermedia pages. Users' computers are equipped with eye-gaze tracker equipment that is capable of determining the user's point-of-gaze on a displayed hypermedia page. A content tracker identifies the location of the content using the content-of-interest tags, and a point-of-gaze to content-of-interest linker directly maps the user's point-of-gaze to the displayed content-of-interest. A visible-browser-identifier determines which browser window is being displayed and identifies which portions of the page are being displayed. Test data from plural users viewing test pages is collected, analyzed, and reported.11-25-2010
20110115697USER INFORMATION PROVISION DEVICE, USER INFORMATION PRESENTATION SYSTEM, AND USER INFORMATION PRESENTATION METHOD - There are included: an information provision apparatus to which user's position information is input; and a display device that displays information on the user, the information being output from the information provision apparatus. The information provision apparatus includes: a user information storing unit that stores the information on the user; a display device information storing unit that stores information on an intended display range of the display device; and a display control unit that transmits the information on the user stored in the user information storing unit to the display device when a position of the user falls within the intended display range of the display device.05-19-2011
20100302142SYSTEM AND METHOD FOR TRACKING AND ASSESSING MOVEMENT SKILLS IN MULTIDIMENSIONAL SPACE - Accurate simulation of sport to quantify and train performance constructs by employing sensing electronics for determining, in essentially real time, the player's three dimensional positional changes in three or more degrees of freedom (three dimensions); and computer controlled sport specific cuing that evokes or prompts sport specific responses from the player that are measured to provide meaningful indicia of performance. The sport specific cuing is characterized as a virtual opponent that is responsive to, and interactive with, the player in real time. The virtual opponent continually delivers and/or responds to stimuli to create realistic movement challenges for the player.12-02-2010
20100188331METHODS AND APPARATUSES FOR OPERATING A PORTABLE DEVICE BASED ON AN ACCELEROMETER - Methods and apparatuses for operating a portable device based on an accelerometer are described. According to one embodiment of the invention, a movement of a portable device is detected using an accelerometer attached to the portable device. An orientation of the portable device after the movement is determined based on movement data provided by the accelerometer. It is determined whether the portable device is held by a user after the movement based on the movement data provided by the accelerometer. Locations of the hands of the user for holding the portable device are determined based on the orientation of the portable device. At least one interface that is not within the predicted locations of the hands of the user is activated.07-29-2010
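The abstract above determines an orientation of the device from accelerometer movement data. A common way to do this — stated here as an assumption, since the abstract does not specify the computation — is to resolve the gravity vector from a static accelerometer reading into pitch and roll angles:

```python
import math

# Illustrative assumption: orientation is recovered from a static
# accelerometer sample by resolving gravity into pitch and roll.
def orientation_from_accel(ax, ay, az):
    """Return (pitch, roll) in degrees from one accelerometer reading
    (any consistent unit, e.g. m/s^2 or g)."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll
```

A device lying flat (reading 0, 0, +1 g) yields zero pitch and roll; tilting it toward vertical drives pitch toward ±90 degrees, which is the kind of cue a hand-location heuristic could build on.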
20100207879Integrated Proximity Sensor and Light Sensor - Apparatuses and methods to sense proximity and to detect light. In one embodiment, an apparatus includes an emitter of electromagnetic radiation and a detector of electromagnetic radiation; the detector has a sensor to detect electromagnetic radiation from the emitter when sensing proximity, and to detect electromagnetic radiation from a source other than the emitter when sensing visible light. The emitter may be disabled at least temporarily to allow the detector to detect electromagnetic radiation from a source other than the emitter, such as ambient light. In one implementation, the ambient light is measured by measuring infrared wavelengths. Also described is a fence of non-IR-transmissive material disposed between the emitter and the detector to block electromagnetic radiation emitted by the emitter. Other apparatuses and methods and data processing systems and machine readable media are also described.08-19-2010
20100134411Information processing apparatus and information processing method - An information processing apparatus is provided which includes an image send unit for displaying a 3D image of a remote operation device as a virtual remote controller, at least one imaging unit for taking an image of a user, a 3D image detection unit for detecting a user's motion based on a video taken by the imaging unit, an instruction detection unit for determining whether the user has pressed a predetermined operation button arranged on the virtual remote controller, based on a detection result by the 3D image detection unit and a position of the predetermined operation button arranged on the virtual remote controller displayed by the image send unit, and an instruction execution unit for performing a predetermined processing corresponding to the user-pressed operation button on the virtual remote controller based on a determination result by the instruction detection unit.06-03-2010
20100134410IMAGE DISPLAY DEVICE - An image display device easily displays a stereoscopically two-dimensional image and improves its direction effect and interactivity. A display device includes a display element for displaying an image on a screen and an image transmission element that is set in a light path for a display light component of the image and that transmits the display light component of the image so that a real image of the image is displayed, as a stray image, on an image forming surface positioned in a space on the side opposite to the screen. The display device includes a property specifying element for specifying a property of a detected object positioned in a real space portion including the space where the stray image is displayed, and a control element for controlling the display element so that the stray image changes into a form corresponding to the property of the object specified in advance.06-03-2010
20100188326Ornamental thumb or finger ring with secured hidden contact interface input device - An ornamental thumb or finger secured contact interface input device includes a thumb or finger ring that has a rotatable stylus operatively attached to the ring. The stylus includes an elongated retractable interface contact member including a text tap portion for contacting an interface and entering data. The contact member is retractable into a rotatable housing in a hidden ornamental mode and is fully extendable in an interface engaging mode for entering input into an electronic interface device. The ring is non-continuous, includes an opening, and is bendable so that it may be appropriately sized to the thumb or finger of a user. The device can function as a stylus for inputting to an interface in the interface engaging mode or be worn as an item of jewelry in the hidden ornamental mode.07-29-2010
20130135190ELECTRONIC DEVICE, STORAGE MEDIUM, AND METHOD FOR EXECUTING MULTIPLE FUNCTIONS OF PHYSICAL BUTTON OF THE ELECTRONIC DEVICE - In a method for executing multiple functions of a physical button of an electronic device, multiple functions of the physical button are predefined, and a relationship between each of the multiple functions and each placement state of the electronic device is predefined. One of the placement states of the electronic device is detected when the physical button is pressed. A function of the physical button corresponding to the detected placement state of the electronic device is determined according to the predefined relationship. The electronic device is controlled to execute the determined function of the physical button.05-30-2013
20100302140Operation Device - Provided is an operation device to be held by a user with one hand when used. The operation device includes: a recessed portion formed at a position at which at least one of a thumb and fingers is placed when the user holds the operation device; and a main button which is disposed at a bottom of the recessed portion and has a top surface adjacent to a rim portion forming a side surface of the recessed portion.12-02-2010
20110128218INTERACTIVE INPUT SYSTEM AND BEZEL THEREFOR - An interactive input system comprises at least one imaging device having a field of view looking into a region of interest. At least one radiation source emits radiation into the region of interest. A pliable bezel at least partially surrounds the region of interest. The pliable bezel has a reflective surface in the field of view of said at least one imaging device.06-02-2011
20110128216METHOD AND APPARATUS FOR A USER INTERFACE - In accordance with an example embodiment of the present invention, an apparatus comprises a first body part, a second body part, and at least one hinge coupling said first body part with said second body part, said at least one hinge enabling relative rotational movement of said first body part and said second body part with respect to each other between at least one closed configuration and at least one open configuration, said apparatus having a tablet configuration such that said at least one hinge is retractable into at least one of said first body part and said second body part while in said at least one open configuration.06-02-2011
20100302141Display and Interaction Environment for Mobile Devices - A computing device attached to a full-sized display and one or more user input devices supports a mobile device mating environment. Software modules running on a mobile device may interface with custom firmware or software modules on the computing device to support using the display, keyboard, mouse, and other user input devices of the computing device. The display of the computing device may also be leveraged to display screens and notifications generated by the mobile device. The user input devices of the computing device may be utilized to simplify interaction between the user and the mobile device. This operation may be selected instead of, or in addition to, operating the computing device according to its traditional functions associated with a primary operating system and associated applications of the computing device.12-02-2010
20100302139METHOD FOR USING ACCELEROMETER DETECTED IMAGINED KEY PRESS - A portable apparatus comprising input means arranged to receive user input comprising one or more taps on the portable apparatus. The apparatus is able to detect the taps on the portable apparatus and to determine at least one location of the taps on the portable apparatus. The apparatus is further able to define an imaginary key on the apparatus at the determined location.12-02-2010
20100302138METHODS AND SYSTEMS FOR DEFINING OR MODIFYING A VISUAL REPRESENTATION - A system may track a user's motions or gestures performed in a physical space and map them to a visual representation of the user. The user's gestures may be translated to a control in a system or application space, such as to open a file or to execute a punch in a punching game. Similarly, the user's gestures may be translated to a control in the system or application space for making modifications to a visual representation. A visual representation may be a display of a virtual object or a display that maps to a target in the physical space. In another example embodiment, the system may track the target in the physical space over time and apply modifications or updates to the visual representation based on the history data.12-02-2010
20100302137Touch Sensitive Display Apparatus using sensor input - Described herein is a system that includes a receiver component that receives gesture data from a sensor unit that is coupled to a body of a gloveless user, wherein the gesture data is indicative of a bodily gesture of the user, wherein the bodily gesture comprises movement pertaining to at least one limb of the gloveless user. The system further includes a location determiner component that determines location of the bodily gesture with respect to a touch-sensitive display apparatus. The system also includes a display component that causes the touch-sensitive display apparatus to display an image based at least in part upon the received gesture data and the determined location of the bodily gesture with respect to the touch-sensitive display apparatus.12-02-2010
20110122058Inline control system for therapeutic pad - A controller for use in a therapeutic system having a console disposed in a first housing and a physically separate pad. The controller includes a second housing physically separate from the console and the pad; a processor disposed within the housing and electrically coupled to the console and the pad; a storage medium accessible by the processor and mounted within the second housing; software stored on the storage medium for execution by the processor; a switch coupled to the processor; and a display coupled to the processor. In the illustrative embodiment, the invention further includes a second processor disposed within the housing and electrically coupled to the console and the pad. In a specific implementation, the controller includes software for applying stimulation current to the pad and for regulating heat current applied to the pad. The software includes code for sensing temperature from the pad and for adjusting current to the pad in response to the sensed temperature at the pad and a reference temperature data from the console. The invention enables a thermostimulation system comprising a console disposed in a first housing; a plurality of thermostimulation pads; and a plurality of the inline controllers electrically coupled between the console and a respective one of the pads.05-26-2011
20130141326GESTURE DETECTING METHOD, GESTURE DETECTING SYSTEM AND COMPUTER READABLE STORAGE MEDIUM - A gesture detecting method includes steps of defining an initial reference point in a screen of an electronic device; dividing the screen into N areas radially according to the initial reference point; when a gesture corresponding object moves in the screen and a trajectory of the gesture corresponding object crosses M of the N areas, selecting a sample point from each of the M areas so as to obtain M sample points; and calculating a center and a radius of the trajectory of the gesture corresponding object according to P of the M sample points so as to determine a circular or curved trajectory input. Accordingly, the invention is capable of providing a center, a radius, a direction and an arc angle corresponding to a gesture in real-time without establishing a gesture model.06-06-2013
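The abstract above calculates a center and radius of the trajectory from P of the M sample points. With three non-collinear points, this is the standard circumcircle construction; a minimal Python sketch, where the three-point choice and all names are illustrative assumptions rather than the patent's exact method:

```python
# Illustrative sketch: recover the center and radius of the circle
# through three sample points taken from areas the gesture crossed.

def circle_from_points(p1, p2, p3):
    """Return ((cx, cy), radius) of the circle through three points."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Solve the perpendicular-bisector equations for the circumcenter.
    d = 2 * (x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))
    if d == 0:
        raise ValueError("points are collinear; no unique circle")
    ux = ((x1**2 + y1**2) * (y2 - y3)
          + (x2**2 + y2**2) * (y3 - y1)
          + (x3**2 + y3**2) * (y1 - y2)) / d
    uy = ((x1**2 + y1**2) * (x3 - x2)
          + (x2**2 + y2**2) * (x1 - x3)
          + (x3**2 + y3**2) * (x2 - x1)) / d
    radius = ((ux - x1)**2 + (uy - y1)**2) ** 0.5
    return (ux, uy), radius
```

Running it on the points (1, 0), (0, 1), (-1, 0) recovers the unit circle centered at the origin; the sign of the cross product of successive point differences would then give the trajectory's direction.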
20130141328Dynamic Interpretation of User Input in a Portable Electronic Device - The embodiments describe both interpreting an input event to an electronic device having limited user input resources and modifying the interpretation of that event. The input event interpretation can be based in part on a connection state of the device. In some cases, the interpretation of the input event can also be based upon an indication of a current operating state of the device in addition to or exclusive of the connection state. Furthermore, in some embodiments, an operating state of the portable electronic device can be resolved based in part on the connection state of the portable electronic device.06-06-2013
20110241981Input Routing for Simultaneous USB Connections of Similar Device Types - Methods and devices for accommodating a plurality of interface devices via a Universal Serial Bus (USB) that include: (a) receiving a set of settings for an interface device; (b) generating an input/output (I/O) device handle associated with the interface device; (c) comparing the received interface device settings set with one or more entries of a device-matching criteria database; and (d) if the received set of interface device settings matches an entry of the device matching criteria database, then: (i) generating a device manager handle associated with the interface device; and (ii) spawning a device manager thread based on the generated I/O device handle.10-06-2011
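Steps (c)–(d) above amount to comparing a received settings set against entries of a criteria database and creating a manager handle on a match. A minimal sketch of that comparison, where the field names, criteria entries, and handle representation are invented for illustration and not taken from the patent:

```python
# Illustrative sketch of matching received device settings against a
# device-matching criteria database. All field names are assumptions.

DEVICE_MATCHING_CRITERIA = [
    {"vendor_id": 0x046D, "device_type": "mouse"},
    {"vendor_id": 0x05AC, "device_type": "keyboard"},
]

def matches(settings, criteria):
    """True if every key/value in a criteria entry appears in the settings."""
    return all(settings.get(k) == v for k, v in criteria.items())

def route_device(settings):
    """Return a manager handle for the first matching criteria entry,
    or None if the device settings match no entry."""
    for entry in DEVICE_MATCHING_CRITERIA:
        if matches(settings, entry):
            return f"manager-handle:{entry['device_type']}"
    return None
```

Extra keys in the received settings (serial numbers, interface counts) are ignored by the match, which mirrors the idea of a criteria database listing only the fields that matter for routing.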
20110109542IMAGE DISPLAY DEVICE AND METHOD OF CONTROLLING THE SAME - An image display device includes: a first audio signal input terminal to which a first audio signal is input from a microphone; and a control section adapted to control power supply from a power supply circuit to the microphone based on type information indicative of whether the type of the microphone to be connected to the first audio signal input terminal is a first type, which requires power supply, or a second type, which does not require power supply.05-12-2011
20090066638Association of virtual controls with physical controls - A media application for providing outputs (e.g., audio outputs) in response to inputs received from an input device is provided. The media application may connect input mechanisms of an input device with parameters of channel strips (e.g., which may define output sounds) using an intermediate screen object. The media application may first assign an input mechanism to a screen object, and separately map a screen object to a channel strip parameter. The media application may map a screen object to several channel strips simultaneously such that, based on the value of the screen object, the volume of each of the several channel strips changes. The media application may provide a graphical representation of available channel strips using layers. As the media application accesses a channel strip, the appearance of the portion of the layer associated with the channel strip may change. The media application may also allow the patches, which may include several channel strips, to survive after a new patch is selected instead.03-12-2009
20110018794METHOD AND APPARATUS FOR CONTROLLING MOBILE AND CONSUMER ELECTRONIC DEVICES - Various methods for controlling a device are disclosed, including dynamically selecting a set of mappings defining how a gesture made by a movement of at least one wearable item will be interpreted as one or more commands; determining whether the gesture has a mapping in the set of mappings; and translating the gesture into a command for the device based on the determination. Interpreting movements of a wearable item as gestures associated with a command to control a controlled device is also disclosed, including sensing a movement of the wearable item in context as being indicative of a gesture relating to the command based on the first context. A method for communicating control information by a wearable device is further disclosed, including determining an agreed-upon set of control gestures between first and second devices, wherein the control gestures are performable using the first device and are supportable by the second device; and participating in a control sequence to control the second device via a wireless transmission corresponding to at least one of the control gestures to be performed using the first device.01-27-2011
20110018795METHOD AND APPARATUS FOR CONTROLLING ELECTRONIC DEVICE USING USER INTERACTION - A method and an apparatus for controlling an electronic device according to a user interaction occurring in a space neighboring the electronic device. The method for controlling an electronic device using an input interaction includes: recognizing at least one interaction occurring in a space neighboring the electronic device; and controlling the electronic device corresponding to the at least one interaction.01-27-2011
20110109543METHOD AND APPARATUS FOR DISPLAYING NAVIGATIONAL VIEWS ON A PORTABLE DEVICE - A portable device (05-12-2011
20110109539BEHAVIOR RECOGNITION SYSTEM AND METHOD BY COMBINING IMAGE AND SPEECH - A behavior recognition system and method by combining an image and a speech are provided. The system includes a data analyzing module, a database, and a calculating module. A plurality of image-and-speech relation modules is stored in the database. Each image-and-speech relation module includes a feature extraction parameter and an image-and-speech relation parameter. The data analyzing module obtains a gesture image and a speech data corresponding to each other, and substitutes the gesture image and the speech data into each feature extraction parameter to generate image feature sequences and speech feature sequences. The data analyzing module uses each image-and-speech relation parameter to calculate image-and-speech status parameters. The calculating module uses the image-and-speech status parameters, the image feature sequences, and the speech feature sequences to calculate a recognition probability corresponding to each image-and-speech relation parameter, so as to take a maximum value among the recognition probabilities as a target parameter.05-12-2011
20110109540ACCELEROMETER-BASED TAPPING USER INTERFACE - A CE device for, e.g., displaying the time can incorporate an accelerometer to provide various features and enhancements. For example, tapping of the housing as sensed by the accelerometer may be used for controlling various application modes of the device.05-12-2011
20110001696MANIPULATING OBJECTS DISPLAYED ON A DISPLAY SCREEN - Embodiments of the present invention is directed toward determining a location where a pointing device is directed. In one embodiment, the method includes receiving a message at the computing device from the pointing device. Sensor data is extracted from the message, the sensor data comprising accelerometer data, gyroscope data, or a combination thereof. A position of the pointing device in three-dimensional space is identified. An orientation of the pointing device in three-dimensional space is identified using the sensor data. A location to which the pointing device is directed is determined by utilizing the identified position of the pointing device and the identified orientation of the pointing device, and an object on a display screen at the location where the pointing device is directed is altered.01-06-2011
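The final step described above — combining the identified position and orientation to determine where the device points — reduces to intersecting a ray with the display plane. A minimal geometric sketch, assuming (as an illustration, not the patent's coordinate convention) that the screen lies in the plane z = 0:

```python
# Illustrative sketch: intersect the pointing ray with the screen plane
# z = 0. The coordinate convention is an assumption for this example.

def pointed_location(position, direction):
    """position:  (x, y, z) of the device, z > 0 in front of the screen.
    direction: (dx, dy, dz) unit vector the device points along.
    Returns the (x, y) screen coordinate the ray hits, or None if the
    ray is parallel to the screen or points away from it."""
    px, py, pz = position
    dx, dy, dz = direction
    if dz == 0:
        return None  # pointing parallel to the screen
    t = -pz / dz
    if t < 0:
        return None  # pointing away from the screen
    return (px + t * dx, py + t * dy)
```

For instance, a device held two units in front of the screen and pointing straight at it, `pointed_location((0, 0, 2), (0, 0, -1))`, hits the screen directly ahead at (0, 0); the object at that screen location would then be altered.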
20110001695WEARABLE TERMINAL DEVICE AND METHOD OF CONTROLLING THE WEARABLE TERMINAL DEVICE - A wearable terminal device includes: a head mounted display including a monitor display unit; a line-of-sight detecting unit for detecting a line-of-sight position of a wearer; a sound collecting unit configured to collect sound uttered by the wearer; a sound recognizing unit configured to recognize a sound command from the wearer on the basis of the collected sound; an operation unit configured to receive operation corresponding to the detected line-of-sight position or operation instructed by the recognized sound command; a setting unit configured to set an operation mode corresponding to work of the wearer out of plural operation modes; and a control unit configured to control, in the set operation mode, display of the monitor display unit corresponding to operation by the wearer.01-06-2011
20110032184ORTHOPEDIC METHOD AND SYSTEM FOR MAPPING AN ANATOMICAL PIVOT POINT - A system and method of touchless interaction is provided for resolving a pivot point of an object where direct placement of a sensor at the pivot point is not practical. It applies to situations where the pivot point of a rigid object is inaccessible but remains stationary, while the other end is free to move and is accessible. The system maps the object's pivot point by way of an external sensor that detects constrained motion of the rigid object within a hemispherical banded boundary. It can also detect a geometric pattern and acceleration during the constrained motion to compensate for higher order rotations about the pivot point. Other embodiments are disclosed.02-10-2011
20090231273MIRROR FEEDBACK UPON PHYSICAL OBJECT SELECTION - A highlighting method and an interaction system (09-17-2009
20100156783WEARABLE DATA INPUT DEVICE - A wearable data input device. A garment to be worn on a user's hand includes at least one digit portion for receiving a corresponding digit of the user's hand; a palmar portion; and a dorsal portion. A plurality of contact sensors includes at least one sensor on each digit portion of the garment. Each contact sensor is configured to detect contact between a corresponding portion of the user's hand and another object, and to generate contact signals in accordance with the detected contact. A surface movement sensor detects 2-dimensional (2-D) movement of the user's hand across a surface, and generates corresponding 2-D movement signals in accordance with the detected movement. A console is configured to receive signals from each sensor, and to selectively transmit the received signals to a computer. At least the console is selectively removable from the garment.06-24-2010
20090207131ACOUSTIC POINTING DEVICE, POINTING METHOD OF SOUND SOURCE POSITION, AND COMPUTER SYSTEM - There is disclosed an acoustic pointing device that is capable of performing pointing manipulation without putting any auxiliary equipment on a desk. The acoustic pointing device includes a microphone array that retains plural microphone elements; an A/D converter that converts analog sound pressure data into digital sound pressure data; a buffer that stores the digital sound pressure data; a direction of arrival estimation unit that executes estimation of a sound source direction of a transient sound based on a correlation of the sound between the microphone elements obtained from the digital sound pressure data; a noise estimation unit that estimates a noise level in the digital sound pressure data; an SNR estimation unit that estimates a rate of a signal component based on the noise level and the digital sound pressure data; a power calculation unit that computes and outputs an output signal from the rate of a signal component; an integration unit that integrates the sound source direction and the output signal to specify a sound source position; and a control unit that converts, based on data in a DB of screen conversion, the specified sound source position into one point on a screen of a display device.08-20-2009
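A common way to realize the correlation-based direction-of-arrival estimation mentioned in the abstract above is to convert the inter-microphone arrival delay (found from the cross-correlation peak) into an angle via far-field geometry. This is a standard technique offered as an assumption, not the patent's specific algorithm:

```python
import math

# Illustrative far-field assumption: the source is far enough away that
# the wavefront is planar across a two-microphone pair.
def doa_from_tdoa(delay_s, mic_distance_m, speed_of_sound_m_s=343.0):
    """Convert an inter-microphone arrival delay (seconds) into a
    direction-of-arrival angle in degrees (0 = broadside)."""
    sin_theta = delay_s * speed_of_sound_m_s / mic_distance_m
    sin_theta = max(-1.0, min(1.0, sin_theta))  # clamp numeric noise
    return math.degrees(math.asin(sin_theta))
```

With microphones 10 cm apart, a zero delay means the transient sound (e.g. a finger tap) came from straight ahead, while a delay of about 0.29 ms corresponds to a source fully off to one side.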
20110115699Display control system, display control device, and display control method - Provided is a display control device including a display control unit that, when display information of one content among a plurality of contents is selected on a display screen, creates a next display screen containing display information of at least any of a plurality of contents relevant to the one content, wherein the display information contained in the next display screen is display information of contents according to a selection sequence of a plurality of display information having been selected before among the plurality of contents relevant to the one content.05-19-2011
20110115701COMMUNICATION TERMINAL, CONTROL METHOD, AND CONTROL PROGRAM - A communication terminal includes: an output unit including a display; an input unit for inputting first operation information; a communication device for transmitting the first operation information to the other communication terminal via the network and receiving second operation information from the other communication terminal via the network; a first output control unit for causing the display to output first information, based on the first operation information, in a first area of the display when the first operation information is input, and for causing the display to output second information, based on the second operation information, in the first area of the display when the second operation information is received; and a comparison unit for comparing a time at which the first operation information is input plus a predetermined period of time with a time at which the second operation information is received, and causing the output unit to output third information based on a result of the comparing.05-19-2011
20100171693DISPLAY CONTROL DEVICE, DISPLAY CONTROL METHOD, AND PROGRAM - A display control device includes a cabinet, a detection unit configured to detect an orientation of the cabinet, a display screen, and a display control unit configured to control display of the display screen in accordance with the orientation of the cabinet detected by the detection unit.07-08-2010
20110115702Process for Providing and Editing Instructions, Data, Data Structures, and Algorithms in a Computer System - A method and system for computer programming using speech and one- or two-hand gesture input is described. The system generally uses a plurality of microphones and cameras as input devices. A configurable event recognition system is described, allowing various software objects in a system to respond to speech, hand gesture, and other input. From this input, program code is produced that can be compiled at any time. Various speech and hand gesture events invoke functions within programs to modify programs, move text and punctuation in a word processor, manipulate mathematical objects, perform data mining, perform natural language internet search, modify project management tasks and visualizations, perform 05-19-2011
20110115698DISPLAY APPARATUS, TERMINAL, AND IMAGE DISPLAY METHOD - A display apparatus includes a display unit; a communication unit which communicates with a terminal which displays a personal image provided to the terminal, the terminal being one of a plurality of terminals; and a controller controls the display unit to display a sharing image shared among users of the plurality of terminals on the display unit and changes the sharing image displayed on the display unit in accordance with an input received from the terminal. Accordingly, there is provided a display apparatus which provides a video interface including a sharing image and a personal image, a terminal and an image display method.05-19-2011
20110115703INFORMATION DISPLAY SYSTEM - A head-worn information display system which includes a display panel. A display mode of information displayed on the display panel is switched automatically according to an active state of a user using the information display system, so as to display information appropriate for the active state of the user.05-19-2011
20110115700System for Displaying Images - A system for displaying images includes a transflective display panel and a light source module oppositely disposed thereto. The light source module includes a light guide plate, a plurality of first light-emitting diodes (LEDs), a plurality of second LEDs, and a lighting control unit electrically connected to the pluralities of first and second LEDs. The light guide plate includes a first portion and a second portion corresponding to a first display region and a second display region of the transflective display panel, respectively. Each first LED is a white light-emitting diode and transmits an emitted light therefrom to the first display region by the first portion of the light guide plate. The plurality of second LEDs includes red, green, and blue LEDs and transmits an emitted light therefrom to the second display region by the second portion of the light guide plate.05-19-2011
20130154914Casing - A casing, method and apparatus wherein the casing includes at least one user deformable portion; at least one sensor configured to detect deformation of the at least one user deformable portion; wherein the casing is configured to be removably coupled to an apparatus and is configured so that, in response to detecting the deformation of the user deformable portion of the casing, a control signal for control of the apparatus is provided.06-20-2013
20130154917PAIRING A COMPUTING DEVICE TO A USER - A method for automatically pairing an input device to a user is provided herein. According to one embodiment, the method includes receiving an input from an unpaired input device within an observed scene, and calculating a position of the unpaired input device upon receiving the input. The method further includes detecting one or more users within the observed scene via a capture device, creating a candidate list of the one or more detected users determined to be within a vicinity of the unpaired input device, and assigning one detected user on the candidate list to the unpaired input device to initiate pairing.06-20-2013
20090033618Unit, an Assembly and a Method for Controlling in a Dynamic Egocentric Interactive Space - A portable unit for providing instructions for navigation in menus or controlling equipment, the unit having a user interface and a camera pointing in the general direction of the user. The unit tracks relative movements between the unit and the user and converts the relative movement into the instructions. The unit may be used as a remote control for audio or video equipment, computers, or the like.02-05-2009
20090033619METHOD AND APPARATUS FOR CONTROLLING UNIVERSAL PLUG AND PLAY DEVICE TO REPRODUCE CONTENT IN A PLURALITY OF REPRODUCTION REGIONS ON SCREEN THEREOF - Provided is a method of reproducing content using a universal plug and play (UPnP) device which has a plurality of reproduction regions. According to the method, information regarding the reproduction regions is obtained using newly defined actions before a control point calls an action “Play( ).” Then, a reproduction region for a piece of content is designated according to a user's input. Accordingly, the user can simultaneously enjoy a plurality of pieces of content in desired reproduction regions on a screen of a UPnP device.02-05-2009
20100164864DIRECTION CONTROLLING SYSTEM AND METHOD OF AN ELECTRONIC DEVICE - A direction controlling system and method of an electronic device provides a fingerprint identification device for a user to touch. The electronic device captures a fingerprint template image of a finger. When the finger moves on the fingerprint identification device, the electronic device captures a sequence of fingerprint images of the finger. Furthermore, the electronic device detects a directional movement of the fingerprint according to the sequence of fingerprint images and the fingerprint template image. A scroll bar of the electronic device is controlled to move according to the movement direction and a movement distance calculated by the electronic device.07-01-2010
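The direction-detection abstract above compares a sequence of fingerprint images against a stored template image to find a directional movement. A minimal sketch of one way this comparison could work — plain block matching that finds the (dx, dy) shift minimizing the mean absolute difference; the function name, search radius, and pure-Python form are illustrative, not taken from the filing:

```python
def best_shift(template, img, max_shift=2):
    """Estimate fingerprint motion as the (dx, dy) shift minimizing the
    mean absolute difference between a current image and the template
    (hypothetical block-matching sketch; real devices use optimized code)."""
    h, w = len(template), len(template[0])
    best, best_cost = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            cost = n = 0
            for y in range(h):
                for x in range(w):
                    ty, tx = y + dy, x + dx
                    if 0 <= ty < h and 0 <= tx < w:
                        cost += abs(img[ty][tx] - template[y][x])
                        n += 1
            if n and cost / n < best_cost:
                best_cost, best = cost / n, (dx, dy)
    return best
```

A nonzero best shift across successive frames would then drive the scroll bar in the corresponding direction.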
20100164863Systems, Software, Apparatus and Methods for Managing Out-of-Home Displays - Systems, methods, apparatus, and software for monitoring and managing out-of-home (“OOH”) displays are provided. In one aspect, a system for monitoring an OOH display includes an OOH display device configured to display content to at least one subject. An interaction detector is configured to detect at least one interaction between the OOH display device and a subject, and provide data about such interaction. An input mechanism accepts input signals from the subject, and a display controller device accepts signals from the subject and OOH display device. A data processing and routing mechanism processes and exchanges the data.07-01-2010
20100164862Visual and Physical Motion Sensing for Three-Dimensional Motion Capture - A system includes a visual data collector for collecting visual information from an image of one or more features of an object. The system also includes a physical data collector for collecting sensor information provided by one or more sensors attached to the object. The system also includes a computer system that includes a motion data combiner for combining the visual information and the sensor information. The motion data combiner is configured to determine the position of a representation of one or more of the features in a virtual representation of the object from the combined visual information and sensor information. Various types of virtual representations may be provided from the combined information; for example, one or more poses (e.g., position and orientation) of the object may be represented.07-01-2010
20100164865INFORMATION PROVIDING SYSTEM AND INFORMATION PROVIDING METHOD - An information providing system includes a display unit installed in a table and having on the table, a display screen that displays an image of a product; a detecting unit that detects placement of an object at the position of the displayed image; and a display control unit that causes content related to the product to be displayed on the display screen when placement of the object at the position of the image is detected by the detecting unit.07-01-2010
20110128217DISPLAY SYSTEM CONTROLLING DISPLAY DATA USING REMOTE CONTROLLER HAVING LANGUAGE CONVERSION KEY - Disclosed herein is a display system controlling display of data on a display according to input of a user through a remote controller. The display system includes the remote controller, which has a language conversion key for converting displayed fundamental alphabets into modified alphabets and a transmitter; and the display, which includes a receiver, an LED module, and a controller for analyzing the control signal received by the receiver, selecting one of the modified alphabets belonging to a modified alphabet group corresponding to a fundamental alphabet being expressed by the LED module, according to input through the language conversion key, and controlling display of the selected modified alphabet through the LED module. Accordingly, a user can easily change a displayed language through the language conversion key.06-02-2011
20090244001BACK PLATE, DISPLAY, DISPLAY SYSTEM, METHOD OF SUPPLYING AN ELECTRIC POWER, AND DISPLAY METHOD - According to one embodiment, a back plate includes: a base plate including a principal surface on which a plurality of power supplying portions, a plurality of image signal transmitting portions, and a position detecting portion are disposed, the position detecting portion being configured to detect a position of a display device having a position marker; and a controller including: a detector configured to detect position information and attitude information of the display device, a selector configured to select at least one of the power supplying portions and at least one of the image signal transmitting portions, a power supply controller configured to supply an electric power to the selected power supplying portion, an image signal generator configured to produce an image signal, and an image signal supply controller configured to supply the produced image signal to the selected image signal transmitting portion.10-01-2009
20090243998APPARATUS, METHOD AND COMPUTER PROGRAM PRODUCT FOR PROVIDING AN INPUT GESTURE INDICATOR - An apparatus, method and computer program product are provided for providing an input gesture indicator. Upon detecting one or more tactile inputs, an electronic device may determine one or more characteristics associated with the tactile input(s) (e.g., number, force, hand pose, finger identity). In addition, the electronic device may receive contextual information associated with the current state of the electronic device (e.g., current application operating on the device). Using the characteristic(s) determined and the contextual information received, the electronic device may predict which operations the user is likely to request, or commands the user is likely to perform, by way of a finger gesture. Once a prediction has been made, the electronic device may display an indicator that illustrates the gesture associated with the predicted operation(s). The user may use the indicator as a reference to perform the finger gesture necessary to perform the corresponding command.10-01-2009
20110210913DISPLAY AND WRITING DEVICE - A display and writing device is disclosed that comprises a flexible interface module for providing a writing surface and displaying a captured content; a scanner for scanning the writing surface and producing the captured content accordingly; at least two rollers for driving the flexible interface module to pass the scanner; and an operating module used to operate the flexible interface module, the scanner and the rollers.09-01-2011
20110241983DISPLAY DEVICE WITH PRIVACY FUNCTION, AND METHOD FOR PROVIDING PRIVACY TO A DISPLAY DEVICE - A method is adapted for providing privacy to a display device. The display device is adapted to display a normal image that can be viewed in a user area within which a user is situated. The method includes the steps of: a) configuring the display device to generate information of an interference image; b) configuring a display panel of the display device to display the normal image and the interference image; and c) configuring the display device to direct the interference image toward areas outside the user area. Accordingly, the user situated in the user area is able to view the normal image, while a viewer outside the user area is able to view the interference image, thereby achieving a privacy effect.10-06-2011
20110241982ELECTRONIC DEVICE CAPABLE OF AUTOMATICALLY ADJUSTING FILE DISPLAYED ON DISPLAY AND METHOD THEREOF - An adjusting method for adjusting a file includes determining a position of a first point of focus on the display and recording the content displayed at the position of the first point of focus when the electronic device is in a steady state, determining a position of a second point of focus according to the movement of the electronic device and the position of the first point of focus on the display, and adjusting the content by moving the content displayed on the first point of focus to the position of the second point of focus. An electronic device for implementing the adjusting method is also provided.10-06-2011
20100220054Wearable electrical apparatus - A wearable input device includes a pair of ring-shaped signal electrodes and a current sensor arranged in parallel in the direction of the axis of a finger. The current sensor is provided outside an area sandwiched between the signal electrodes. An alternating current signal is applied between the signal electrodes. When the top end of the finger with this device worn thereon is brought into contact with any other body site, a current flows through the measurement point of the current sensor. When the top end of the finger is not in contact with any other body site, no current flows through the measurement point of the current sensor. Based on the measured current, it is determined whether the finger is in contact with any other body site. A command is outputted to an external device according to the result of the determination.09-02-2010
20090219246TERMINAL DEVICE, TERMINAL SYSTEM AND COMPUTER-READABLE RECORDING MEDIUM RECORDING PROGRAM - A terminal device displays an image such that an operation part is positioned on a right side of the terminal device when image information is first-kind information indicating that an operation part is on a right side or a left side and a right-hand operation mode in which the operation part is operated by a user's right hand is set. A terminal device displays an image such that an operation part is positioned on a left side of the terminal device when image information is the first-kind information and a left-hand operation mode in which the operation part is operated by a user's left hand is set. A terminal device displays an image such that an operation part is positioned on a user's side of the terminal device when image information is second-kind information indicating that the operation part is on a deep side of the terminal device.09-03-2009
20100177037APPARATUS AND METHOD FOR MOTION DETECTION IN A PORTABLE TERMINAL - An apparatus and method for motion detection in a portable terminal are provided. The apparatus preferably includes a state determination unit for receiving, from a sensor unit, first sensing information for determining a motion of the portable terminal. After the motion of the portable terminal is determined, second sensing information is received for determining whether a normal motion applying a motion function has occurred. The portable terminal determines whether the detected motion is a normal motion associated with a motion-related function or an abnormal motion that is not associated with a motion-related function.07-15-2010
20090207130INPUT DEVICE AND INPUT METHOD - The present invention discloses an input device and an input method. The input device comprises: a device for receiving input signals; and a processor circuit for generating control information according to a comparison between a first difference between two input signals in a first direction and a second difference between two input signals in a second direction.08-20-2009
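The input-device abstract above derives control information by comparing a difference along a first direction with a difference along a second direction. A minimal sketch of that comparison, assuming a joystick-like mapping in which the dominant axis wins (the function name and the direction labels are hypothetical, not from the filing):

```python
def control_info(x1, x2, y1, y2):
    """Hypothetical sketch: compare the difference of two input signals in a
    first direction (dx) with the difference in a second direction (dy) and
    emit control information based on which difference dominates."""
    dx = x2 - x1  # first difference (first direction)
    dy = y2 - y1  # second difference (second direction)
    if abs(dx) >= abs(dy):
        if dx == 0:
            return "none"
        return "right" if dx > 0 else "left"
    return "up" if dy > 0 else "down"
```

For example, two samples moving strongly along the first axis and weakly along the second would yield a horizontal control command.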
20090244000USER INTERFACE FOR INTEGRATING DIVERSE METHODS OF COMMUNICATION - An integrated communication interface is provided for composing and sending messages. The interface is multi-configurable to seamlessly switch between different communication methods, e.g., electronic mail, instant messaging, SMS, chat, voice, and the like, without loss of message content. The interface allows a user to begin composing a message to be sent using one communication method, such as electronic mail, and subsequently change the communication method and send the message via a second communication method, such as instant messaging. When the communication method is changed, the user interface may also change to include elements specific to a particular communication method. The integrated communication interface may display information about participants in the communication, such as the participants' presence, i.e., whether they are online and available for communication, and may automatically choose the best method of communication based on the preferences and online presence of the participants.10-01-2009
20090243997Systems and Methods For Resonance Detection - Systems and methods for resonance detection are disclosed. For example, one method for resonance detection includes the step of transmitting an actuator signal to an actuator coupled to a surface of a device. The actuator signal is configured to cause the actuator to output a force to the surface. The method further includes the steps of receiving a response of the surface to the force; determining a resonant frequency of the surface based at least in part on the response; and outputting a signal indicative of the resonant frequency.10-01-2009
20100039377System and Method for Controlling a Virtual Reality Environment by an Actor in the Virtual Reality Environment - A motion capture environment includes at least one sensor-tracker for tracking a location of a tracked object within the motion capture environment and one or more computers collectively operable to generate a virtual reality environment including a virtual control panel having a virtual control that, when actuated, effects a predetermined result in the virtual reality environment; determine a virtual location of the tracked object within the virtual reality environment; and determine when the virtual location of the tracked object coincides with the location of the virtual control to actuate the virtual control. The motion capture environment further includes a display device for displaying the virtual reality environment to an actor within the motion capture environment.02-18-2010
20090322671TOUCH SCREEN AUGMENTED REALITY SYSTEM AND METHOD - An improved augmented reality (AR) system integrates a human interface and computing system into a single, hand-held device. A touch-screen display and a rear-mounted camera allow a user to interact with the AR content in a more intuitive way. A database stores graphical images or textual information about objects to be augmented. A processor is operative to analyze the imagery from the camera to locate one or more fiducials associated with a real object, determine the pose of the camera based upon the position or orientation of the fiducials, search the database to find graphical images or textual information associated with the real object, and display the graphical images or textual information in overlying registration with the imagery from the camera.12-31-2009
20100060569WIRELESS REMOTE CONTROL HAVING MOTION-BASED CONTROL FUNCTIONS AND METHOD OF MANUFACTURE THEREOF - A wireless remote control and a method of manufacturing the same. In one embodiment, the wireless remote control includes: (03-11-2010
20100039378Information Processing Apparatus, Method and Program - An information processing apparatus includes an imaging unit, an icon display control unit causing a display to display an operation icon, a pickup image display processing unit causing the display to sequentially display an input operation region image constituted by, among pixel regions constituting an image picked up by the imaging unit, a pixel region including at least a portion of a hand of a user, an icon management unit managing event issue definition information, which is a condition for determining that the operation icon has been operated by the user, for each operation icon, an operation determination unit determining whether the user has operated the operation icon based on the input operation region image displayed in the display and the event issue definition information, and a processing execution unit performing predetermined processing corresponding to the operation icon in accordance with a determination result by the operation determination unit.02-18-2010
20100039374ELECTRONIC DEVICE AND METHOD FOR VIEWING DISPLAYABLE MEDIAS - A method adapted for an electronic device for viewing media is provided. The method includes: detecting a motion of the electronic device; determining whether the motion of the electronic device is a first control motion or a second control motion; controlling a display unit to display a previous media or a next media if the motion of the electronic device is the first control motion; and controlling the display unit to display a media of a previous album or a next album if the motion of the electronic device is the second control motion.02-18-2010
20100039376SYSTEM AND METHOD FOR REDUCING POWER CONSUMPTION OF A DISPLAY DEVICE - A system and method for reducing power consumption of a display device initializes a counter as zero, controls a video camera to capture an image of an object that is in front of the display device, and determines whether any user's eyes are viewing the display device by analyzing facial features of the captured image. The system and method further controls the display device to work in a normal display mode if any user's eyes are viewing the display device, and increments the counter each second if no user's eyes are viewing the display device. Additionally, the system and method controls the display device to work in a display protection mode if the counter is more than a first predefined threshold number and less than a second predefined threshold number, and in a power reducing mode if the counter is not less than the second predefined threshold number.02-18-2010
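The power-reduction abstract above describes a per-second counter driven by whether any user's eyes are viewing the display, with two thresholds selecting among three display modes. A minimal sketch of that state logic, assuming illustrative threshold values and mode names (the abstract gives neither):

```python
# Mode names are hypothetical labels for the three modes in the abstract.
NORMAL, PROTECT, POWER_SAVE = "normal", "protect", "power_save"

def next_state(counter, eyes_viewing, t1=10, t2=60):
    """Advance the per-second counter and pick a display mode.

    t1, t2: the first and second predefined thresholds (values illustrative).
    Returns (new_counter, mode)."""
    if eyes_viewing:
        return 0, NORMAL          # viewer present: reset, normal display mode
    counter += 1                  # incremented each second without a viewer
    if counter >= t2:
        return counter, POWER_SAVE  # counter not less than second threshold
    if counter > t1:
        return counter, PROTECT     # between the two thresholds
    return counter, NORMAL
```

Called once per second with the eye-detection result, this reproduces the normal → protection → power-reducing progression and the reset when a viewer returns.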
20100039373Hybrid Control Of Haptic Feedback For Host Computer And Interface Device - A hybrid haptic feedback system in which a host computer and haptic feedback device share processing loads to various degrees in the output of haptic sensations, and features for efficient output of haptic sensations in such a system. A haptic feedback interface device in communication with a host computer includes a device microcontroller outputting force values to the actuator to control output forces. In various embodiments, the microcontroller can determine force values for one type of force effect while receiving force values computed by the host computer for a different type of force effect. For example, the microcontroller can determine closed loop effect values and receive computed open loop effect values from the host; or the microcontroller can determine high frequency open loop effect values and receive low frequency open loop effect values from the host. Various features allow the host to efficiently stream computed force values to the device.02-18-2010
20100053078INPUT UNIT, MOVEMENT CONTROL SYSTEM AND MOVEMENT CONTROL METHOD USING THE SAME - Exemplary embodiments of the present invention relate to a movement control method and system of a terminal input unit having a plurality of protrusions. Exemplary embodiments of the present invention disclose a process and an apparatus for controlling each protrusion so that the input unit forms interface modes used for a function control according to a user function of the terminal.03-04-2010
20100039372MOBILE DEVICE DISPLAY IMPLEMENTATION - A mobile communication device contains a plurality of displays. A control module and one of the displays is contained by a first housing. Another housing, bearing a second display, is slidably engageable with the first housing. An optical data transmission mechanism is coupled between the control module and the second display. Data generated by the control module can be converted to optical signals and transmitted through the optical transmission path for control of the second display.02-18-2010
20100053074Display control based on bendable display containing electronic device conformation sequence status - A system includes, but is not limited to: one or more conformation sensor modules configured to direct obtaining information associated with one or more sequences of two or more conformations of one or more portions of one or more regions of a bendable display containing electronic device and one or more display control modules configured to direct controlling display of one or more portions of the bendable display containing electronic device regarding display of second information in response to the information associated with the one or more sequences of two or more conformations of the one or more portions of the one or more regions of the bendable display containing electronic device. In addition to the foregoing, other related method/system aspects are described in the claims, drawings, and text forming a part of the present disclosure.03-04-2010
20100053071Display control of classified content based on flexible display containing electronic device conformation - A method includes, but is not limited to: obtaining first information associated with one or more conformations of one or more portions of one or more regions of a flexible display containing electronic device and controlling display of one or more portions of the flexible display containing electronic device regarding display of second information having one or more classifications in response to the first information associated with the one or more conformations of the one or more portions of the one or more regions of the flexible display containing electronic device. In addition to the foregoing, other related method/system aspects are described in the claims, drawings, and text forming a part of the present disclosure.03-04-2010
20100053075Display control based on bendable interface containing electronic device conformation sequence status - A method includes, but is not limited to: obtaining and controlling display of one or more portions of the bendable interface containing electronic device regarding display of second information in response to the information associated with the one or more changes in one or more sequences of two or more conformations of the one or more portions of the one or more regions of the bendable interface containing electronic device. In addition to the foregoing, other related method/system aspects are described in the claims, drawings, and text forming a part of the present disclosure.03-04-2010
20100066665Tablet Computer Equipped with Microphones - A tablet PC capable of providing continuous utilization of a sound signal collected from a microphone without requiring any user intervention when a use mode thereof has been changed from a PC use mode to a tablet use mode is disclosed. The tablet PC includes a set of microphones to form a microphone array. The tablet PC is able to operate in a sound emphasis mode wherein sound signals collected from the microphones are processed while forming an emphasis space, and to operate in a non-processing mode wherein the sound signals are processed without forming the emphasis space. When a user manipulates a chassis orientation of the tablet PC from a PC use mode to a tablet use mode, the tablet PC operates to process the emphasis space so that the sound signals collected by the microphones can be utilized in the tablet use mode.03-18-2010
20100053070MULTI-DIMENSIONAL OPTICAL CONTROL DEVICE AND A CONTROLLING METHOD THEREOF - A multi-dimensional optical control device and a method thereof are provided. A movable light source can be moved by an external action and produces a light beam. A lens coupled to the light source focuses the light beam. A sensor senses a spot formed on the sensor by the focused light beam, and a data processing circuit coupled to the sensor obtains variations of position, shape, and light intensity with respect to a reference spot. According to such variations of position, shape, and light intensity, the data processing circuit performs a motion control of multiple dimensions.03-04-2010
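The optical-control abstract above derives motion from how the sensed spot differs from a reference spot. A minimal sketch of one plausible reading — lateral motion from the centroid shift, axial motion from the intensity change; the frame representation and function names are assumptions, not details from the filing:

```python
def spot_metrics(frame):
    """Return (cx, cy, total_intensity) of the bright spot in a 2D grid of
    brightness values, using an intensity-weighted centroid."""
    total = sum(v for row in frame for v in row)
    if total == 0:
        return None  # no spot detected
    cx = sum(x * v for row in frame for x, v in enumerate(row)) / total
    cy = sum(y * v for y, row in enumerate(frame) for v in row) / total
    return cx, cy, total

def motion_delta(frame, ref):
    """Compare the current spot with a reference spot (rx, ry, ri):
    centroid shift suggests lateral motion, intensity change suggests
    motion toward or away from the sensor."""
    cx, cy, i = spot_metrics(frame)
    rx, ry, ri = ref
    return cx - rx, cy - ry, i - ri
```

Shape variation (e.g., spot elongation indicating tilt) could be added the same way by comparing second-order moments.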
20100053077NOTEBOOK COMPUTER WITH FORCE FEEDBACK FOR GAMING - A notebook computer with force feedback for gaming is provided. A vibrating plate is disposed on a host, and a vibration generating device is disposed at the bottom of the vibrating plate to generate vibration when a computer game is executed. Thus, a user playing the computer game may feel the force feedback. Additionally, to prevent vibration from affecting the normal operation of the host, a damper is disposed between the host and the vibrating plate to prevent vibration from being transmitted to the host.03-04-2010
20110084901USER INTERFACE DEVICE FOR CONTROLLING A CONSUMER LOAD AND LIGHT SYSTEM USING SUCH USER INTERFACE DEVICE - The invention relates to a user interface device for controlling an electrical consumer, in particular, a light system (04-14-2011
20110175806ELECTRONIC DEVICE FOR USE IN MOTION DETECTION AND METHOD FOR OBTAINING RESULTANT DEVIATION THEREOF - An electronic device utilizing a nine-axis motion sensor module, capable of accurately outputting a resultant deviation including deviation angles in a 3D reference frame is provided. The present invention provides a novel comparison and compensation to accurately obtain a resultant deviation including deviation angles of the electronic device under the presence of external and/or internal interferences including the ones caused by undesirable electromagnetic fields and the ones associated with undesirable external forces and axial accelerations. The output of the nine-axis motion sensor module of the present invention including a rotation sensor, an accelerometer and a magnetometer can be advantageously obtained and compensated with a comparison comparing different states of the motion sensor module such that an updated state associated with the output and the resultant deviation angles of the nine-axis motion sensor module are preferably obtained in an absolute manner with the undesirable external interferences being effectively excluded.07-21-2011
20110175805MOTION CONTROLLABLE DUAL DISPLAY PORTABLE MEDIA DEVICE - Methods and apparatus for interaction with and control of a portable media device through applied motion. In the embodiments described herein, the portable media device can include at least two displays arranged such that only one can be presented at a time. The portable media device can be configured to operate as an electronic book (e-book) having at least one electrophoretic-type display with a refresh time less than the amount of time needed to rotate the e-book to view the refreshed display.07-21-2011
20110175801Directed Performance In Motion Capture System - Techniques for enhancing the use of a motion capture system are provided. A motion capture system tracks movement and audio inputs from a person in a physical space, and provides the inputs to an application, which displays a virtual space on a display. Bodily movements can be used to define traits of an avatar in the virtual space. The person can be directed to perform the movements by a coaching avatar, or visual or audio cues in the virtual space. The application can respond to the detected movements and voice commands or voice volume of the person to define avatar traits and initiate pre-scripted audio-visual events in the virtual space to provide an entertaining experience. A performance in the virtual space can be captured and played back with automatic modifications, such as alterations to the avatar's voice or appearance, or modifications made by another person.07-21-2011
20110175803SYSTEM AND METHOD OF SCREEN MANIPULATION USING HAPTIC ENABLE CONTROLLER - An interface system and a method for manipulating a display are disclosed. The interface system includes a display having a scroll area and a cursor presented thereon, a controller for manipulating a position of the cursor on the display, and a haptic device for generating a plurality of tactile feedbacks to a user through the controller, wherein a movement of the cursor across a peripheral edge of the scroll area of the display results in the haptic device generating a first tactile feedback of the plurality of tactile feedbacks representing a scroll mode, and wherein a movement of the cursor while the cursor is positioned within the scroll area of the display results in the haptic device generating a second tactile feedback of the plurality of tactile feedbacks representing a scroll rate of a visual feedback presented on the display.07-21-2011
20110095975MOBILE TERMINAL - A mobile terminal includes a body including a flexible portion, a display unit provided to the body, a sensing unit provided to the body and generating an electric signal in response to bending of the body, and a controller recognizing the electric signal and controlling the display unit according to the electric signal generated by the bending of the body.04-28-2011
20110084898MEMORY SHAPE ELEMENT FOR FLEXIBLE OLED DISPLAY SCREEN - An OLED display has one or more shape memory wires disposed along respective edges of the display and energizable under control of a processor to flatten the display from a rolled or folded configuration.04-14-2011
20110084899IMAGE DISPLAY APPARATUS AND METHOD FOR OPERATING THE SAME - An image display apparatus and a method for operating the same are disclosed. The image display apparatus includes a display, a user input interface for receiving a control signal from a remote controller and processing the received control signal, a network interface for transmitting or receiving data over a network, a controller for controlling a pointer on the display according to the control signal received from the remote controller, and a platform for controlling data transmission or reception over the network according to the control signal received from the remote controller. The platform includes an Operating System (OS) kernel and an application layer that runs on the OS kernel, with the application layer including an installable or deletable application downloaded over the network.04-14-2011
20110084897ELECTRONIC DEVICE - An electronic device is disclosed. The electronic device comprises a distance sensor for sensing a distance between the electronic device and a face of a user of the electronic device, an image sensor for providing an image of the face of the user, and a display for displaying text and/or graphical objects. The electronic device further comprises a control unit operatively connected to the display for controlling the displaying of text and/or a graphical object thereon, to the distance sensor for receiving distance data indicative of said distance, and to the image sensor for receiving image data representing said image. The control unit is adapted to control a font size of said text and/or a size of said graphical object based on the distance data and/or the image data.04-14-2011
20100127967MOBILE USER INTERFACE WITH ENERGY HARVESTING - A mobile user interface with energy harvesting is presented. In one embodiment, the mobile user interface comprises a striker, a piezoelectronic element to generate electric energy in response to being struck by the striker under control of an elastic mechanism, and a transmitter coupled to use the electric energy to transmit a signal wirelessly.05-27-2010
20110074673METHOD, APPARATUS FOR SENSING MOVED TOUCH AND COMPUTER READABLE RECORD-MEDIUM ON WHICH PROGRAM FOR EXECUTING METHOD THEREOF - Provided is a moved touch sensing method. The moved touch sensing method according to an exemplary embodiment of the present invention includes: acquiring channel information corresponding to a touch inputted into a contact region; converting each of first channel information and second channel information acquired as the touch moves into order information according to a predetermined order; and outputting a movement distance clock signal representing a movement distance of the touch on the basis of the converted order information. According to exemplary embodiments of the present invention, it is possible to reduce a burden of a control processor by outputting information regarding a movement distance and a movement direction of a moved touch by using two clock signals.03-31-2011
20110074672USER INTERFACE DEVICE AND METHOD FOR CONTROLLING A CONNECTED CONSUMER LOAD, AND LIGHT SYSTEM USING SUCH USER INTERFACE DEVICE - The invention relates to a user interface device for controlling an electrical consumer, in particular a light system. Further, it relates to a light system using such a user interface device. Moreover, it relates to a method for controlling such a light system using a user interface device. To provide a user interface device, a light system, and a method for controlling a consumer load that provide feed-forward or feedback information facilitating easy and intuitive use of the user interface device when controlling a light system, a user interface device for controlling a connected light system (03-31-2011
20110074669Illuminating Controller having an Inertial Sensor for Communicating with a Gaming System - A controller for use in interfacing with a computer game. The controller includes a handle having at least one button, and a spherically shaped object connected to only one end of the handle. The spherically shaped object is defined from a translucent plastic material. Further included as part of the controller is an inertial sensor or accelerometer. The accelerometer may, as defined herein, be part of an inertial sensor. The controller also has an illuminating means defined within the spherically shaped object. A circuit is provided for interpreting input data from the at least one button and the inertial sensor, and is configured for communicating data wirelessly. The circuit is further configured to interface with the illuminating means to trigger illumination of the spherically shaped object to switch from an un-illuminated color to an illuminated color.03-31-2011
20110074665INFORMATION PROCESSING PROGRAM HAVING COMPUTER-READABLE STORAGE MEDIUM THEREIN AND INFORMATION PROCESSING APPARATUS - A game apparatus calculates a first evaluation value based on the difference between the time when the load value detected by a load controller becomes the maximum and the time when the velocity of the center of gravity, which represents the velocity of movement of the position of the center of gravity, becomes the maximum. The game apparatus calculates a second evaluation value based on the velocity of load, which represents the degree of increase in the load in a predetermined time period, and the velocity of the center of gravity. The game apparatus calculates a third evaluation value based on the path of the position of the center of gravity. The game apparatus calculates the amount of slice based on the first through third evaluation values.03-31-2011
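The evaluation scheme in the abstract above combines peak-timing and path features of a measured load signal and a center-of-gravity trace. A minimal Python sketch of the first evaluation value — the time gap between the load peak and the center-of-gravity speed peak — is shown below, assuming evenly spaced samples; the function name, the sampling model, and the lack of any scoring scale are illustrative assumptions, not the patent's actual formula.

```python
def argmax_index(values):
    """Index of the maximum sample in a list."""
    return max(range(len(values)), key=values.__getitem__)

def first_evaluation_value(load, cog_speed, dt):
    """Time gap (seconds) between the peak of the load signal and the
    peak of the center-of-gravity speed signal, assuming both were
    sampled at the same fixed interval `dt`."""
    gap = abs(argmax_index(load) - argmax_index(cog_speed)) * dt
    return gap
```

A smaller gap would indicate that the maximum load and the fastest weight shift occurred nearly together.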
20110095974DISPLAY DEVICE AND METHOD OF CONTROLLING DISPLAY DEVICE - A display device includes a flexible substrate, a display unit including a plurality of light-emitting elements arranged at the substrate and configured to display an image according to an image signal, a displacement sensor provided to a front surface or a back surface of the substrate and configured to detect a curved state of the substrate, and a pixel shift control unit configured to control pixel shifting of the image displayed in the display unit when a curve of the substrate is detected by the displacement sensor.04-28-2011
20110069004REMOTE CONTROLLER SUPPORTING SYSTEM AND METHOD FOR PROVIDING WEB SERVICE OPTIMIZED FOR REMOTE CONTROLLER - Provided is a remote controller supporting system and method that may provide a web service optimized for a remote controller of a user by displaying a webpage with different functions according to functions of each remote controller. The remote controller supporting system may include: a webpage storage unit to store a different webpage for each remote controller according to functions supportable by each remote controller; and a function controller to transmit remote controller group information containing remote controllers and webpage information corresponding to each of the remote controllers, and to transmit a webpage corresponding to a remote controller used in a system receiving the remote controller group information according to a request of the system.03-24-2011
20110069003STORAGE MEDIUM STORING INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD - A game apparatus includes a CPU, and the CPU judges a motion of a player on the basis of a cycle of a load value input from a load controller, and selectively displays an animation of a player object according to the motion. In a case that the motion by the player is a large-flapping motion, the CPU changes an updating velocity of an animation frame according to an arm-raising motion and an arm-lowering motion of the large-flapping motion. On the other hand, in a case that the motion by the player is a small-flapping motion, the CPU changes the updating velocity of the animation frame according to only the arm-lowering motion of the small-flapping motion. Thus, the motion of the player and the animation are synchronized.03-24-2011
20110074668CONTROL DEVICE - The present system relates to a device for imparting control to an application program. The device comprises a first sensor for carrying out a brainwave measurement when said first sensor is in contact with a user's head. The device further comprises a second sensor adapted to generate an output signal obtained from a measurement by the second sensor, which can for instance be a gyro sensor or a camera. The device is operable to use the output signal for imparting control to the application program in case the brainwave measurement falls outside a given interval of brainwave measurement values. The present system thus discloses a device that has increased accuracy compared to prior-art devices.03-31-2011
20110074671IMAGE DISPLAY APPARATUS AND CONTROL METHOD THEREOF, AND COMPUTER PROGRAM - This invention provides a display apparatus which mounts a tilt sensor, and can control whether or not the user makes an image feed operation based on the tilt. An image display apparatus includes a display unit which displays image data recorded in a recording medium, an instruction accepting unit which accepts an instruction to make the image feed operation according to the tilt of the image display apparatus from the user, a tilt detection unit which detects the tilt of the image display apparatus with respect to a predetermined direction, and a display control unit which controls the display unit to display and switch the image data in accordance with a change in tilt detected by the tilt detection unit, when the instruction accepting unit accepts the instruction and the tilt detection unit detects the change in tilt.03-31-2011
20110074670Providing Input and Output for a Mobile Device - Providing input and output for a mobile device may be provided. At a mobile device, input may be received from at least one of a plurality of remote input devices. The plurality of remote input devices may be remote from the mobile device. The mobile device may have at least one local input device. The at least one of the plurality of remote input devices may have a greater form factor than the local input device. Next, the received input may be processed. The mobile device may transmit the output to at least one of the plurality of remote output devices. The plurality of remote output devices may be remote from the mobile device. The mobile device may have at least one local output device. The at least one of the plurality of remote output devices may have a greater form factor than the local output device.03-31-2011
20110074667SPECIFIC USER FIELD ENTRY - There is disclosed a method for controlling a user input in a computer system including a display, the computer system being adapted to receive inputs from a plurality of user input devices under the control of a system input device, the method comprising: a. detecting selection, by the system input device, of a modifiable displayed item; b. receiving modification data from at least one of the plurality of user input devices; and c. modifying the modifiable displayed item with the received modification data.03-31-2011
20120146893INTERCHANGEABLE OVERLAY FOR AMUSEMENT DEVICES - An overlay for a nonportable amusement device is provided. The nonportable amusement device has a housing, a display, a memory and a controller. The overlay includes one or more panels selectively attachable to and removable from the housing of the nonportable amusement device.06-14-2012
20120146896Continuous Determination of a Perspective - In a method and operating element for establishing an angle of view for an observer with respect to a two or three-dimensional object, which is displayed on an output device, the establishment of the angle of view takes place by control on a simple circular disc. A point on the disc is converted to a position on a virtual sphere. The respective angle of view with respect to the object is established by the axis, determined by the calculated position on the virtual sphere and the sphere center.06-14-2012
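The disc-to-sphere control described above resembles the classic arcball technique: a 2-D point on a circular control surface is lifted onto a virtual unit sphere, and the view angle follows from the lifted point. A minimal Python sketch under that assumption is below; the exact projection and angle convention used by the patent are not specified, so both are illustrative.

```python
import math

def disc_to_sphere(x, y):
    """Lift a point on the unit control disc onto a virtual unit sphere
    centered at the origin (arcball-style mapping, assumed here)."""
    r2 = x * x + y * y
    if r2 <= 1.0:
        # Inside the disc: project straight up onto the sphere surface.
        z = math.sqrt(1.0 - r2)
        return (x, y, z)
    # Outside the disc: clamp to the sphere's equator.
    r = math.sqrt(r2)
    return (x / r, y / r, 0.0)

def view_angles(x, y):
    """Derive azimuth/elevation view angles (radians) toward the object
    from the lifted sphere point and the sphere center."""
    sx, sy, sz = disc_to_sphere(x, y)
    azimuth = math.atan2(sx, sz)
    elevation = math.asin(max(-1.0, min(1.0, sy)))
    return azimuth, elevation
```

Dragging across the disc thus sweeps the view continuously, with the disc center corresponding to the head-on view.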
20110069002OPTO-ELECTRONIC SYSTEM FOR CONTROLLING PRESENTATION PROGRAMS - An input device for controlling a presentation program that is being run on a remote computing device is provided. The input device includes a first optical sensor configured to be activated by exposure to a focused beam of light and a second optical sensor configured to be activated by exposure to a focused beam of light. An RF communication device wirelessly delivers the instructional signals to the remote computing device to advance or reverse the presentation program.03-24-2011
20110043443Systems and methods for utilizing personalized motion control in virtual environment - Techniques for controlling motions using motion recognizers generated in advance by users are described. According to one embodiment, the motion recognizers created by end users are utilized to control virtual objects displayed in a virtual environment. By manipulating one or more motion-sensitive devices, end users can command what the objects do in the virtual environment. Motion signals from each of the motion-sensitive devices are recognized in accordance with the motion recognizers created in advance by the users. One or more of the motion signals are at the same time utilized to tune the motion recognizers or create additional motion recognizers. As a result, the motion recognizers are constantly updated to be more accommodating to the user(s).02-24-2011
20110057874Methods and Systems for Lingual Movement to Manipulate an Object - An intra-oral system is disclosed for assisting an individual in developing intra-oral muscle control and strength, and for facilitating typing of alphanumeric characters on a virtual keyboard. The system may also be used to enable an individual having limited use of the upper extremities to control an electrical apparatus such as a wheelchair, a bed or a light fixture. The intra-oral system includes a mouthpiece having a plurality of cells embedded therein. The cells are configured to receive pressure applied by the tongue of an individual. Movement of the tongue over and against the cells causes an object to be moved over a display. In one embodiment, the object is moved through an obstacle course or over a simulated track as part of a therapeutic regimen. In another embodiment, the object is moved over alphanumeric characters on a digital keyboard, and selected characters are typed by operation of the mouthpiece. In this manner, textual matter may be produced and stored by the user, and then sent via electronic means using a wired or wireless communication network. In yet another embodiment, a character or icon on the display is selected and activated to manipulate an electrical apparatus. A method for moving an electrical apparatus using a mouthpiece controlled through lingual movement is also provided.03-10-2011
20110050562VISUALIZATION CONTROLS - Implementations of visualization controls are described. Some techniques described herein enable a user to interact with a geoscience object on display. In one possible embodiment, simultaneous movements of two or more user-controlled points are tracked using a motion monitoring system, such as cameras. The movement of the points is then interpreted as an interaction with the geoscience object and results in a corresponding alteration of the display of the geoscience object. One example interaction includes a compound manipulation (i.e. translation, rotation, scaling and/or skewing) of the geoscience object.03-03-2011
20110043446COMPUTER INPUT DEVICE - Computer input apparatus comprising: an image capture device; and a marker member comprising at least two reference indicia, at least a first reference indicium being arranged to emit or reflect light having a first spectral characteristic, and at least a second reference indicium being arranged to emit or reflect light having a second spectral characteristic different from the first spectral characteristic, the image capture device being arranged to distinguish light of said first spectral characteristic from light of said second spectral characteristic thereby to distinguish the at least a first reference indicium from the at least a second reference indicium, the apparatus being configured to capture an image of the at least two reference indicia and to determine by means of said image a position and orientation of the marker member with respect to a reference frame.02-24-2011
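With two spectrally distinct indicia resolved to separate image points, a simple 2-D pose estimate follows from their midpoint and the angle of the line joining them. The Python sketch below illustrates only this in-plane case; the patent's full position-and-orientation recovery with respect to a reference frame is not reproduced, and the color labels are assumptions.

```python
import math

def marker_pose(first_indicium_pt, second_indicium_pt):
    """Estimate the 2-D image-plane pose of a two-indicium marker from
    the pixel coordinates of its two spectrally distinct indicia."""
    (x1, y1), (x2, y2) = first_indicium_pt, second_indicium_pt
    position = ((x1 + x2) / 2.0, (y1 + y2) / 2.0)   # marker midpoint
    angle = math.atan2(y2 - y1, x2 - x1)            # in-plane rotation
    separation = math.hypot(x2 - x1, y2 - y1)       # proxy for scale/range
    return position, angle, separation
```

Because the two indicia are distinguishable by spectrum, the angle is unambiguous — swapping the points flips it by pi, which is exactly the ambiguity the two-color scheme removes.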
20110043444PORTABLE ELECTRONIC DEVICE - An exemplary portable electronic device includes a main body, a cover movably connected with the main body, a piezoelectric sensing unit, and a signal processing unit. The piezoelectric sensing unit is partially positioned on the main body and the cover. The piezoelectric sensing unit deforms elastically due to the relative displacement of the cover and the main body, generating a corresponding command signal. The signal processing unit is electrically connected to the piezoelectric sensing unit, and is capable of actuating different modes of the portable electronic device according to the command signal.02-24-2011
20110043442Method and System for Displaying Images on Moveable Display Devices - A display system includes a display device having multiple possible poses, including a neutral pose. A physical constraint maintains the display device in the neutral pose absent an application of an external force. A sensor measures a magnitude and direction of a displacement of the display device to a displaced pose due to the application of the external force. A rendering engine then renders an image on the display device according to the magnitude and direction of the displacement, even while the display device remains in the displaced pose.02-24-2011
20100302136Method and apparatus for displaying three-dimensional stereo images viewable from different angles - A method and apparatus for displaying three-dimensional stereo images using a screen that displays multiple images, each representing objects seen from a particular angle, and a mask placed in front of the screen, containing holes or transparent areas, that allows multiple images to be viewed simultaneously, but only one image from any given direction.12-02-2010
20110025599DISPLAY APPARATUS AND METHOD FOR CONTROLLING DISPLAY APPARATUS - A display apparatus and a method for controlling the display apparatus are provided. If a display apparatus is in a power off state and an interface is connected to a cable, an output of a cable connection sensing signal is blocked. Accordingly, the display apparatus may be able to report its power off state to an external device.02-03-2011
20130162523SHARED WIRELESS COMPUTER USER INTERFACE - A system and method for use in a point-to-point enabled device includes establishing a wireless point-to-point connection with a remote device, transmitting a request message to the remote device requesting information to clone a user interface of the remote device at the point-to-point enabled device, receiving at the point-to-point enabled device the information to clone the user interface of the remote device, displaying on a display associated with the point-to-point enabled device a cloned image of the user interface of the remote device, receiving, at the point-to-point enabled device, user-entered input data associated with an application running on the remote device, and transmitting the user-entered input data from the point-to-point enabled device to the remote device via the wireless point-to-point connection. The transmitted user-entered input data is usable by the remote device as if that data was received at the remote device from a user of the remote device.06-27-2013
20100295776ePaper Stamp - A method and apparatus are provided for stamping a piece of ePaper. A grid is positioned within a selected distance to a first side of the piece of ePaper. A grounding pin conductively connects a conductive backing plate located on a second side of the piece of ePaper. The grounding pin completes a voltage path from the grid through the piece of ePaper to the conductive backing plate. A voltage is supplied to the grid, and supplying the voltage changes the appearance of the piece of ePaper to form a stamped image.11-25-2010
20100295771CONTROL OF DISPLAY OBJECTS - Disclosed herein are systems and methods for controlling display objects. Particularly, a body part of a user may move, and the movement detected by a capture device. The capture device may capture images or frames of the body part at different times. Based on the captured frames, velocities of the body part may be determined or at least estimated at the different times. A blend velocity for the body part may be determined based on the different velocities. Particularly, for example, the blend velocity may be an average of the velocities of the body part over a period of time. A display object may then be controlled or moved in accordance with the blend velocity. For example, an avatar's body part may be moved in the same direction as a recent captured frame of the user's body part, and at the blend velocity.11-25-2010
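The blend-velocity idea above — estimating per-frame velocities of a tracked body part and averaging them over a recent window before driving the avatar — can be sketched in a few lines of Python. The window length and the 1-D position model are illustrative assumptions; the capture device would supply 2-D or 3-D joint positions per frame.

```python
def frame_velocities(positions, timestamps):
    """Per-frame velocity estimates (units/s) from positions captured
    at the given timestamps (seconds)."""
    return [
        (positions[i + 1] - positions[i]) / (timestamps[i + 1] - timestamps[i])
        for i in range(len(positions) - 1)
    ]

def blend_velocity(positions, timestamps, window=3):
    """Average the most recent `window` velocity estimates — the
    'blend velocity' used to move the display object smoothly."""
    velocities = frame_velocities(positions, timestamps)
    recent = velocities[-window:]
    return sum(recent) / len(recent)
```

Averaging suppresses frame-to-frame jitter from the capture device while preserving the overall direction of the most recent motion.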
20100295769Device for Controlling an External Unit - A device for controlling a click command controlled external unit including a portable head mounted frame, a click command detector mounted on the head mounted frame and adapted to sense tension changes of at least one muscle in the face of the user in order to detect when the user provides a click command, and a click command transmitter adapted to transmit information about detected click commands to the external unit.11-25-2010
20100117956DISPLAY SYSTEM AND METHOD FOR CONTROLLING AN ON-SCREEN DISPLAY WITHIN A DISPLAY SCREEN - A display system is used for controlling an on-screen display within a display device. The on-screen display is text information including a literal string formed by combining a number of letters. The display system includes a memory device for storing pieces of letter information, and a controller that downloads a coding chart upon initialization of the display device. The coding chart includes string-forming codes and letter-forming codes, as well as groups of string-forming codes for encoding different literal strings. The string-forming codes and the letter-forming codes correspond to the letter information in the memory. Upon receipt of an external command corresponding to a specific string-forming code, the controller fetches a letter-forming code from the coding chart based on the specific string-forming code and letter information from the memory device based on the respective letter-forming code, thereby encoding and displaying the literal string over the display screen.05-13-2010
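The two-level lookup described above — a string-forming code resolving to a sequence of letter-forming codes, which in turn resolve to stored letter information — is a straightforward chained table lookup. The Python sketch below uses a hypothetical coding chart and letter memory for illustration; in the patent, the chart is downloaded at display initialization rather than hard-coded.

```python
# Hypothetical coding chart: string-forming code -> letter-forming codes.
CODING_CHART = {
    0x01: [0x10, 0x11, 0x12, 0x12, 0x13],
}

# Hypothetical letter memory: letter-forming code -> letter information.
LETTER_MEMORY = {0x10: "H", 0x11: "E", 0x12: "L", 0x13: "O"}

def render_osd_string(string_code):
    """Resolve a string-forming code into its literal string by chaining
    the coding chart and the stored letter information."""
    letter_codes = CODING_CHART[string_code]
    return "".join(LETTER_MEMORY[code] for code in letter_codes)
```

The benefit of the indirection is that an external command only carries one small string-forming code, while the display reconstructs the full literal string locally.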
20110254761OPTICAL NAVIGATION DEVICES - An optical navigation device, such as that used on a computer or mobile communications device, includes a radiation source capable of producing a beam of radiation. A sensor receives an image. An optical element identifies movement of an elastic object on a first surface to thereby enable a control action to be carried out. The device further determines the relative pressure on a first surface by an elastic object based upon the value of an optical parameter, such as the average radiation intensity of the image received at the sensor. The device may be arranged to operate as a push button or a linear pressure sensor.10-20-2011
20110175802METHOD AND SYSTEM FOR OPERATING ELECTRIC APPARATUS - A method and a system for operating an electric apparatus are provided. In the present invention, first, an image capturing unit is enabled for capturing an image. Next, a palm component on the image is detected. Afterwards, a center of mass in the palm component is calculated according to a principal component analysis (PCA) algorithm, so as to use the center of mass to simulate a cursor. Then, a width of the palm of the palm component is calculated. Finally, the width of the palm is compared with a threshold so as to execute a click action if the width of the palm is greater than the threshold.07-21-2011
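The palm-driven cursor-and-click pipeline above reduces to: find the palm's center in the image, measure the palm's width, and treat a width above a threshold as a click. The Python sketch below operates on a binary palm mask; it uses a plain centroid in place of the abstract's PCA-based center of mass, which is a simplifying assumption.

```python
def palm_cursor_and_click(mask, width_threshold):
    """Given a binary palm mask (list of rows of 0/1 values), return a
    cursor position from the mask centroid and a click flag set when
    the palm width exceeds the threshold."""
    pts = [(x, y) for y, row in enumerate(mask)
                  for x, v in enumerate(row) if v]
    if not pts:
        return None, False          # no palm detected in this frame
    cx = sum(p[0] for p in pts) / len(pts)
    cy = sum(p[1] for p in pts) / len(pts)
    # Palm width: horizontal extent of the widest foreground row.
    width = max(
        max(x for x, v in enumerate(row) if v)
        - min(x for x, v in enumerate(row) if v) + 1
        for row in mask if any(row)
    )
    return (cx, cy), width > width_threshold
```

An open hand appears wider than a partly closed one, so the width test gives a rough open/closed discriminator for triggering the click.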
20110248913MOTIONBEAM INTERACTION TECHNIQUES FOR HANDHELD PROJECTORS - An image projection system may be configured to project objects which respond to movements and gestures made using a handheld projector, as well as to methods for controlling the projected objects based on such user input. For example, users may interact with and control objects in a projection frame by moving and/or gesturing with the handheld projector. Further, objects or characters projected using the handheld projector may be configured to perceive and react to physical objects in the environment. Similarly, elements of the physical environment may be configured to respond to the presence of the projected objects or characters in a variety of ways.10-13-2011
20110248914System and Method for Virtual Touch Typing - Systems, methods, and products are described for enabling a user to enter data into a device without a keyboard. A virtual keyboard in accordance with the invention may include a sensor to detect actual or intended finger movements or other changes in the user's physiology, an element that generates a sequence of ambiguous pseudo-words based on the physiological changes, and a translator that translates the pseudo-words into words in a natural language and provides the natural language words to the device or to a data storage unit. The device typically may be a computer, electronic notepad, personal digital assistant, telephone, or other electronic device.10-13-2011
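The pseudo-word translation step above is analogous to predictive-text (T9-style) disambiguation: each ambiguous sensed symbol maps to several possible letters, and a lexicon filters the candidate expansions down to real words. The Python sketch below assumes a tiny stand-in lexicon and symbol-to-letter mapping; both are hypothetical, and a real translator would use a full natural-language dictionary and language model.

```python
from itertools import product

# Hypothetical lexicon and symbol-to-letters mapping, for illustration.
LEXICON = {"cat", "bat", "act"}
SYMBOL_LETTERS = {1: "abc", 2: "cat", 3: "t"}

def translate_pseudo_word(symbols):
    """Expand an ambiguous pseudo-word (sequence of symbol codes) into
    all candidate letter strings and keep those found in the lexicon."""
    candidates = (
        "".join(combo)
        for combo in product(*(SYMBOL_LETTERS[s] for s in symbols))
    )
    return sorted(word for word in candidates if word in LEXICON)
```

When several words survive, the translator would need further context (or user confirmation) to pick one, which is why the abstract routes output through a translation stage rather than emitting letters directly.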
20110248915METHOD AND APPARATUS FOR PROVIDING MOTION LIBRARY - A method and an apparatus for providing a motion library, adapted to a service-end device to provide a customized motion library supporting recognition of at least one motion pattern for a user-end device. At least one sensing component disposed on the user-end device is determined. At least one motion group is determined according to the determined sensing components, wherein each motion group comprises at least one motion pattern. The at least one motion pattern is selected, a motion database is queried to display a list of the motion groups corresponding to the selected motion patterns, and the motion groups are selected from the list. The motion patterns belonging to the motion groups are selected to re-compile the customized motion library, which is provided for the user-end device, so as to enable the user-end device to recognize the selected motion patterns.10-13-2011
20110248912DISPLAY DEVICE HAVING THE CONNECTION INTERFACE WITH ILLUMINING AND INDICATING FUNCTIONS - A display device includes a connection interface and a light source. The light source illuminates the connection interface and indicates I/O ports of the connection interface. The light source can illuminate all I/O ports of the connection interface or the I/O port of the connection interface corresponding to the input signal source selected by the user.10-13-2011
20110248911STEREOSCOPIC IMAGE DISPLAY SYSTEM AND METHOD OF CONTROLLING THE SAME - The present invention discloses a stereoscopic image display system and a method of controlling the same. An eye tracking module locates current 3D spatial positions of the viewer's eyes, and generates the information of both left and right eyes' current 3D spatial positions. A control module controls a display device that can alter the direction of the light outputted, and outputs images on the display device in time multiplex mode. The light containing the left eye image is outputted to the position of left eye instead of right eye at one time point, and the light containing the right eye image is outputted to the position of right eye instead of left eye at another time point, so that a stereoscopic image is perceived according to the parallax theory. The present invention enlarges the visual range of stereoscopic image and achieves a better stereoscopic image visual experience for viewers.10-13-2011
20100097314DRAWING CONTROL APPARATUS AND DRAWING CONTROL METHOD OF ELECTRONIC PAPER - When a user inputs an instruction to display a subsequent screen from an operation unit 04-22-2010
20100097311Reaction Apparatus, Fuel Cell System and Electronic Device - The invention relates to a reaction apparatus that efficiently heats a reaction portion, and a fuel cell system and an electronic device that include such a reaction apparatus. A reaction apparatus (04-22-2010
20100097313LID STRUCTURE, APPARATUS AND METHOD FOR DISPLAYING GRAPHICAL INFORMATION - A lid structure, apparatus and method for displaying graphical information uses beams of coherent light that are emitted in a scanning manner to project the beams of coherent light onto a display surface to form the graphical information on the display surface.04-22-2010
20100053072Application control based on flexible interface conformation sequence status - A method includes, but is not limited to: obtaining information associated with one or more changes in one or more sequences of two or more conformations of one or more portions of one or more regions of the flexible interface and coordinating the one or more changes in one or more sequences of two or more conformations of one or more portions of one or more regions of the flexible interface with one or more commands. In addition to the foregoing, other related method/system aspects are described in the claims, drawings, and text forming a part of the present disclosure.03-04-2010
20120200490GAZE DETECTION APPARATUS AND METHOD - A gaze detection apparatus is disclosed. The gaze detection apparatus includes: a gaze detection section that detects gaze of a target person; a display section that includes a display screen for displaying an image; a gaze position determination section that determines, based on a result of detection by the gaze detection section, whether or not the display screen lies in the gaze of the target person; and a first display control section that displays a first detection result image at an intersection point between the gaze of the target person and the display screen when the gaze position determination section determines that the display screen lies in the gaze of the target person.08-09-2012
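Finding the intersection point between the gaze and the display screen is a ray-plane intersection. The Python sketch below assumes the screen occupies the rectangle [0, screen_w] x [0, screen_h] in the z = 0 plane — a simplifying coordinate-frame assumption, not the patent's actual geometry.

```python
def gaze_screen_intersection(origin, direction, screen_w, screen_h):
    """Intersect a gaze ray (eye position + gaze direction) with the
    display plane z = 0 and return the on-screen point where a
    detection-result image could be drawn, or None if the gaze misses
    the screen."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz == 0:
        return None            # gaze parallel to the screen plane
    t = -oz / dz
    if t <= 0:
        return None            # screen lies behind the viewer
    x, y = ox + t * dx, oy + t * dy
    if 0 <= x <= screen_w and 0 <= y <= screen_h:
        return (x, y)
    return None                # intersection falls outside the screen
```

Returning None for the miss cases is what lets the apparatus decide whether the screen "lies in the gaze" before drawing anything.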
20120200491GESTURE CATALOGING AND RECOGNITION - Methods and apparatus for cataloging and recognizing gestures are disclosed. A gesture may be detected using sample motion data. An energy value and baseline value may be computed. The baseline value may be updated if the energy value is below a calm energy threshold. The sample motion data may be adjusted based on the updated baseline value. A local variance may be calculated over a number of samples. Sample motion data values may be recorded if the local variance exceeds a threshold. Sample motion data recording may stop if a local variance scalar value falls below a drop threshold. Input Gestures may be recognized by computing a total variance for sample values in an Input Gesture; calculating a figure of merit using sample values from the Input Gesture and one or more Catalog Gestures; and determining whether the Input Gesture matches a Catalog Gesture from the figure of merit.08-09-2012
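The capture stages described above — baseline tracking while the signal is calm, then variance-gated recording of baseline-adjusted samples — can be sketched as a single pass over 1-D motion samples. All threshold values, the window length, and the baseline smoothing factor below are illustrative assumptions; the abstract does not specify them.

```python
def record_gesture(samples, calm_threshold=0.5, start_threshold=2.0,
                   drop_threshold=1.0, window=4):
    """Record baseline-adjusted samples while a local variance over
    `window` samples exceeds the start threshold; stop recording when
    it falls below the drop threshold."""
    baseline = samples[0]
    recorded, recording, history = [], False, []
    for s in samples:
        energy = abs(s - baseline)
        if energy < calm_threshold and not recording:
            baseline = 0.9 * baseline + 0.1 * s   # slow baseline update
        adjusted = s - baseline
        history.append(adjusted)
        if len(history) > window:
            history.pop(0)
        mean = sum(history) / len(history)
        variance = sum((h - mean) ** 2 for h in history) / len(history)
        if not recording and variance > start_threshold:
            recording = True                      # gesture onset
        if recording:
            recorded.append(adjusted)
            if variance < drop_threshold:
                recording = False                 # gesture ended
    return recorded
```

The recorded sample run is what would later be compared against Catalog Gestures via the figure-of-merit computation the abstract mentions.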
20120169585ELECTRONIC SHELF LABEL AND METHOD OF DISPLAYING REMAINING BATTERY LIFE THEREOF - An electronic shelf label system is provided. In the electronic shelf label system according to the present invention, an electronic shelf label periodically transmits its remaining battery capacity to a server, and the server converts the battery level into a remaining battery life (time) and provides the same, so that a manager can check the remaining battery life in a management mode of the server and a terminal, and easily manage an electronic shelf label using a battery.07-05-2012
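The server-side conversion above — turning a reported remaining capacity into a remaining-life time for the management console — is a simple division when a constant average current draw is assumed. That constant-current model is this sketch's assumption; the patent does not specify its conversion formula.

```python
def remaining_life_hours(remaining_mah, avg_current_ma):
    """Estimate remaining battery life (hours) from the label's reported
    remaining capacity (mAh) under an assumed average current (mA)."""
    return remaining_mah / avg_current_ma

def format_remaining_life(remaining_mah, avg_current_ma):
    """Render the estimate as days/hours for display in the management
    mode of the server or terminal."""
    hours = remaining_life_hours(remaining_mah, avg_current_ma)
    days, rem = divmod(hours, 24)
    return "%dd %dh" % (int(days), int(rem))
```

For example, a label reporting 240 mAh at an assumed 2 mA average draw would show roughly five days of remaining life.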
20120169584AIR CONDITIONING APPARATUS AND A METHOD FOR CONTROLLING AN AIR CONDITIONING APPARATUS - An air conditioning apparatus and a method for controlling an air conditioning apparatus are provided. The method may include turning on an image capturing device provided in an indoor device; recognizing a gesture identifier moved in front of the image capturing device; inputting a gesture according to a movement of the gesture identifier; analyzing the input gesture; extracting operating condition(s) corresponding to the analyzed gesture; and driving the indoor device according to the extracted operating condition(s).07-05-2012
20120169583SCENE PROFILES FOR NON-TACTILE USER INTERFACES - A method, including capturing an image of a scene including one or more users in proximity to a display coupled to a computer executing a non-tactile interface, and processing the image to generate a profile of the one or more users. Content is then selected for presentation on the display responsively to the profile.07-05-2012
20120169582SYSTEM READY SWITCH FOR EYE TRACKING HUMAN MACHINE INTERACTION CONTROL SYSTEM - The invention relates to a system and method for activating a visual control interface, and in particular, for activating a visual control interface using an eye tracking system in a vehicle. A switch is used to activate and deactivate a control section of an eye tracking system in a human-machine interaction control system. The system allows a driver (or operator) of a vehicle to signal the system with selection of the switch to activate or deactivate the control section, thereby providing functional support to the driver when desired, but remaining inconspicuous otherwise.07-05-2012
20090322672INTERACTIVE DISPLAY - An interactive display (12-31-2009
20120169591DISPLAY DEVICE AND CONTROL METHOD THEREFOR - One embodiment provides a display device, including: a luminous flux generator which generates luminous flux including image information; a reflector plate which reflects the luminous flux generated by the luminous flux generator toward one eye of a viewer; a head detector which detects a head of the viewer by using at least two pairs of distance sensors; a controller which controls a position of the reflector plate based on an output from the head detector; and a driver which drives the reflector plate based on an output from the controller.07-05-2012
20080246723Integrated button activation sensing and proximity sensing - Apparatuses and methods for coupling a group of sensor elements together in one mode to collectively measure a capacitance on the group of sensor elements, in addition to individually measuring a capacitance on each of the sensor elements in another mode. The touch-sensor buttons may be used individually for button-activation sensing, and the touch-sensor buttons may be used collectively for proximity detection. The touch-sensor buttons and a ground conductor that surrounds the touch-sensor buttons may also be collectively used for proximity detection. The apparatus may include a processing device, and a plurality of sensor elements that are individually coupled in a first mode for button-activation sensing and collectively coupled in a second mode for proximity sensing.10-09-2008
20120200494COMPUTER VISION GESTURE BASED CONTROL OF A DEVICE - A system and method are provided for controlling a device based on computer vision. Embodiments of the system and method of the invention are based on receiving a sequence of images of a field of view; detecting movement of at least one object in the images; applying a shape recognition algorithm on the at least one moving object; confirming that the object is a user hand by combining information from at least two images of the object; and tracking the object to control the device.08-09-2012
20120200493CONTROLLING DEVICE WITH SELECTIVELY ILLUMINATED USER INTERFACES - A controlling device using a source of energy, such as light energy, to provide the controlling device with a user interface having multiple, different visual appearances.08-09-2012
20090153472CONTROLLING A VIEWING PARAMETER - The invention relates to a method (06-18-2009
20110254762MACHINE INTERFACES - Apparatus for determining the movement of an object comprises a plurality of ultrasonic transducers 10-20-2011
20110254760Wireless Motion Processing Sensor Systems Suitable for Mobile and Battery Operation - The present invention relates to a combination of a 6-axis motion sensor having a 3-axis gyroscope and a 3-axis linear accelerometer, a motion processor and a radio integrated circuit chip (IC), wherein the intelligence in the motion processor enables the communication between the motion sensor, the radio IC and the external network. The motion processor also enables power savings by adaptively controlling the data rate of the motion sensor, depending on the amount or speed of the motion activity.10-20-2011
20120200489INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, INFORMATION PROCESSING PROGRAM, AND INFORMATION PROCESSING SYSTEM - An information processing device includes a controller that decides information to be displayed on a second display section based on an attribute of an operation subject displayed on a first display section when the operation subject is operated, and a communication section that transmits control information for causing the second display section to display the information decided by the controller to a device having the second display section.08-09-2012
20080252596Display Using a Three-Dimensional Vision System - An interactive video display system allows a physical object to interact with a virtual object. A light source delivers a pattern of invisible light to a three-dimensional space occupied by the physical object. A camera detects invisible light scattered by the physical object. A computer system analyzes information generated by the camera, maps the position of the physical object in the three-dimensional space, and generates a responsive image that includes the virtual object. A display presents the responsive image.10-16-2008
20080252593Information Processing Apparatus, Method and Program - The present invention relates to an information processing apparatus, method and program for controlling highlighting on a screen on the basis of an operation on a touch panel overlaid on a display. A touch-panel operation determining unit 10-16-2008
20120200486Infrared gesture recognition device and method - A system for generating tracking coordinate information in response to movement of an information-indicating element includes an array (08-09-2012
20100321288POSITION INPUT DEVICE AND COMPUTER SYSTEM - A position input device is provided in which signals are transmitted from a position indicator, and signals transmitted from the position indicator are received by a position detector device. According to certain embodiments, an electrical double-layer capacitor, a charging circuit which charges the electrical double-layer capacitor, and a power transmission unit which relays and supplies to the charging circuit power supplied from a power supply unit external to the position indicator, are provided in the position indicator. In other embodiments the position input device has a built-in power supply unit, transmitting units, and a control unit for switching the transmitting units between energized and de-energized states. Also provided are position input systems and computer systems including the position input device, and methods of operating the position input device and the systems.12-23-2010
20100321287INTERFACE UNIT FOR GAME MACHINE AND GAME MACHINE - An interface unit 12-23-2010
20100321286MOTION SENSITIVE INPUT CONTROL - A motion sensitive input control configured to prevent unintended input caused by inadvertent movement of a computing device. In one embodiment, unintended input can be prevented by disregarding an input event if a change in motion of the computing device is detected simultaneously with or immediately prior to the detected input event. In another embodiment, unintended input can be prevented by reducing the sensitivity of an input device during a motion-based state associated with the computing device. In this manner, the likelihood of inadvertent motion of a computing device causing an unintended input event can be reduced.12-23-2010
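The first embodiment this entry describes, disregarding an input event when a motion change is detected simultaneously with or immediately prior to it, could be sketched as follows. The window length and motion threshold are assumed values, not taken from the application:

```python
# Illustrative sketch: reject an input event if a significant motion
# change occurred at the same time as, or within a short window before,
# the event. Window and threshold values are hypothetical.
MOTION_WINDOW_S = 0.3     # assumed "immediately prior" window, seconds
MOTION_THRESHOLD = 1.5    # assumed accelerometer-delta threshold

def accept_input(event_time: float,
                 motion_events: list[tuple[float, float]]) -> bool:
    """motion_events: (timestamp, motion-change magnitude) samples.
    Return False (disregard the input) if any motion change above the
    threshold falls in the window ending at event_time."""
    for t, magnitude in motion_events:
        if (magnitude >= MOTION_THRESHOLD
                and event_time - MOTION_WINDOW_S <= t <= event_time):
            return False
    return True
```

The second embodiment, reducing input-device sensitivity during a motion-based state, would instead scale the threshold rather than drop the event outright.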
20080204405Method and System of Displaying an Exposure Condition - There is provided a device which may easily and visually judge which chip in an FEM wafer has a normal exposure condition, or which chip has an abnormal exposure condition.08-28-2008
20080204407COMPUTER-READABLE STORAGE MEDIUM HAVING STORED THEREIN INFORMATION PROCESSING PROGRAM AND INFORMATION PROCESSING APPARATUS - A controller includes imaging means for capturing predetermined imaging targets and acceleration detecting means for detecting an acceleration applied to an input device. A game apparatus repeatedly calculates a target position which is related to images, included in a captured image captured by the imaging means, of the imaging targets and which is provided in the captured image, and then senses a predetermined motion (a motion in an upward direction) of the input device based on the target position calculated within a predetermined time period. Further, the game apparatus senses the predetermined motion of the input device based on the acceleration detected by the acceleration detecting means. Furthermore, the game apparatus executes a predetermined process (a process of causing a player character to perform a jump) when the predetermined motion has been sensed in at least either one of a first motion determining step and a second motion determining step.08-28-2008
20080204406COMPUTER-READABLE STORAGE MEDIUM HAVING STORED THEREIN INFORMATION PROCESSING PROGRAM AND INFORMATION PROCESSING APPARATUS - A controller includes imaging means for capturing predetermined imaging targets and acceleration detecting means for detecting an acceleration applied to an input device. Based on a tilt which is related to images, included in a captured image captured by the imaging means, of the imaging targets and which is included in the captured image, a game apparatus calculates, as a first tilt, a tilt of the controller which is related to a rotation around an axis of a capturing direction of the imaging means. Further, based on the acceleration detected by the acceleration detecting means, the game apparatus calculates, as a second tilt, a tilt which is related to a rotation around an axis of a direction different from the capturing direction. The game apparatus executes a predetermined process using the first tilt and the second tilt as an orientation of the controller.08-28-2008
20080204404Method of Controlling a Control Point Position on a Command Area and Method For Control of a Device - The invention describes a method of controlling a position (x′, y′) of a control point (c) on a command area (A08-28-2008
20110260961SERIES CONNECTED ELECTROCHROMIC DEVICES - An electrochromic device includes a first electrochromic region interconnected with a second electrochromic region by a plurality of conductive links disposed between sides of a substrate on which the material layers of the electrochromic device are formed. The plurality of conductive links interconnects a first isolated conductive region of the first electrochromic region with a first isolated conductive region of the second electrochromic region. A sequence of a counter electrode layer, an ion conductor layer and an electrochromic layer is sandwiched between the first conductive regions of the first and second electrochromic regions and respective second isolated conductive regions of the first and second electrochromic regions. The second conductive regions of the first and second electrochromic regions are connected to respective first and second bus bars which are for connection to a low voltage electrical source.10-27-2011
20110163947Rolling Gesture Detection Using a Multi-Dimensional Pointing Device - A system and a method for performing a rolling gesture using a multi-dimensional pointing device. An initiation of a gesture by a user of the multi-dimensional pointing device is detected. A rolling gesture metric corresponding to performance of a rolling gesture comprising rotation of the multi-dimensional pointing device about a longitudinal axis of the multi-dimensional pointing device is determined. Information corresponding to the rolling gesture metric is conveyed to a client computer system, wherein the client computer system is configured to manipulate an object in a user interface of the client computer system in accordance with the rolling gesture metric.07-07-2011
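A rolling-gesture metric of the kind this entry describes could be sketched as the integral of sampled roll rate about the device's longitudinal axis. The sample format and units are assumptions, not taken from the application:

```python
# Illustrative sketch: accumulate a rolling-gesture metric by integrating
# roll-rate samples (rotation about the longitudinal axis) over time.
# Units (rad/s, seconds) are assumed.
def rolling_gesture_metric(roll_rates: list[float], dt: float) -> float:
    """Integrate roll-rate samples at fixed time step dt to obtain the
    total rotation (radians) since the gesture was initiated."""
    return sum(rate * dt for rate in roll_rates)
```

The client would then map the accumulated rotation onto the manipulated UI object, for example as a proportional rotation of that object.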
20110163948METHOD SYSTEM AND SOFTWARE FOR PROVIDING IMAGE SENSOR BASED HUMAN MACHINE INTERFACING - Disclosed are a method, system, and associated modules and software components for providing image sensor based human machine interfacing (“IBHMI”). According to some embodiments of the present invention, output of an IBHMI may be converted into an output string or into a digital output command based on a first mapping table. An IBHMI mapping module may receive one or more outputs from an IBHMI and may reference a first mapping table when generating a string or command for a first application running on the same or another functionally associated computing platform. The mapping module may emulate a keyboard, a mouse, a joystick, a touchpad or any other interface device compatible, suitable or congruous with the computing platform on which the first application is running.07-07-2011
20110163946SIMULATION OF THREE-DIMENSIONAL TOUCH SENSATION USING HAPTICS - An apparatus includes a processing system, a display, and a plurality of haptic actuators. The display and the haptic actuators are coupled to the processing system. The processing system is configured to control the haptic actuators to simulate movement in a particular direction corresponding to movement in the particular direction in a visual depiction in the display.07-07-2011
20130169521DISPLAY DEVICE FOR PATTERN RECOGNITION INPUT - A display device includes: a substrate, a plurality of signal lines disposed on the substrate, at least one insulating layer disposed on the substrate, and a plurality of location references disposed on the substrate and in the same layer level as at least one of the signal lines, wherein arrangement of the plurality of location references varies depending on relative locations of the location references on a screen of the display device.07-04-2013
20130169522HARD KEY AND VEHICLE TERMINAL USING THE SAME - Disclosed is a vehicle terminal capable of using hard keys configured to be arranged and modified according to a user's taste. In particular, the spatial arrangement of the hard keys may be changed, as well as their appearance.07-04-2013
20110025600METHOD OF INDICATING ADDITIONAL CHARACTER COMBINATION CHOICES ON A HANDHELD ELECTRONIC DEVICE AND ASSOCIATED APPARATUS - A method and associated apparatus for indicating additional character combination choices from a disambiguation function on a handheld electronic device.02-03-2011
20100283725MANIPULATING DEVICE AND PORTABLE ELECTRONIC APPARATUS - A manipulating device suitable for connecting to a body of a portable electronic apparatus is provided, wherein the body has a first connecting unit. The manipulating device includes a manipulating body, a pivotal shaft, a fastener and a second connecting unit. The manipulating body has a button unit including a plurality of keys. The pivotal shaft connects the manipulating body. The fastener connects the pivotal shaft and is pivoted on the manipulating body through the pivotal shaft and is connected to the body. The second connecting unit is disposed at the fastener. When the fastener is connected to the portable electronic apparatus, the second connecting unit connects the first connecting unit.11-11-2010
20100283727SYSTEM AND METHOD FOR SHAPE DEFORMATION AND FORCE DISPLAY OF DEVICES - Various systems, devices, and methods for shape deformation of a haptic deformation display device are provided. For example, the haptic deformation display device may receive an input signal when the shape of the haptic deformation display device is in a first shape configuration. In response to the input signal, the haptic deformation display device may activate an actuator of the haptic deformation display device. The actuator may move a deformation component of the haptic deformation display device. The deformation component may at least partially define the shape of the haptic deformation display device, thereby causing the shape of the haptic deformation display device to deform into a second shape configuration different from the first shape configuration. The second shape configuration may be substantially maintained.11-11-2010
20100283726USER INTERFACES AND ASSOCIATED APPARATUS AND METHODS - Apparatus for a portable electronic device, the apparatus arranged to detect substantially simultaneous user input from device user interface elements arranged to detect opposing user input, and to provide signalling, upon the detection of said opposing user input, for transferring electronic information content associated with the electronic device to a remote apparatus which is associatable with the electronic device. Apparatus for a portable electronic device, the apparatus arranged to detect movement of the device, relative to a remote apparatus which is associatable with the electronic device, as user input from the device, and to provide signalling, upon the detection of said relative movement, for transferring electronic information content associated with the electronic device to the remote apparatus which is associatable with the electronic device.11-11-2010
20100283728OBJECT, METHOD AND SYSTEM FOR TRANSMITTING INFORMATION TO A USER - An object (11-11-2010
20100283724DAMPING DEVICE CAPABLE OF PROVIDING INCREASED STIFFNESS - Damping device to impose a reaction to the displacement of a manual operating device (11-11-2010
20110134024DISPLAY APPARATUS AND CONTROL METHOD THEREOF - A display apparatus and a control method thereof are disclosed. The display apparatus includes: a connection unit which is to be connected with at least one external device; a signal processing unit which processes an image signal; a display unit which displays an image on the basis of the image signal processed by the signal processing unit; a storage unit which stores therein a setting menu regarding a plurality of functions of the display apparatus; and a controller which controls the signal processing unit so that the setting menu regarding at least one of the plurality of functions of the display apparatus corresponding to the external device which is connected to the connection unit, is displayed through the display unit, and so that the image is displayed through the display unit according to user's setting through the setting menu.06-09-2011
20110134028COMMUNICATION TERMINAL DEVICE, COMMUNICATION METHOD, AND COMMUNICATION PROGRAM - A communication terminal device includes: a display for displaying image information; a communication device for transmitting and receiving information to and from another terminal via a network; an input device for entering command information and image information, and a processor configured to perform a first control for causing the display to show, based on input of first command information from the input device during display of a first image, a second image, and transmitting first information to the other terminal via the communication device and perform a second control for causing the display to show the first image based on input of second command information from the input device during display of the second image, and transmitting second information to the other terminal via the communication device.06-09-2011
20110134026IMAGE DISPLAY APPARATUS AND METHOD FOR OPERATING THE SAME - In accordance with an aspect of the present invention, a method for controlling an image display apparatus includes obtaining, by the image display apparatus, emotion information associated with a user of the image display apparatus, determining contents to be recommended for the user from among a plurality of different contents, based on the obtained emotion information associated with the user, and selectively or automatically reproducing one of the recommended contents at the image display apparatus.06-09-2011
20120206342SYSTEM AND METHOD FOR ACCESSING AND PRESENTING LIFE STORIES - A system to access information about a deceased individual includes a memory device and a mobile unit. The memory device contains a non-volatile computer readable medium to store the information on the deceased individual. The mobile unit interfaces with the memory device to access the information. The memory device may include a memory tube or a microchip. In some versions, the memory device may be mounted to a headstone or associated with the gravesite. The mobile unit may include a viewing screen to display information about the deceased individual via a user interface. In another version, a computer may host the information and an identifier is used by a mobile unit to access the information. The identifier may be associated with a burial site. In some versions, the identifier is an electronic identifier that communicatively couples to the computer.08-16-2012
20110084900HANDHELD WIRELESS DISPLAY DEVICE HAVING HIGH-RESOLUTION DISPLAY SUITABLE FOR USE AS A MOBILE INTERNET DEVICE - A handheld wireless display device, having at least SVGA-type resolution, includes a wireless interface, such as Bluetooth™, WiFi™, WiMAX™, cellular or satellite, to allow the device to utilize a number of different hosts, such as a cell phone, personal computer, or media player. The display may be monocular or binocular. Input mechanisms, such as switches, scroll wheels, touch pads, allow selection and navigation of menus, playing media files, setting volume and screen brightness/contrast, activating host remote controls or performing other commands. The device may include MIM diodes, Hall effect sensors, or other position transducers and/or accelerometers to detect lateral movements along and rotational gestures around the X, Y and Z axes as gesture inputs and movement cues. These commands may change pages, scroll up, down or across an enlarged screen image, such as for web browsing. An embedded software driver permits replicating a high-resolution screen display from a host PC.04-14-2011
20120200492Input Method Applied in Electronic Devices - An input method applicable for inputting into an electronic device, which includes the steps of capturing a lip motion of a person; receiving an image of the lip motion; encoding the lip motion image to obtain a lip motion code; comparing the lip motion code with a plurality of standard lip motion codes to obtain a first text result matching the lip motion code; and displaying the first text result on the electronic device if the first text result is obtained. If the first text result is not obtained, the method may further include activating an auxiliary analyzing mode for the electronic device for recognizing a facial expression, a hand gesture, or an audio signal to be inputted. The input method can diversify input methods for the electronic device.08-09-2012
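The comparison step this entry describes, matching a lip motion code against a table of standard codes, could be sketched as a nearest-match lookup. The encoding, the table contents, and the distance measure are all hypothetical:

```python
# Illustrative sketch: compare an encoded lip-motion sequence against a
# table of standard codes and return the text of the closest match, or
# None if nothing is close enough (which would trigger the auxiliary
# analyzing mode). Codes and entries are hypothetical.
STANDARD_CODES = {          # code -> text result (assumed entries)
    (1, 3, 2): "yes",
    (2, 2, 4): "no",
    (1, 1, 1): "stop",
}

def match_lip_code(code: tuple[int, ...], max_distance: int = 1):
    """Return the text whose standard code is nearest to `code` by a
    Hamming-style element-wise distance, or None if no code is within
    max_distance."""
    best_text, best_dist = None, max_distance + 1
    for std, text in STANDARD_CODES.items():
        if len(std) != len(code):
            continue
        dist = sum(a != b for a, b in zip(std, code))
        if dist < best_dist:
            best_text, best_dist = text, dist
    return best_text
```

A `None` result here corresponds to the abstract's fallback path, where facial expression, hand gesture, or audio recognition is activated instead.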
20120200488AR GLASSES WITH SENSOR AND USER ACTION BASED CONTROL OF EYEPIECE APPLICATIONS WITH FEEDBACK - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content, wherein the eyepiece includes sensor and user action based control of eyepiece applications with feedback.08-09-2012
20110090144SYSTEM DELAY MITIGATION IN INTERACTIVE SYSTEMS - A method sends a signal to render visual information on a display, and receives a user response to the rendered visual information. The user response includes a first delay. The method also queries an electronic system for data indicating a second delay. The second delay is a portion of the first delay and attributable to the electronic system. The method further uses the data indicating the second delay to compensate for electronic system delay during interactions with a user.04-21-2011
20090073112METHOD AND SYSTEM FOR DYNAMICALLY CONFIGURABLE TACTILE FEEDBACK FOR NAVIGATIONAL SUPPORT - A method for providing dynamically configurable tactile indicator signals on a control surface for navigational support, includes: detecting at least one of an operator's hands positioned on a control surface; routing control signals to a series of tactile sensors in proximity to the detected positions of at least one of the operator's hands; wherein the control signals actuate tactile feedback devices; wherein the control signals are based on navigational information; providing tactile feedback to the operator via the actuated tactile feedback devices; wherein the tactile feedback is dynamically configured in response to the number of operator hands detected on the control surface; and wherein the tactile feedback is dynamically configured in response to the position of at least one of the operator's hands on the control surface.03-19-2009
20110018796ELECTRONIC DEVICE - In a car stereo system, when an object such as a hand comes close to a sensing range of a proximity sensor provided near an illumination button while a display section and lights of an operation section that are provided on a front face of the car stereo system are in a state of complete non-lighting, only the light of the illumination button is turned on; if the illumination button is operated within the predetermined period of time, then the display section and the lights of the operation section are turned on; and if the illumination button is not operated within the predetermined period of time, then the light of the illumination button is turned off so that the state of complete non-lighting is entered again.01-27-2011
20080273009Information Reproducing Apparatus and Method, Dj Device, and Computer Program - An information reproducing apparatus (11-06-2008
20110163944INTUITIVE, GESTURE-BASED COMMUNICATIONS WITH PHYSICS METAPHORS - A user can make an intuitive, physical gesture with a first device, which can be detected by one or more onboard motion sensors. The detected motion triggers an animation having a “physics metaphor,” where the object appears to react to forces in a real world, physical environment. The first device detects the presence of a second device and a communication link is established allowing a transfer of data represented by the object to the second device. During the transfer, the first device can animate the object to simulate the object leaving the first device and the second device can animate the object to simulate the object entering the second device. In some implementations, in response to an intuitive gesture made on a touch sensitive surface of a first device or by physically moving the device, an object can be transferred or broadcast to other devices or a network resource based on a direction, velocity or speed of the gesture.07-07-2011
20110163945PORTABLE DEVICE FOR CONTROLLING INSTRUCTION EXECUTION BY MEANS OF ACTUATORS PLACED ON A REAR SURFACE - The invention relates to a device (D) for controlling the execution of instructions, that comprises: i) a receptacle (RP) that can be held by a user in at least one hand and has a rear surface (FAR) provided with rear actuators (AC); ii) a storing means (MS) capable of storing, for at least one application, a table of correspondence between at least one operation mode and a set of selected instructions associated with icons of relative positions defined according to a selected arrangement; and iii) control means (MC) for, in case an application operation mode is selected, determining the table corresponding thereto and associating the instructions contained in said table to actuation types of the rear actuators (AC) selected on the basis of the relative positions of icons respectively associated with these instructions, at least some of said icons being displayed on at least one screen (EC07-07-2011
20120146895ARRANGEMENT, METHOD AND COMPUTER PROGRAM FOR CONTROLLING A COMPUTER APPARATUS BASED ON EYE-TRACKING - A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request.06-14-2012
20120146892INFORMATION INPUT DEVICE AND INFORMATION INPUT METHOD - An information input device includes: an applying element having a ring shape so that a finger of a user is inserted into the applying element; a receiving element having a ring shape so that a finger of a user is inserted into the receiving element and disposed adjacent to the applying element in an extending direction of a center line of the applying element; a signal generating element generating and transmitting a waveform signal to the applying element so that the applying element outputs a measurement signal; and a signal extracting element extracting a signal relating to a posture of the finger from a reception signal, which is output from the receiving element based on the measurement signal received by the receiving element.06-14-2012
20120146890HAPTIC ROCKER BUTTON FOR VISUALLY IMPAIRED OPERATORS - In some embodiments, a device includes a converter, a haptic rocker button, a sensor, and a controller. In some embodiments, a method includes receiving data in a mobile device, converting the data into Braille content, presenting the Braille content on a haptic rocker button, and controlling a horizontal movement of the Braille content on the haptic rocker button.06-14-2012
20110134025INPUT DEVICE - A plurality of cylindrical-shaped or spherical-shaped magnets each including the N-pole and the S-pole formed at a predetermined angle interval are rotatably placed between an upper case and a lower case, and a plurality of magnetic detection elements are disposed opposite to the magnets at a predetermined gap. When a plurality of small magnets having a diameter of about 2 to 3 mm are rotated by the finger, the operation direction and the operation amount of the finger can be detected from the rotation direction and the rotation angle of the magnets. Therefore, it is possible to obtain a thin, low-profile input device capable of reliable operation.06-09-2011
20110043445Handheld electronic device and method of controlling the handheld electronic device according to state thereof in a three-dimensional space - A method of controlling a handheld electronic device according to state thereof in a three-dimensional space, which is applicable to the handheld electronic device including a CPU, a displacement sensor, and a quadrant section lookup table. The displacement sensor detects variation of the state of the handheld electronic device in a three-dimensional space and generates a current state signal accordingly. The lookup table lists a plurality of quadrant sections of the three-dimensional space, wherein each quadrant section corresponds to a function program or control command. The CPU receives the current state signal and calculates a current space signal accordingly. After determining the quadrant section in the lookup table that corresponds to the current space signal, the CPU activates the function program or control command that corresponds to the quadrant section. Thus, a user can execute the desired function programs or control commands conveniently by moving the handheld electronic device single-handedly.02-24-2011
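The quadrant-section lookup this entry describes, reducing the device's current space signal to a region of 3D space and looking up the associated function or command, could be sketched as follows. The octant encoding and the table entries are hypothetical:

```python
# Illustrative sketch: map the device's current 3D state to an octant of
# space (the sign of each axis) and look up the function program or
# control command assigned to that quadrant section. Entries are
# hypothetical, not from the application.
QUADRANT_TABLE = {  # (x>=0, y>=0, z>=0) -> command (assumed mapping)
    (True, True, True): "open_browser",
    (False, True, True): "mute",
    (True, False, True): "volume_up",
    (True, True, False): "volume_down",
}

def command_for_state(x: float, y: float, z: float):
    """Reduce the current space signal to its octant and return the
    assigned command, or None if the octant is unassigned."""
    return QUADRANT_TABLE.get((x >= 0, y >= 0, z >= 0))
```

In the described device, the CPU would compute (x, y, z) from the displacement sensor's current state signal before performing this lookup.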
20110260963SYMBOLIC INPUT VIA MID-AIR FINGER/THUMB MOTIONS - An information system includes primary inductance coils driven by an energy source, secondary inductance coils, a processor, a display device, and a lookup table. The processor determines an inductance value generated when coils are brought into proximal interaction with each other. The processor extracts associated alphanumeric or other symbolic information from the lookup table, and transmits the symbolic information to the display device. A circuit is also provided that generates the symbolic information for presentation via the display device, and includes an electrically-driven inductance coil positionable on a thumb of a user, passively-driven inductance coils positionable on the various phalanges of the user's fingers, the processor, and the lookup table. A method for generating and recording symbolic information includes determining the inductance value, associating the inductance value with corresponding symbolic information in the lookup table, and transmitting the symbolic information to a display device.10-27-2011
20110260966Fingerprint reader device and electronic apparatus - The present invention relates to a slide-type fingerprint reader device in which a finger is slid on the fingerprint sensor. A positioning structure, which causes a first joint portion of the finger to be positioned at the center point of the fingerprint sensor, includes a sensor movement mechanism that keeps the fingerprint sensor at a position bulging from a slide surface on which the finger is slid when the fingerprint sensor is not pressed down, and that allows the fingerprint sensor to move upon being pressed down.10-27-2011
20110260965APPARATUS AND METHOD OF USER INTERFACE FOR MANIPULATING MULTIMEDIA CONTENTS IN VEHICLE - Disclosed are an apparatus and a method of a user interface for manipulating multimedia contents for a vehicle. An apparatus of a user interface for manipulating multimedia contents for a vehicle according to an embodiment of the present invention includes: a transparent display module displaying an image including one or more multimedia objects; an ultrasonic detection module detecting a user indicating means by using an ultrasonic sensor in a 3D space close to the transparent display module; an image detection module tracking and photographing the user indicating means; and a head unit judging whether or not any one of the multimedia objects is selected by the user indicating means by using information received from at least one of the image detection module and the ultrasonic detection module and performing a control corresponding to the selected multimedia object.10-27-2011
20110260964METHOD AND APPARATUS FOR CONTROLLING A DISPLAY TO GENERATE NOTIFICATIONS - The present specification provides a method and apparatus for controlling a display based on signals received from one or more input devices. In one implementation, a mobile device with a touch screen and a touch pad is provided. A notification module executable on the mobile device configures the processor of the mobile device to control the display to generate a notification bar and a content region. The notification bar contains an icon representing each application from which a notification has been generated and a number adjacent the icon for indicating how many notifications have been generated by the application. The content region includes data associated with the notifications, and is arranged in rows beneath a header identifying the application. The layout of the applications may be varied as well as the priority of ordering the applications in the content region.10-27-2011
20110109538ENVIRONMENT SENSITIVE DISPLAY TAGS - This is directed to dynamic tags or screen savers for display on an electronic device. The tags can include several dynamic elements that move across the display. The particular characteristics of the elements can be controlled in part by the output of one or more sensors detecting the environment of the device. For example, the color scheme used for a tag can be selected based on the colors of an image captured by a camera, and the orientation of the movement can be selected from the output of a motion sensing component. The tag can adjust automatically based on the sensor outputs to provide an aesthetically pleasing display that a user can use as a fashion accessory.05-12-2011
20110175804EVENT GENERATION BASED ON PRINT PORTION IDENTIFICATION - An optical scanner is configured to scan multiple print portions of a body part such as a finger. The optical scanner identifies a first one of the print portions in an area of an optical surface. An event such as launching an application is generated based on identifying the first print portion in the area of the optical surface. In addition, various events can be generated based on different combinations of print portions in different areas of the optical surface.07-21-2011
20100214211HANDHELD ELECTRONIC DEVICE HAVING GESTURE-BASED CONTROL AND A METHOD OF USING SAME - The present disclosure describes a handheld electronic device having a gesture-based control and a method of using the same. In one embodiment, there is provided a method of controlling a handheld electronic device, comprising: receiving a motion signal as input from a motion detection subsystem in response to a movement of the device; determining from the motion signal a cadence parameter associated with the movement of the electronic device; determining whether the cadence parameter is greater than or equal to a cadence reference level; performing a first command when the cadence parameter is greater than or equal to the cadence reference level; and performing a second command when the cadence parameter is less than the cadence reference level.08-26-2010
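The cadence-threshold dispatch in the abstract above reduces to a single comparison, which can be sketched as below. The function and command names are illustrative assumptions; a real implementation would first derive the cadence parameter from the motion signal.

```python
def dispatch_by_cadence(cadence, reference=2.0):
    """Select a command from a movement's cadence parameter:
    at or above the reference level, the first command fires;
    below it, the second command fires. The reference value
    here is an arbitrary placeholder."""
    return "first_command" if cadence >= reference else "second_command"
```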
20080218473Enhanced Artificial Intelligence Language - A method of determining an appropriate response to an input includes linking a plurality of attributes to a plurality of response templates using a plurality of Boolean expressions. Each attribute is associated with a set of patterns. Each pattern within the set of patterns is equivalent. The method also includes determining an appropriate response template from the plurality of response templates based on the input.09-11-2008
20080218472INTERFACE TO CONVERT MENTAL STATES AND FACIAL EXPRESSIONS TO APPLICATION INPUT - A method of interacting with an application includes receiving, in a processor, data generated based on signals from one or more bio-signal detectors on a user, the data representing a mental state or facial expression of the user, generating an input event based on the data representing the mental state or facial expression of the user, and passing the input event to an application.09-11-2008
20100066668MECHANISM FOR DISPLAYING PAGINATED CONTENT ON ELECTRONIC DISPLAY DEVICES - A computing device is provided that includes a display comprising a plurality of discrete elements. A memory is used to store a data collection of paginated content. A processor of the computing device is configured to retrieve each of the pages from the memory. The processor signals the display to individually present each of the pages. A sensor device is coupled to the processor. The sensor device is deflectable to signal the processor a deflection value that causes the processor to sequentially present at least portions of multiple pages on the display.03-18-2010
20100066667ORIENTING A DISPLAYED ELEMENT RELATIVE TO A USER - An element is initially displayed on an interactive touch-screen display device with an initial orientation relative to the interactive touch-screen display device. One or more images of a user of the interactive touch-screen display device are captured. The user is determined to be interacting with the element displayed on the interactive touch-screen display device. In addition, an orientation of the user relative to the interactive touch-screen display device is determined based on at least one captured image of the user of the interactive touch-screen display device. Thereafter, in response to determining that the user is interacting with the displayed element, the initial orientation of the displayed element relative to the interactive touch-screen display device is automatically adjusted based on the determined orientation of the user relative to the interactive touch-screen display device.03-18-2010
20100066664WRIST-WORN INPUT APPARATUS AND METHOD - Provided are a wrist-worn input apparatus and method. The apparatus and method can segment meaningful hand gestures using a vibration sound generated by a finger tapping and can issue commands corresponding to the recognized hand gestures, enabling the user to easily input commands to electronic devices.03-18-2010
20100194684METHODS AND SYSTEMS FOR PROVIDING PROGRAMMABLE COMPUTERIZED INTERACTORS - A computerized interactor system uses physical, three-dimensional objects as metaphors for input of user intent to a computer system. When one or more interactors are engaged with a detection field, the detection field reads an identifier associated with the object and communicates the identifier to a computer system. The computer system determines the meaning of the interactor based upon its identifier and upon a semantic context in which the computer system is operating.08-05-2010
20100194682METHOD FOR TAP DETECTION AND FOR INTERACTING WITH A HANDHELD ELECTRONIC DEVICE, AND A HANDHELD ELECTRONIC DEVICE CONFIGURED THEREFOR - A method for tap detection and for interacting with a handheld electronic device, and a handheld electronic device configured therefor are described. In accordance with one embodiment, there is provided a method for tap detection on a handheld electronic device, comprising: measuring acceleration using an accelerometer of the handheld electronic device; determining when measured acceleration exceeds an upper limit threshold and a lower limit threshold within a predetermined duration of each other; when the upper limit threshold and lower limit threshold have been exceeded, determining a rate of change of acceleration between the upper limit threshold and lower limit threshold and registering a tap input when the rate of change of acceleration exceeds a predetermined tap threshold.08-05-2010
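The tap-detection steps in the abstract above can be sketched as follows. All threshold values, the sample format, and the function name are illustrative assumptions; the patent does not specify concrete numbers.

```python
def detect_tap(samples, upper=12.0, lower=-12.0, max_gap=5, jerk_threshold=8.0):
    """Register a tap when acceleration crosses both an upper and a lower
    limit threshold within `max_gap` sample ticks of each other, and the
    rate of change of acceleration (jerk) between the two crossings
    exceeds `jerk_threshold`.

    `samples` is a list of (time_index, acceleration) pairs.
    """
    # Keep only samples that cross either limit threshold.
    crossings = [(t, a) for t, a in samples if a >= upper or a <= lower]
    for (t1, a1), (t2, a2) in zip(crossings, crossings[1:]):
        # Require one upper-limit and one lower-limit crossing close together.
        if (a1 >= upper) != (a2 >= upper) and (t2 - t1) <= max_gap:
            jerk = abs(a2 - a1) / (t2 - t1)
            if jerk > jerk_threshold:
                return True
    return False
```

A sharp tap produces a large positive spike immediately followed by a large negative one (or vice versa), which is exactly the close, opposite-sign pair of threshold crossings this sketch looks for; slow tilting crosses at most one threshold at a time and is rejected.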
20100194680INFORMATION PROCESSING APPARATUS - According to an aspect of the invention, an information processing apparatus includes: a main body comprising a top face; a display module configured to be connected to the main body and configured to be rotatable between a closed position and an open position, the display module covering the top face at the closed position and exposing the top face at the open position; a wireless communication module in the main body; a plurality of input devices in the main body; a processing module configured to perform a process corresponding to a manipulation signal that is sent from the input devices to the processing module; an input device setting module configured to select one of the input devices as a selected input device; and an input manipulation control module configured to control the processing module so as to prevent the processing module from performing a process corresponding to a manipulation signal received by the selected input device while the wireless communication module performs a wireless communication.08-05-2010
20100194681MANIPULATION DEVICE FOR NAVIGATING VIRTUAL MICROSCOPY SLIDES/DIGITAL IMAGES AND METHODS RELATED THERETO - Featured is a manipulation device with the ability to navigate virtual microscopy slides. The device includes an inverted light emitting diode (LED) reflecting light off a textured slide to a complementary metal oxide semiconductor (CMOS) sensor that indicates the movement of the slide. The slide is freely moved by hand or by a traditional X-Y mechanical stage on a raised platform akin to a slide stage. Finger touch controls are provided to zoom to higher or lower power images. The device plugs into a standard computer system by USB port, running software to image a virtual microscope slide. Also featured are systems and methods related thereto.08-05-2010
20100194677MAPPING OF PHYSICAL CONTROLS FOR SURFACE COMPUTING - Physical controls on a physical controller device (PCD) are dynamically mapped to application controls for an application being executed on a computer having a touch-sensitive display surface. The computer identifies a PCD which has been placed by a user on the display surface and displays a mapping aura for the PCD. When the user touches an activate direct-touch button displayed within the mapping aura, the computer activates a mapping procedure for the PCD and displays a highlighted direct-touch button over each application control which is available to be mapped to the physical controls on the PCD. When the user selects a particular application control which is available to be mapped by touching the highlighted button residing over the control, the computer creates a dynamic mapping between the selected application control and a user-selected physical control on the PCD.08-05-2010
20100194679GESTURE RECOGNITION SYSTEM AND METHOD THEREOF - A gesture recognition system includes an image pick-up device, a processor, an operation engine, an optimal template selection means, and a display terminal. The image pick-up device is for capturing an image containing a natural gesture. The processor is for finding out a skin edge of a skin part from the image, and then classifying the skin edge into multiple edge parts at different angles. The operation engine has multiple parallel operation units and multiple gesture template libraries of different angle classes. These parallel operation units respectively find out gesture templates most resembling the edge parts in the gesture template libraries of different angle classes. The optimal template selection means selects an optimal gesture template from the resembling gesture templates found out by the parallel operation units. The display terminal is for displaying an image of the optimal gesture template. Thereby, marker-less and real-time gesture recognition is achieved.08-05-2010
20100194676KVM switch and computer readable medium - A KVM switch that is connected between servers, and at least one set of a keyboard, a mouse and a monitor, comprising: an acquiring portion that acquires information showing a screen resolution to which the monitor is capable of adapting, from the monitor; an analysis portion that analyzes a screen resolution of a video signal output from a corresponding server, based on a horizontal synchronizing signal and a vertical synchronizing signal received from each of the servers; a determination portion that determines whether the analyzed screen resolution exceeds the screen resolution shown by the acquired information; a conversion portion that, when the analyzed screen resolution exceeds the screen resolution shown by the acquired information, converts the analyzed screen resolution into the screen resolution shown by the acquired information; and an output portion that outputs the video signal having the converted screen resolution to the monitor.08-05-2010
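The determination and conversion steps in the KVM abstract above amount to a capped-resolution rule, sketched below. Resolutions are modeled as (width, height) tuples; treating "exceeds" as either dimension being larger is an assumption, since the abstract does not define the comparison.

```python
def select_output_resolution(analyzed, supported_max):
    """Pass the analyzed resolution through unchanged unless it exceeds
    the resolution the monitor reports it can adapt to, in which case
    convert down to the monitor's resolution.

    `analyzed` and `supported_max` are (width, height) tuples.
    """
    aw, ah = analyzed
    sw, sh = supported_max
    if aw > sw or ah > sh:
        return supported_max
    return analyzed
```

For example, a server outputting 1920x1080 to a monitor reporting 1280x1024 would be converted down, while a 1024x768 signal would pass through untouched.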
20090284465CAPACITIVE MOTION DETECTION DEVICE AND INPUT DEVICE USING THE SAME - The mode is switched to a motion detection mode by pressing a changeover switch. In this mode, motion detection is performed by moving a hand in an area to be operated. To switch from the motion detection mode back to a normal mode, the hand is moved away from the area to be operated, or the changeover switch is pressed again. Moreover, when the hand is distant from a capacitive sensor, it is determined that a motion input operation is being performed. When the hand is close to the capacitive sensor, it is determined that no motion input operation is being performed, and thus the motion detection mode is changed.11-19-2009
20110187637Tactile Input Apparatus - A tactile input apparatus comprising ring-shaped elements configured to conform to a user's fingers, tactile capacitance sensors operably affixed to periphery of the ring-shaped elements, and a control unit, is provided. Each tactile capacitance sensor is positioned parallel to the underside of each finger and reads a change in capacitance on contacting the user's thumb or palm. The control unit, in wired or wireless electronic communication with each tactile capacitance sensor and with the user's computing device, continuously transmits capacitance readings multiple times per second to a software on the computing device. The software monitors and processes the capacitance readings from the control unit and controls output to the computing device. The software determines logic and then enacts single or multiple custom outputs. The positioning of the tactile capacitance sensors on the periphery of the ring-shaped elements prevents confinement of the user's fingers and allows mobility of the user's fingers.08-04-2011
20100259474ENHANCED HANDHELD SCREEN-SENSING POINTER - Enhanced handheld screen-sensing pointing, in which a handheld device captures a camera image of one or more fiducials rendered by a display device, and a position or an angle of the one or more fiducials in the captured camera image is determined. A position on the display device that the handheld device is aimed towards is determined based at least on the determined position or angle of the one or more fiducials in the camera image, and an application is controlled based on the determined position on the display device.10-14-2010
20100177035Mobile Computing Device With A Virtual Keyboard - The subject matter disclosed herein provides methods and apparatus, including computer program products, for mobile computing. In one aspect there is provided a system. The system may include a processor configured to generate at least one image including a virtual keyboard and a display configured to project the at least one image received from the processor. The at least one image of the virtual keyboard may include an indication representative of a finger selecting a key of the virtual keyboard. Related systems, apparatus, methods, and/or articles are also described.07-15-2010
20110148754PROJECTION APPARATUS, DISPLAY APPARATUS, INFORMATION PROCESSING APPARATUS, PROJECTION SYSTEM AND DISPLAY SYSTEM - A communication terminal is allowed to perform data communication through simple processing. A communication unit is allowed to make data communication with a communication terminal when the detected position of the communication terminal remains unchanged for a predetermined time period or longer, within a range in which an image is projected.06-23-2011
20110148752Mobile Device with User Interaction Capability and Method of Operating Same - In one embodiment a method of operating a mobile device includes sensing either an orientation or a movement of the mobile device, determining a command based on the sensed orientation or sensed movement, sensing a proximity of an object in relation to at least a portion of the mobile device, and executing the command upon the proximity of the object being sensed. In another embodiment, a method of operating a mobile device governs a manner of interaction of the mobile device relative to one or more other mobile devices. In at least some embodiments, at least one of the mobile devices includes an accelerometer and an infrared proximity sensor, and operation of the mobile device is determined based upon signals from those components.06-23-2011
20110148755USER INTERFACE APPARATUS AND USER INTERFACING METHOD BASED ON WEARABLE COMPUTING ENVIRONMENT - Provided is a user interface apparatus based on wearable computing environment, which is worn on a user, including: a sensor unit including at least one sensor worn on the user and outputting a plurality of sensing signals according to a positional change of a user's arm or a motion of a user's finger; and a signal processing unit outputting a user command corresponding to the 3D coordinates of the user's arm and the motion of the user's finger from the plurality of sensing signals output from the sensor unit, and controlling an application program running in a target apparatus using the user command.06-23-2011
20110187640Wireless Hands-Free Computing Headset With Detachable Accessories Controllable by Motion, Body Gesture and/or Vocal Commands - A remote control microdisplay device that uses hand movement, body gesture, head movement, head position and/or vocal commands to control the headset, a peripheral device, a remote system, network or software application, such as to control the parameters of a field of view for the microdisplay within a larger virtual display area associated with a host application, a peripheral device or host system. The movement and/or vocal commands are detected via the headset and/or detachable peripheral device connected to the headset microdisplay device via one or more peripheral ports.08-04-2011
20110187638Interactive module applied in 3D interactive system and method - An interactive module applied in a 3D interactive system calibrates a location of an interactive component or calibrates a location and an interactive condition of a virtual object in a 3D image, according to a location of a user. In this way, even if the location of the user changes so that the location of the virtual object seen by the user changes as well, the 3D interactive system can still correctly decide an interactive result according to the corrected location of the interactive component, or according to the corrected location and corrected interactive condition of the virtual object.08-04-2011
20110187639DUAL-MODE INPUT DEVICE OPERABLE IN A SELECTED ONE OF A RELATIVE COORDINATE MODE AND AN ABSOLUTE COORDINATE MODE - A dual-mode input device includes a relative coordinate generator disposed in a casing for detecting motion of the casing and for generating relative coordinate information based on detected motion of the casing, and a processing unit. The processing unit includes a coordinate storing module for storing absolute coordinate information, an absolute coordinate generator for generating updated absolute coordinate information based on the relative coordinate information received from the relative coordinate generator and the absolute coordinate information received from the coordinate storing module, and for storing the updated absolute coordinate information in the coordinate storing module, and an output selecting module operable in one of a relative coordinate mode, in which the output selecting module outputs the relative coordinate information, and an absolute coordinate mode, in which the output selecting module outputs the absolute coordinate information.08-04-2011
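The dual-mode scheme in the abstract above — integrating relative deltas into stored absolute coordinates, then outputting one or the other — can be sketched as below. The class and method names are illustrative assumptions, not from the patent.

```python
class DualModeInput:
    """Sketch of a dual-mode input device: relative deltas from a motion
    sensor are accumulated into a stored absolute position (the
    coordinate-storing module), and an output-selecting step reports
    either the delta or the running absolute position."""

    def __init__(self, origin=(0, 0)):
        self.absolute = origin      # stored absolute coordinate information
        self.mode = "relative"      # "relative" or "absolute"

    def move(self, dx, dy):
        # Absolute-coordinate generator: update stored position from delta.
        x, y = self.absolute
        self.absolute = (x + dx, y + dy)
        # Output-selecting module: report according to the current mode.
        return (dx, dy) if self.mode == "relative" else self.absolute
```

Note that the absolute position keeps updating even while the device reports relative coordinates, so switching modes later still yields a consistent position.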
20100026625CHARACTER INPUT DEVICE - A character input device is disclosed. In one embodiment, the character input device includes an input unit, press detection units, movement detection units, and a control unit. The input unit is provided as a single body such that first directional input, which is performed by pressing one of first direction indication locations arranged radially from a reference location and spaced apart from one another, and second directional input, which is performed through movement from each of the first direction indication locations to one of second direction indication locations arranged radially around the first direction indication location, can be performed. The press detection units detect the first directional input. The movement detection units detect the second directional input. The control unit extracts a character code, assigned to each selected one of the direction indication locations, from a memory unit based on results of the detection.02-04-2010
20110215995INFORMATION PROCESSING DEVICE, METHOD AND PROGRAM - An information processing device includes a capture section capturing an image of an object, an acquisition section acquiring the image captured by the capture section, a calculation section calculating vibration information on the basis of the image acquired by the acquisition section, a determination section determining a vibration command on the basis of the vibration information calculated by the calculation section, and a control section executing predetermined processing on the basis of the vibration command determined by the determination section.09-08-2011
20100026624INTERACTIVE DIRECTED LIGHT/SOUND SYSTEM - An interactive directed beam system is provided. In one implementation, the system includes a projector, a computer and a camera. The camera is configured to view and capture information in an interactive area. The captured information may take various forms, such as an image and/or audio data. The captured information is based on actions taken by an object, such as a person within the interactive area. Such actions include, for example, natural movements of the person and interactions between the person and an image projected by the projector. The captured information from the camera is then sent to the computer for processing. The computer performs one or more processes to extract certain information, such as the relative location of the person within the interactive area for use in controlling the projector. Based on the results generated by the processes, the computer directs the projector to adjust the projected image accordingly. The projected image can move anywhere within the confines of the interactive area.02-04-2010
20090174654COMPUTERIZED INTERACTOR SYSTEMS AND METHODS FOR PROVIDING SAME - A computerized interactor system uses physical, three-dimensional objects as metaphors for input of user intent to a computer system. When one or more interactors are engaged with a detection field, the detection field reads an identifier associated with the object and communicates the identifier to a computer system. The computer system determines the meaning of the interactor based upon its identifier and upon a semantic context in which the computer system is operating. The interactors can be used to control other systems, such as audio systems, or they can be used as intuitive inputs into a computer system for such purposes as marking events in a temporal flow. The interactors, as a minimum, communicate their identity, but may also be more sophisticated in that they can communicate additional processed or unprocessed data, i.e. they can include their own data processors. The detection field can be one-dimensional or multi-dimensional, and typically has different semantic meanings associated with different parts of the detection field.07-09-2009
20090174653METHOD FOR PROVIDING AREA OF IMAGE DISPLAYED ON DISPLAY APPARATUS IN GUI FORM USING ELECTRONIC APPARATUS, AND ELECTRONIC APPARATUS APPLYING THE SAME - An electronic apparatus to provide an area of an image displayed on a display apparatus in a GUI form. The electronic apparatus transfers a user command related to an external apparatus to the external apparatus, and displays an area of an image displayed on the external apparatus on a display. Therefore, it is possible to display an area of an image displayed on a display apparatus in the GUI form using another display apparatus so that the user may select a desired GUI item more conveniently and more intuitively.07-09-2009
20090174652Information processing system, entertainment system, and information processing system input accepting method - A technique is provided related to an input interface wherein entertainment is enhanced by an information processing system that includes means for producing a computer image that prompts a player to virtually touch a plurality of touch points; means for accepting input of a video image of the player captured by image pickup means; display control means for causing a display device to display and superimpose the video image and the computer image on each other; means for analyzing the video image during display of the computer image to detect virtual touches of any of the plurality of touch points; and means for executing predetermined processing when the detecting means detects virtual touches that are performed on a predetermined number of touch points in a predetermined order.07-09-2009
20120146897THREE-DIMENSIONAL DISPLAY - A light ray controller has a circular frustum shape and is fitted in a circular hole of a top board such that its large diameter bottom opening faces upward. The light ray controller transmits a light ray while diffusing the light ray in a ridgeline direction and transmits the light ray straightforward without diffusing the light ray in a circumferential direction. A rotation module is provided under the table. One or more scanning projectors are provided on a circumference around an axis of the light ray controller on the rotation base of the rotation module. One or more scanning projectors are rotated by the rotation module. The controller controls one or more scanning projectors being rotated based on three-dimensional shape data stored in a storage.06-14-2012
20100321289MOBILE DEVICE HAVING PROXIMITY SENSOR AND GESTURE BASED USER INTERFACE METHOD THEREOF - A mobile device has a proximity sensor and a user interface based on a user's gesture detected using the proximity sensor. The gesture-based user interface method includes enabling proximity sensing through the proximity sensor, detecting a specific gesture through the proximity sensing, analyzing a pattern of the specific gesture, and executing a particular function assigned to the pattern.12-23-2010
20090284462COMPUTER WITH VIRTUAL KEYBOARD AND DISPLAY - A computer includes a host (11-19-2009
20090309829SIMPLE MULTIDIRECTIONAL KEY FOR CURSOR CONTROL - A multidirectional key is operative to press a specific one of multiple contacts, depending on the direction in which the key's button disc is being moved. The fingers are projections extending from a ring, the combination being made out of sheet metal, and bent out of the plane of the disc to engage with the button when the latter is moved. As a result, the multidirectional key has a very simple configuration and is inexpensive to manufacture.12-17-2009
20090295715MOBILE COMMUNICATION TERMINAL HAVING PROXIMITY SENSOR AND DISPLAY CONTROLLING METHOD THEREIN - A mobile communication terminal having a function of detecting a proximity touch and a display controlling method therein are disclosed. The present invention includes a touchscreen configured to display prescribed data, the touchscreen detecting a real touch or a proximity touch to a surface contact point, a proximity sensor outputting a proximity signal corresponding to a proximity position of a proximate object, and a controller controlling an implementation of an operation associated with the prescribed data displayed on the touchscreen according to the proximity signal detected by the proximity sensor.12-03-2009
20120146894MIXED REALITY DISPLAY PLATFORM FOR PRESENTING AUGMENTED 3D STEREO IMAGE AND OPERATION METHOD THEREOF - Various 3D image display devices divide and share a physical space for expressing a 3D image, and real-time contents information is generated based on user information and information on the divided space and displayed together using various 3D image display devices to present a 3D image naturally in a deeper, wider, and higher space. A mixed reality display platform includes an input/output controller controlling display devices including 3D display devices, an advance information manager establishing 3D expression space for each display device to divide or share a physical space by collecting spatial establishment of the display device, and a real-time information controller generating real-time contents information using user information and 3D contents for a virtual space. The input/output controller distributes the real-time contents information to each display device based on the 3D expression spatial information.06-14-2012
20120146891ADAPTIVE DISPLAYS USING GAZE TRACKING - Methods and systems for adapting a display screen output based on a display user's attention. Gaze direction tracking is employed to determine a sub-region of a display screen area to which a user is attending. Display of the attended sub-region is modified relative to the remainder of the display screen, for example, by changing the quantity of data representing an object displayed within the attended sub-region relative to an object displayed in an unattended sub-region of the display screen.06-14-2012
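The gaze-adaptive rendering in the abstract above can be sketched as assigning a detail factor per screen sub-region based on where the tracked gaze falls. The rectangle format, function name, and detail factors are illustrative assumptions.

```python
def detail_per_region(regions, gaze_point, high=1.0, low=0.25):
    """Assign a rendering-detail factor to each screen sub-region:
    the region containing the gaze point gets full detail, and the
    unattended regions get reduced detail (less data per object).

    `regions` is a list of (x, y, w, h) rectangles;
    `gaze_point` is the (x, y) position the user is attending to.
    """
    gx, gy = gaze_point

    def contains(region):
        x, y, w, h = region
        return x <= gx < x + w and y <= gy < y + h

    return [high if contains(r) else low for r in regions]
```

A renderer could multiply each object's geometry or texture budget by its region's factor, concentrating quality where the user is actually looking.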
20120306743DYNAMIC THEME COLOR PALETTE GENERATION - There is provided a method of changing a theme for a user interface of a computer system comprising receiving an identification of an image with which to define a color palette of a theme for rendering elements of a user interface on a color display of the computer system; analysing the image to determine at least one predominant color; and defining the color palette in response to the analysis. The image may comprise a background image selected by a user for display by the computer system. Dynamic generation of the color palette matches the user interface to colors to provide flexible and appealing themes. A computer readable memory having recorded thereon instructions to carry out this method is also provided, as well as a device comprising such memory.12-06-2012
20120306742ELECTRONIC DEVICE, IMAGE ACQUISITION EQUIPMENT AND IMAGE ACQUISITION CONTROL METHOD - Embodiments of the present invention provide an electronic device, image acquisition equipment, and an image acquisition control method. The electronic device includes a main board, and image acquisition equipment and a processor which are connected with the main board, wherein the image acquisition equipment has a first mode and a second mode, the image acquisition equipment including: a first imaging unit array for image acquisition in the first mode; and a second imaging unit array for image acquisition in the second mode; the first imaging unit array and the second imaging unit array are different. An image acquisition control module is set in the processor and is used for controlling the image acquisition equipment to switch to the first mode or the second mode according to a mode switching instruction. The embodiments of the present invention can realize optimization of all applications, and provide optimal images for every application with lower computational complexity, lower power consumption caused by computation, and a higher processing speed.12-06-2012
201103045302D/3D IMAGE SWITCHING DISPLAY DEVICE - A 2D/3D image switching display device includes an image display unit and an image switching unit coupled to the image display unit. The image switching unit includes first and second transparent substrates and first and second transparent conducting elements installed on the first and second transparent substrates respectively. An electrochromic layer and an electrolytic layer are formed on the first and second transparent substrates sequentially. The electrochromic layer produces a color change according to the switching status of the image display unit. After a stereo image divided into left and right eye images is received by naked eyes, no moire pattern will be produced, so that no additional light shielding device using a parallax barrier is required for displaying stereo images, and the 2D/3D image switching display device can change a light-shielding angle for adjusting a stereo image display according to the viewing angle.12-15-2011
20120306741System and Method for Enhancing Locative Response Abilities of Autonomous and Semi-Autonomous Agents - A computer system and method according to the present invention can receive multi-modal inputs such as natural language, gesture, text, sketch and other inputs in order to simplify and improve locative question answering in virtual worlds, among other tasks. The components of an agent as provided in accordance with one embodiment of the present invention can include one or more sensors, actuators, and cognition elements, such as interpreters, executive function elements, working memory, long term memory and reasoners for responses to locative queries, for example. Further, the present invention provides, in part, a locative question answering algorithm, along with the command structure, vocabulary, and the dialog that an agent is designed to support in accordance with various embodiments of the present invention.12-06-2012
20120306740INFORMATION INPUT DEVICE USING VIRTUAL ITEM, CONTROL METHOD THEREFOR, AND STORAGE MEDIUM STORING CONTROL PROGRAM THEREFOR - An information input device that enables a user to input information easily by a single hand. The information input device inputs information using a virtual item displayed on a display unit. An image pickup unit shoots an indicator that operates the virtual item continuously to obtain indicator image data. A display control unit displays an indicator image corresponding to the indicator image data on the display unit. A setting unit sets, when detecting an action of the indicator to an element included in the virtual item displayed on the display unit, information corresponding to the element concerned as input information.12-06-2012
20120306739INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING APPARATUS, STORAGE MEDIUM HAVING INFORMATION PROCESSING PROGRAM STORED THEREON, AND CONTENT PLAYBACK CONTROL METHOD - An example information processing system includes a stationary display device, and a portable display device on which a predetermined input can be made by a user. A content item is played and displayed on the stationary display device, and while the content item is being played, the playback image of the content item and a user interface image used for specifying a content item to be played are selectively displayed on the portable display device.12-06-2012
20120306737GESTURE-BASED PRIORITIZATION OF GRAPHICAL OUTPUT ON REMOTE DISPLAYS - The disclosed embodiments provide a system that drives a remote display from an electronic device. The electronic device may be a mobile phone, a tablet computer, a personal digital assistant (PDA), and/or a portable media player. During operation, the system uses the electronic device to obtain user input associated with a transition in graphical output on the electronic device and the remote display. Next, the system identifies a region of interest in the remote display based on the user input and a usage context associated with the graphical output. Finally, the system facilitates viewing of the transition on the remote display by prioritizing transmission of the graphical output from the electronic device to the remote display based on the region of interest.12-06-2012
20120306735THREE-DIMENSIONAL FOREGROUND SELECTION FOR VISION SYSTEM - A method for controlling a computer system includes acquiring video of a subject, and obtaining from the video a time-resolved sequence of depth maps. An area targeting motion is selected from each depth map in the sequence. Then, a section of the depth map bounded by the area and lying in front of a plane is selected. This section of the depth map is used for fitting a geometric model of the subject.12-06-2012
20120306736SYSTEM AND METHOD TO CONTROL SURVEILLANCE CAMERAS VIA A FOOTPRINT - A system includes a video sensing device, a computer processor coupled to the video sensing device, and a display unit coupled to the computer processor. The system is configured to display on the display unit a footprint of the video sensing device in an environment, receive input from a user that directly alters the footprint of the video sensing device, calculate a change in one or more of a pan, a tilt, and a zoom of the video sensing device as a function of the direct alteration of the footprint, alter one or more of the pan, the tilt, and the zoom of the video sensing device as a function of the calculations, and display a field of view of the video sensing device on the display unit as a function of the altered pan, tilt, and zoom of the video sensing device.12-06-2012
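Calculating the pan/tilt change from a direct alteration of the footprint could look like the following. This is a simplified geometric sketch under assumed conditions (camera at the ground-plane origin at a known height, footprint represented by its center point); the patent itself does not specify these formulas.

```python
import math

def ptz_delta(old_center, new_center, height):
    """Estimate pan and tilt changes (radians) when the footprint center
    on the ground plane moves, for a camera mounted at `height` above
    the origin."""
    ox, oy = old_center
    nx, ny = new_center
    # Pan: change in azimuth of the footprint center.
    pan = math.atan2(ny, nx) - math.atan2(oy, ox)
    # Tilt: change in depression angle toward the footprint center.
    old_r = math.hypot(ox, oy)
    new_r = math.hypot(nx, ny)
    tilt = math.atan2(height, new_r) - math.atan2(height, old_r)
    return pan, tilt

# Dragging the footprint a quarter turn around the camera pans 90 degrees.
pan, tilt = ptz_delta((10.0, 0.0), (0.0, 10.0), 10.0)
print(round(math.degrees(pan)), round(math.degrees(tilt)))  # → 90 0
```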
20120306734Gesture Recognition Techniques - In one or more implementations, a static geometry model is generated, from one or more images of a physical environment captured using a camera, using one or more static objects to model corresponding one or more objects in the physical environment. Interaction of a dynamic object with at least one of the static objects is identified by analyzing at least one image and a gesture is recognized from the identified interaction of the dynamic object with the at least one of the static objects to initiate an operation of the computing device.12-06-2012
20110304532Image capturing and display apparatus and method - Provided are a display apparatus and method. The display apparatus may sense light reflected from an object and passed through a display panel, and may control power of a backlight unit depending on whether the light has passed through the display panel.12-15-2011
20110304534WRITING STROKE RECOGNITION APPARATUS, MOBILE TERMINAL AND METHOD FOR REALIZING SPATIAL WRITING - A writing stroke recognition apparatus, a mobile terminal and a method for realizing spatial writing are provided. By acquiring and analyzing the writing movement amount information, the present invention utilizes the preset corresponding relationship between the movement amount information and stroke information, and the preset corresponding relationship between the movement amount information and the stroke relative position information, to obtain the writing stroke information and the writing relative position information. Further, the present invention recognizes the corresponding character by using the obtained writing stroke information and the writing relative position information.12-15-2011
20110304531METHOD AND SYSTEM FOR INTERFACING AND INTERACTION WITH LOCATION-AWARE DEVICES - A system and a method of using the system include a motion detection subsystem for detecting motions applied to a computing device about one or more axes. A storage subsystem stores motion command definitions. A motion processing subsystem is included for characterizing the motions, retrieving command definitions, comparing the characterized motions with the retrieved command definitions, and retrieving commands associated with matched command definitions. A command processing subsystem is included for defining new motions and storing new characterized motions as entries in the command definitions, retrieving stored characterized motions and storing named characterized motions as entries in the command definitions, associating commands with stored characterized motions and storing the associated commands as entries in the command definitions, and processing retrieved commands for modification of and interaction with displayed information of the computing device and saving processing results.12-15-2011
20090033617Haptic User Interface - Presented is a method comprising: generating at least one haptic user interface component using an array of haptic elements; detecting user input applied to at least one haptic element associated with one of said at least one haptic user interface component; and executing software code associated with activation of said one of said at least one user interface component. A corresponding apparatus, computer program product and user interface are also presented.02-05-2009
20090295713POINTING DEVICE WITH IMPROVED CURSOR CONTROL IN-AIR AND ALLOWING MULTIPLE MODES OF OPERATIONS - Cursor resolution of a device is based upon a user's gripping (or squeezing) of the device in one embodiment, in accordance with a user's natural usage patterns. In one aspect, a device in accordance with an embodiment of the present invention offers multiple modes of operation depending on its orientation (e.g., which side of the device is facing upward). A device in accordance with an embodiment of the present invention can be used as a mouse, a presentation device, a keyboard for text entry, and so on. In one aspect of the present invention, circular gesture based controls are implemented, specifically for repetitive type functions.12-03-2009
20110304533STEREOSCOPIC IMAGE DISPLAY DEVICE - Disclosed is a stereoscopic image display device which detects movement of moving viewers from among multiple viewers and enables the multiple viewers to observe a stereoscopic image even if the moving viewers change positions. The stereoscopic image display device includes a display panel corresponding to one switchable region to emit two-dimensional images, the number of which is more than the number of N views (N being a natural number over 3), a switchable panel located on the display panel to convert the two-dimensional images into three-dimensional images and to emit the three-dimensional images when voltage is applied thereto, a detection unit to detect movement of moving viewers from among the multiple viewers and final positions of the moving viewers, and a control unit to output a control signal to shift the views of the two-dimensional images according to the movement and the final positions of the moving viewers.12-15-2011
20100225579IMAGE SENSOR AND OPTICAL POINTING SYSTEM - Provided is an optical pointing system. The optical pointing system has an image sensing unit for receiving light and generating a plurality of first analog signals corresponding to a quantity of the received light in response to a reset signal and a pixel selection signal, at least one shutter control pixel for receiving light, each of which generates a second analog signal corresponding to a quantity of the received light, a first comparator for comparing voltages of the plurality of first analog signals to generate a first comparison digital signal in response to a shutter control signal, an image processor for receiving the first comparison digital signal to calculate a movement value of the optical pointing system, and a shutter control unit for generating the shutter control signal based on the at least one second analog signal.09-09-2010
20110032182PORTABLE TERMINAL HAVING PLURAL INPUT DEVICES AND METHOD FOR PROVIDING INTERACTION THEREOF - A portable terminal having plural input devices and a method for providing interaction thereof are provided. A method for providing interaction of a portable terminal having plural directional input devices includes: receiving an interaction signal input from at least one of the plural directional input devices; checking drive modes of the portable terminal; and executing a preset function by the drive modes corresponding to the interaction signal. The method may easily execute various functions through intuitive interaction using plural input devices, thereby improving convenience for a user.02-10-2011
20100245237Virtual Reality Environment Generating Apparatus and Controller Apparatus - The present invention provides a virtual reality environment generating apparatus including a non-base type interface configured to allow a user to haptically touch virtual objects and game characters. The virtual reality environment generating apparatus includes a content creating device 09-30-2010
20110109541Display control device for remote control device - A display control device for a remote control device is disclosed. The display control device is configured to have connection with a display device and a remote control device spaced apart from each other. The display device is configured to display an operation screen having multiple icons for accepting an operation directed to a control target apparatus. When display is switched from a first operation screen to a second operation screen having a different icon arrangement, the display control means shifts an icon selection state from one icon selected on the first operation screen to an initial selection icon pre-set on the second operation screen and causes the display device to display a visual effect indicative of a direction from position of the selected one icon to position of the initial selection icon.05-12-2011
20120038549Deriving input from six degrees of freedom interfaces - The present invention relates to interfaces and methods for producing input for software applications based on the absolute pose of an item manipulated or worn by a user in a three-dimensional environment. Absolute pose in the sense of the present invention means both the position and the orientation of the item as described in a stable frame defined in that three-dimensional environment. The invention describes how to recover the absolute pose with optical hardware and methods, and how to map at least one of the recovered absolute pose parameters to the three translational and three rotational degrees of freedom available to the item to generate useful input. The applications that can most benefit from the interfaces and methods of the invention involve 3D virtual spaces including augmented reality and mixed reality environments.02-16-2012
20120038548HANDHELD FIELD MAINTENANCE DEVICE WITH IMPROVED USER INTERFACE - A handheld field maintenance tool is provided. The handheld field maintenance tool includes a process communication module configured to communicate with a field device. The handheld field maintenance tool also includes a display and a user input device. A controller is coupled to the process communication module, the user input device and the display and is configured to generate a listing of task-based field maintenance operations on the display and receive a user input selecting a task-based field maintenance operation. The controller is configured to automatically traverse a menu of the field device using a fast-key sequence relative to the selected task. A method of creating a task-based field maintenance operation is provided. A method of interacting with a field device menu is also provided.02-16-2012
20120038547METHOD AND APPARATUS FOR USER INTERFACE COMMUNICATION WITH AN IMAGE MANIPULATOR - A system, and method for use thereof, for image manipulation. The system may generate an original image in a three dimensional coordinate system. A sensing system may sense a user interaction with the image. The sensed user interaction may be correlated with the three dimensional coordinate system. The correlated user interaction may be used to project an updated image, where the updated image may be a distorted version of the original image. The image distortion may be in the form of a twisting, bending, cutting, displacement, or squeezing. The system may be used for interconnecting or communicating between two or more components connected to an interconnection medium (e.g., a bus) within a single computer or digital data processing system.02-16-2012
20120038546GESTURE CONTROL - An apparatus includes sensor circuitry configured to sense spatial phenomena; application circuitry configured to respond to sensation of a spatial phenomenon by the sensor circuitry where a pre-existing relationship exists between the spatial phenomenon and the response of the application circuitry; and regulation circuitry configured to regulate the response of the application circuitry to the spatial phenomenon based at least in part on sensation of a different spatial phenomenon by the sensor circuitry. Various other apparatuses, systems, methods, etc., are also disclosed.02-16-2012
20110050565COMPUTER SYSTEM AND CONTROL METHOD THEREOF - A computer system and a method for controlling the same are provided, which make the device capable of using various functions through a simple and convenient input method even while the case of the electronic device is closed. The computer system comprises at least one device unit; a housing in which the device unit(s) is provided; a motion sensing unit configured to sense a user motion taken at an outer side of the housing; and a controller configured to control the device unit(s) to perform an operation corresponding to the user motion sensed by the motion sensing unit.03-03-2011
20110050564Dynamic Picture Frame in Electronic Handset - A portable electronic device configured to operate in an image presentation mode that presents a sequence of images on a display component. A controller is configured to determine a context of the portable electronic device and to vary a presentation time period of images in a subset of images in the sequence relative to the presentation time period of images not in the subset, wherein the images in the subset are associated with the context of the portable electronic device.03-03-2011
20110050563METHOD AND SYSTEM FOR A MOTION COMPENSATED INPUT DEVICE - A method and system for a motion compensated input device are provided. The motion compensated input device includes an input device configured to receive a physical input from a user and convert the physical input into a physical input signal representative of the physical input, a motion sensing device configured to sense acceleration forces of at least one of the input device and the user, the acceleration forces introducing an error into the physical input, and an input compensator configured to adjust the physical input signal using the acceleration forces to generate a compensated input signal representative of the physical input.03-03-2011
20100231504HOTSPOTS FOR EYE TRACK CONTROL OF IMAGE MANIPULATION - The invention relates to a system for manipulating images or graphics where the manipulation to be performed is selected or activated using eye-tracking. A number of hotspots are arranged around and bordering a region of interest (ROI), each of which is associated with an image parameter adjustment (e.g. contrast or brightness). An eye-tracking device determines a fixation point of the user, and the system automatically performs the associated adjustment whenever the fixation point falls within a hotspot. The close bordering of the hotspots around the ROI allows the user to keep focus on the ROI and only slightly shift his/her gaze to or towards a hotspot for adjusting an image parameter. Thereby, the adjustment can be controlled while still keeping the ROI in focus, so that the effect of the adjustment can be followed simultaneously.09-16-2010
20120038550SYSTEM ARCHITECTURE AND METHODS FOR DISTRIBUTED MULTI-SENSOR GESTURE PROCESSING - The techniques discussed herein contemplate methods and systems for providing, for example, interactive virtual experiences that are initiated or controlled using user gestures. In embodiments, the techniques provide for gestures performed by users holding devices to be recognized and processed in a cloud computing environment such that the gestures produce a predefined desired result. According to one embodiment, a server communicates with a first device in a cloud computing environment, wherein the first device can detect surrounding devices, and an application program is executable by the server, wherein the application program is controlled by the first device and the output of the application program is directed by the server to one of the devices detected by the first device.02-16-2012
20100033425COMPUTER PERIPHERAL DEVICE INTERFACE FOR SIMULATING MOUSE, KEYBOARD, OR GAME CONTROLLER ACTION AND METHOD OF USE - A computer peripheral device interface includes a microprocessor and unique coding/USB functionality to enable a foot-, feet-, hands-, or body-operated computer peripheral device to have the capability to function as each of a programmable mouse, joystick, and/or keyboard. The inventive peripheral device interface can include a communications channel to support various custom peripherals to be linked from or to the platform or other computer peripheral device, i.e. platform, pad, cycle, or balance apparatus.02-11-2010
20120206330MULTI-TOUCH INPUT DEVICE WITH ORIENTATION SENSING - A multi-touch orientation sensing input device may enhance task performance efficiency. The multi-touch orientation sensing input device may include a device body that is partially or completely enclosed by a multi-touch sensor. The multi-touch orientation sensing input device may further include an inertia measurement unit that is disposed on the device body. The inertia measurement unit may measure a tilt angle of the device body with respect to a horizontal surface, as well as a roll angle of the device body along a length-wise axis of the device body with respect to an initial point on the device body.08-16-2012
20120206336DETECTOR FOR OPTICALLY DETECTING AT LEAST ONE OBJECT - A detector (08-16-2012
20100123655Optical trace detecting module - An optical trace detecting module is disposed in a computer input device. The computer input device is able to displace on a plane relatively, and has a light-pervious plate used for an object to contact with and move on a surface thereof. The optical trace detecting module includes a circuit board, a first projection set, a second projection set, and at least one optical path diverting element. An optical sensor is electrically disposed on the circuit board. The first projection set is opposite to the light-pervious plate. The second projection set is opposite to the plane. The optical path diverting element is disposed between the two projection sets and the optical sensor, so as to direct the sensing beams emitted by the two projection sets to reach the optical sensor, thereby generating corresponding control signals.05-20-2010
20080303782METHOD AND APPARATUS FOR HAPTIC ENABLED FLEXIBLE TOUCH SENSITIVE SURFACE - A method and apparatus for an electronic interactive device having a haptic enabled flexible touch sensitive surface are disclosed. In one embodiment, the electronic interactive device includes a flexible touch sensitive surface, a flexible screen (or display), and an actuator. The flexible touch sensitive surface is deposited over the flexible screen and is capable of receiving an input, such as, for example, from a user. The flexible screen displays an image via a displaying window. The actuator is coupled to the flexible screen and provides haptic feedback in response to the input.12-11-2008
20110316768SYSTEM, METHOD AND APPARATUS FOR SPEAKER CONFIGURATION - An application is provided for a device that includes at least two speakers and at least two channels of audio. The at least two channels of audio are selectively routed to the at least two speakers. Routing of the audio to the speakers is made based upon an orientation of the device. The orientation of the device, and therefore the speaker configuration, is either manually changed by a viewer input or is automatically detected, such as when a hand-held device is rotated, for example, to view a display of the device in portrait mode instead of landscape mode.12-29-2011
20110316769PROVIDING AN ALTERNATIVE HUMAN INTERFACE - Providing an alternative human interface for an electronic device when a current human interface is made ineffective by at least an environmental factor is described herein. By ineffective it is meant that the current human interface cannot maintain a minimum level of interactivity between a user and the electronic device in the current or anticipated environment. In addition to maintaining at least a threshold level of interactivity, the configuration of the alternative human interface can take into consideration other factors such as an expected operating state of the electronic device affected by the choice of alternative human interface.12-29-2011
20110316767SYSTEM FOR PORTABLE TANGIBLE INTERACTION - Embodiments of the invention describe a system utilizing at least one camera and a display to create an object and context aware system. Embodiments of the invention may utilize the camera to sense a system's surroundings and use recognition logic or modules to detect and recognize objects on and around the system. System applications may further act on the sensed data and use the display of the system to provide visual feedback and interactive elements as a means to interact with the system user.12-29-2011
20090021476INTEGRATED MEDICAL DISPLAY SYSTEM - The invention relates to a medical display system for performing a medical function, including: an image display unit configured for displaying a medical image data set, and an additional device, wherein the additional device is integrated with said image display unit and is configured to assist in performing the medical function of the medical display system.01-22-2009
20120044137SCREEN CAPTURE - A screen capture system (02-23-2012
20120044138METHOD AND APPARATUS FOR PROVIDING USER INTERACTION IN LASeR - A method and an apparatus for providing user interaction are provided. The apparatus for providing user interaction includes an input unit configured to receive control by a user; a control processing unit configured to analyze the control and generate drag event information including event type information indicating a type of the control and event attribute information; and an action processing unit configured to generate drag element information for showing an action corresponding to the control on a display. The drag element information includes action mode information indicating a mode of the action and action attribute information. The proposed method and apparatus make it possible to apply various data formats defined by existing standard specifications to other standard specifications and interaction devices.02-23-2012
20120044136DISPLAY DEVICE AND CONTROL METHOD THEREOF - Disclosed are a display device and a control method thereof. The method includes recognizing a first motion of a user; and assigning a control right to perform functions of the display device through motions to the user in response to the recognized first motion.02-23-2012
20120044135REMOTE CONTROL FOR ELECTRONIC READER AND REMOTE CONTROL METHOD - A remote control comprises a button generating control signals; a transmission unit configured to transmit the control signals to an electronic reader; and a microprocessor unit configured to analyze an operation type according to the control signals and generate operation signals corresponding to the operation type to signal the electronic reader to flip pages. A remote control method applied in a remote control of an electronic reader is also provided.02-23-2012
20120062454Information Processing System - Provided is an information processing system which includes: a communication device including a sensor that measures information regarding a posture thereof; and an information processing device. The information processing device displays a guide image for causing a user to perform a rotational operation for rotating the communication device, and executes calibration of the sensor by using the measurement result of the sensor acquired from the communication device while the guide image is displayed. The guide image includes an image representing a reference axis to be a rotation center for the rotational operation and an image representing the communication device, and the image representing the communication device is located within the guide image so that a portion of the communication device corresponding to a position of the sensor overlaps with the reference axis.03-15-2012
20120001846INPUT DEVICE, WEARABLE COMPUTER, AND INPUT METHOD - A low-cost input device suitable for a wearable computer is provided. An input method suitable for operating a wearable computer is provided. A wearable computer including the input device is provided. An input device 01-05-2012
20120001843Mobile Device User Interface Change Based On Motion - Adapting a user interface of a mobile computing device when the mobile computing device is in a motion state is provided. Upon detecting that a mobile computing device is in motion by utilization of a location or motion determining system, such as a GPS navigation and/or accelerometer system, a motion mode UI may be activated on the device, wherein a display of device functionalities may be simplified by modifying one or more displayed elements of the device user interface.01-05-2012
20120001844Remote Control Systems and Methods for Activating Buttons of Digital Electronic Display Devices - A method for controlling an e-book reader, set forth by way of example and not limitation, includes transmitting a digital packet including an address of a button actuator and a button control signal in response to a detection of a button press on a remote control device. The method further includes receiving the packet at the button actuator, decoding the packet in a digital processor to derive the button control signal, and controlling a motor to move a physical actuator between a neutral position and a button press position.01-05-2012
20120001845System and Method for Virtual Touch Sensing - In view of existing mobile devices, which are limited by the relatively small area of the touch screen, the present invention describes a virtual touch sensing method based on computer vision technology. The method includes the steps of using more than one sensor to detect the coordinates of an indicator in a virtual touching area, and calculating the respective screen coordinates according to the coordinates of the indicator, where the area of the operation surface of the virtual touching area is independent of the area of the screen. The present invention also discloses a corresponding virtual touch sensing system which provides a predictive control interface, where the area of the control interface is independent of the area of the actual screen.01-05-2012
20120001847Driving Method of Input/Output Device - A method for driving an input/output device, including: generating first data by putting a first region of a light unit in a lighted condition and a second region of the light unit in the lighted condition; generating second data by putting the first region in the lighted condition and the second region in an unlighted condition; generating third data by putting the first region in the unlighted condition and the second region in the lighted condition; generating fourth data by putting the first region in the unlighted condition and the second region in the unlighted condition; and generating difference data of either the first data or the third data and either the second data or the fourth data by using a data processor.01-05-2012
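The four lighting states above read as an ambient-light cancellation scheme: subtracting data captured with a region unlit from data captured with it lit isolates that region's contribution. A minimal sketch of that interpretation (the function name, sample values, and plain per-pixel subtraction are illustrative assumptions, not taken from the application):

```python
def difference_data(lit, unlit):
    """Subtract an unlit-state reading from a lit-state reading,
    pixel by pixel, so the ambient-light component cancels out."""
    return [a - b for a, b in zip(lit, unlit)]

# Four readings for a two-region light unit (arbitrary sensor counts):
first = [12, 14, 13]   # region 1 lit,   region 2 lit
second = [10, 11, 9]   # region 1 lit,   region 2 unlit
third = [7, 8, 9]      # region 1 unlit, region 2 lit
fourth = [5, 5, 5]     # region 1 unlit, region 2 unlit

region2_signal = difference_data(first, second)  # light from region 2 alone
region1_signal = difference_data(third, fourth)  # light from region 1 alone
```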
20100053069MOBILE COMPUTING SYSTEM FACILITATING ADAPTIVE DISPLAY OF CONTENT AMONG A PLURALITY OF DISPLAY COMPONENTS INCLUDING AT LEAST ONE VIRTUAL IMAGE DISPLAY COMPONENT - Systems, devices, and/or methods that facilitate adaptive display of content among a plurality of display components including at least one virtual image display component are presented. The disclosed subject matter facilitates forming determinations or inferences based on various metrics for adaptively routing content to select display devices. The display interface component, at least in part, routes content selectively between a primary display component and a virtual image display (VID) component. This can better optimize the use of VID components to avoid overstimulation of a user while allowing access to the benefits of the VID component under predetermined conditions.03-04-2010
20110095976MOBILE TERMINAL - A mobile terminal includes a connector to moveably connect a first body to a second body and a controller to change information displayed on a screen based on movement of the second body relative to the first body. The connector allows the second body to move along a first axis relative to the first body and along a second axis that crosses the first axis relative to the first body.04-28-2011
20120056803CONTENT OUTPUT SYSTEM, OUTPUT CONTROL DEVICE AND OUTPUT CONTROL METHOD - Based on captured images obtained when an image capture device takes images in an image capture range, a change in the moving speed of one passerby, or of each of a plurality of passersby, contained in the captured images is calculated. When there is a passerby whose moving speed decreases in the image capture range, that passerby is paying attention to the content, and therefore the content to be outputted by a content output device is switched from ordinary content to specific content. On the other hand, when only passersby moving at constant speed are present in the image capture range, they are not paying attention to the content, and hence the content output device continues to output the ordinary content.03-08-2012
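The switching rule described above can be sketched as: estimate each passerby's speed across frames and switch to the specific content as soon as any speed drops noticeably. A hypothetical illustration (the slowdown threshold, function name, and speed values are assumptions, not from the application):

```python
def select_content(speed_histories, slowdown_fraction=0.3):
    """Return 'specific' if any tracked passerby's speed has dropped by
    more than slowdown_fraction of its initial value, else 'ordinary'."""
    for speeds in speed_histories:
        if len(speeds) >= 2 and speeds[0] > 0:
            drop = (speeds[0] - speeds[-1]) / speeds[0]
            if drop > slowdown_fraction:
                return "specific"  # a passerby slowed: attention detected
    return "ordinary"              # constant speeds: no attention
```

With one passerby steady at 1.5 m/s and another slowing from 1.4 to 0.6 m/s, the second history would trigger the switch to specific content.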
20120007800INTERACTIVE GLASSES SYSTEM - An interactive glasses system comprising a frame (01-12-2012
20120007796Flexible Apparatus - An apparatus including an elongate structure including integrated electronic circuitry providing at least an electronic user interface wherein the elongate structure is flexible and is configured to be flexed lengthwise by a user to form a looped configuration in which the elongate structure forms at least one lengthwise loop about an axis and in which at least one electrical connection for the electronic circuitry is formed where a first portion of the elongate structure and a second portion of the elongate structure contact.01-12-2012
20120007801EASILY DEPLOYABLE INTERACTIVE DIRECT-POINTING SYSTEM AND PRESENTATION CONTROL SYSTEM AND CALIBRATION METHOD THEREFOR - A method for controlling movement of a computer display cursor based on a point-of-aim of a pointing device within an interaction region includes projecting an image of a computer display to create the interaction region. At least one calibration point having a predetermined relation to said interaction region is established. A pointing line is directed to substantially pass through the calibration point while measuring a position of and an orientation of the pointing device. The pointing line has a predetermined relationship to said pointing device. Movement of the cursor is controlled within the interaction region using measurements of the position of and the orientation of the pointing device.01-12-2012
20120007799APPARATUS, METHOD FOR MEASURING 3 DIMENSIONAL POSITION OF A VIEWER AND DISPLAY DEVICE HAVING THE APPARATUS - Disclosed herein are an apparatus and a method for measuring the 3 dimensional position of a viewer, and a display device having the apparatus. The apparatus for measuring the 3 dimensional position includes an image capturing module that photographs images including objects; a detecting module that detects the objects from the images photographed by the image capturing module and calculates sizes and coordinates of the objects in the images; and a position calculation module that calculates the 3-dimensional positions of the objects in the space in which the objects are positioned by using the calculated sizes and coordinates.01-12-2012
20120007797HUMAN-MACHINE INTERFACE - A human interface suitable for being held is provided. The human interface includes a first housing, a second housing and a rotating shaft. The first housing has a first sidewall and a sliding chunk, and the sliding chunk is located at a side of the first sidewall. The second housing having a second sidewall and a sliding trough is disposed under the first housing, wherein the sliding trough is located at one side of the second sidewall. The sliding chunk is disposed in the sliding trough. The rotating shaft is pivotally connected between the first housing and the second housing and located at a side of the first housing apart from the sliding chunk. The sliding chunk of the first housing slides in the sliding trough of the second housing, and the first housing rotates relative to the second housing by using the rotating shaft as a rotating center.01-12-2012
20120206341Presenting Information on a Card with a Passive Electronic Paper Display - With a card including a passive electronic paper display configured to display a visual image, a method of presenting information on a card includes selectively changing a visual image displayed on the passive electronic paper display to update information represented by the visual image, and wherein the card is sized to be carried by a user.08-16-2012
20110018793Mobile Device Customizer - A method and system for customizing a mobile host device are disclosed. An accessory device for interfacing with and customizing a mobile host device includes a communication channel designed to establish a bi-directional communication link between the accessory device and the host device. The accessory device also includes a processor communicatively coupled to the communication channel. The processor is designed to execute a plurality of applications. In addition, the accessory device includes an input assembly communicatively coupled to the processor. The input assembly is designed to minimize a total number of input elements included in the input assembly. Further, at least a first input element is selectively mapped to one or more input functions of the host device based on a user selection.01-27-2011
20090167678SYSTEM AND METHOD FOR NAVIGATING A MOBILE DEVICE USER INTERFACE WITH A DIRECTIONAL SENSING DEVICE - An electronic mobile device includes a display, a tilt sensor and a processor. The display is for displaying a graphical element. The tilt sensor is configured to measure a tilt angle of the mobile device. The processor is configured to store the measured tilt angle as a reference tilt angle, subsequently determine a delta tilt angle as the difference between a currently measured tilt angle and the reference tilt angle, compare the delta tilt angle to different thresholds, and alter the position of the displayed element on the display at a rate that is based on the number of the thresholds the delta tilt angle has exceeded.07-02-2009
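The delta-tilt scheme above amounts to: record a reference angle, then scale the rate of element movement by how many thresholds the current deviation exceeds. A minimal sketch (the threshold values, the per-level rate, and the linear scaling are illustrative assumptions):

```python
THRESHOLDS = [5.0, 15.0, 30.0]  # delta-tilt thresholds in degrees (assumed)
RATE_PER_LEVEL = 10             # pixels per update per threshold exceeded (assumed)

def element_rate(reference_tilt, current_tilt):
    """Rate at which the displayed element moves, scaled by the number
    of thresholds the delta tilt angle has exceeded."""
    delta = abs(current_tilt - reference_tilt)
    levels = sum(1 for t in THRESHOLDS if delta > t)
    return levels * RATE_PER_LEVEL

# Reference stored at 20 degrees; tilting to 38 degrees exceeds the
# 5-degree and 15-degree thresholds, so the element moves at 2 levels.
```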
20120013528Distance measurement module, display device having the same, and distance measurement method of display device - A distance measurement module includes: an image pickup lens capturing an image of a subject; a light source unit disposed to be adjacent to the image pickup lens and irradiating reference light to the subject; a light receiving unit extracting image information of the subject and distance information to the subject upon receiving light which is reflected from the subject and made incident through the image pickup lens; and a calculation unit calculating the distance to the subject by using a phase difference between the reference light and the reflected light.01-19-2012
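Phase-based ranging of this kind typically recovers distance from the phase shift between the emitted and reflected modulated light via the standard time-of-flight relation d = c·Δφ / (4π·f_mod). A sketch under that standard formula (the application does not state the equation, and the 10 MHz modulation frequency is an assumed example):

```python
import math

C = 299_792_458.0  # speed of light in m/s

def distance_from_phase(phase_shift_rad, mod_freq_hz):
    """Distance implied by the phase difference between the reference
    light and the reflected light; the extra factor of 2 inside 4*pi
    accounts for the round trip to the subject and back."""
    return C * phase_shift_rad / (4 * math.pi * mod_freq_hz)

# A pi/2 phase shift at 10 MHz modulation corresponds to roughly 3.75 m.
d = distance_from_phase(math.pi / 2, 10e6)
```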
20120013529GESTURE RECOGNITION METHOD AND INTERACTIVE INPUT SYSTEM EMPLOYING SAME - A gesture recognition method comprises capturing images, processing the images to identify at least two clusters of touch points associated with at least two pointers, recognizing a gesture based on motion of the clusters, and updating a display in accordance with the recognized gesture.01-19-2012
20120056804Apparatus, Methods And Computer Program Products Providing Finger-Based And Hand-Based Gesture Commands For Portable Electronic Device Applications - A method includes executing a gesture with a user-manipulated physical object in the vicinity of a device; generating data that is descriptive of the presence of the user-manipulated object when executing the gesture; and interpreting the data as pertaining to at least one object, such as an object displayed by the device.03-08-2012
20120056802Program, Object Control Method, And Game Device - A behavior table storage unit stores correspondence between a predetermined action of an object and a condition for operation of an input device. A condition determination unit determines whether control information of the input device meets the condition for operation stored in the behavior table storage unit. An object control unit causes, when the condition for operation is determined to be met, the object to perform an action mapped to the condition for operation. The behavior table storage unit stores a condition for operation requiring that the input device be moved by a predetermined amount within a predetermined period of time, and the condition determination unit measures time elapsed since the start of movement of the input device and determines, when the input device is moved by the predetermined amount within the predetermined period of time, that the condition for operation is met.03-08-2012
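The timed condition described above can be sketched as a behavior table mapping an operation condition to an action, with the determination unit clocking movement from its start. All names, thresholds, and the table contents here are hypothetical illustrations:

```python
def condition_met(samples, required_movement, time_limit):
    """samples: list of (timestamp_s, cumulative_movement) pairs beginning
    at the start of movement. True if the input device moved by the
    required amount within the time limit."""
    if not samples:
        return False
    start = samples[0][0]
    for t, moved in samples:
        if t - start <= time_limit and moved >= required_movement:
            return True
    return False

# Behavior table: operation condition -> action the object performs.
BEHAVIOR_TABLE = {"quick_shake": (lambda s: condition_met(s, 5.0, 0.5), "jump")}
```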
20120056801METHODS AND APPARATUSES FOR GESTURE-BASED USER INPUT DETECTION IN A MOBILE DEVICE - Methods and apparatuses are provided that may be implemented in a mobile device to: determine whether the mobile device is in a gesture command input ready state based, at least in part, on a display portion of the mobile device remaining in a horizontal viewable position for a threshold period of time; with the mobile device in a gesture command input ready state, determine whether a detected movement of the mobile device represents a gesture command input; and in response to the determined gesture command input, affect a user perceivable output.03-08-2012
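The ready-state gate above can be sketched as a small state machine: the display portion must remain in a horizontal viewable position for a threshold time before movements are interpreted as gesture commands. The class name and 1.5-second hold time are assumptions:

```python
class GestureGate:
    """Accept gesture command inputs only after the display portion has
    stayed in a horizontal viewable position for a threshold period."""

    HOLD_TIME = 1.5  # seconds (an assumed threshold)

    def __init__(self):
        self._horizontal_since = None

    def ready(self, is_horizontal, now):
        if is_horizontal:
            if self._horizontal_since is None:
                self._horizontal_since = now  # horizontal hold begins
            return now - self._horizontal_since >= self.HOLD_TIME
        self._horizontal_since = None         # hold broken; reset the timer
        return False
```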
20120056800SYSTEM FOR FAST, PROBABILISTIC SKELETAL TRACKING - A system and method are disclosed for recognizing and tracking a user's skeletal joints with a NUI system. The system includes one or more experts for proposing one or more skeletal hypotheses each representing a user pose within a given frame. Each expert is generally computationally inexpensive. The system further includes an arbiter for resolving the skeletal hypotheses from the experts into a best state estimate for a given frame. The arbiter may score the various skeletal hypotheses based on different methodologies. The one or more skeletal hypotheses resulting in the highest score may be returned as the state estimate for a given frame. It may happen that the experts and arbiter are unable to resolve a single state estimate with a high degree of confidence for a given frame. It is a further goal of the present system to capture any such uncertainty as a factor in how a state estimate is to be used.03-08-2012
20090066640DEVICE AND METHOD FOR PROVIDING A USER INTERFACE - In an information processing apparatus, an operation receiving section receives an operation performed on a display. In response to the operation to call up a setting screen, a menu information generating section generates menu information in which a setting button requesting a setting conflicting with the current state is displayed in a highlighted mode. In response to the operation of pressing the highlighted setting button, an influence information generating section generates influence information on the current state conflicting with the setting to be made by pressing the setting button. In addition, a display control section controls the display of a setting screen, including the menu information, and also controls the display of a setting screen, including the influence information when the influence information is generated.03-12-2009
20090051649Wearable display interface client - A wearable display interface that during operation is in short-range, wireless bidirectional communication with a server, related methods, and systems including the interface and server. The wearable interface includes a power supply driving a processor that is operatively coupled to a visual display and a wireless interface, where the wearable display interface further includes an attachment interface for linking the display interface to a user's body, such as the wrist or neck, or to apparel. The server is preferably a mobile telephone where the display interface is a client. In addition, the interface can further function as a speaker and/or microphone to receive and/or accept verbalized data, such as input instructions or communication commands. Intentional distribution of processing across the server and client ensures that the interface maintains a small and efficient form factor.02-26-2009
20090066641Methods and Systems for Interpretation and Processing of Data Streams - Methods and systems for interpreting and processing data streams from a plurality of sensors on a motion-capture device are described. In various embodiments, an engine module of the system receives a raw input data stream comprising motion and non-motion data. Metadata is associated with data segments within the input data stream to produce a stream of data profiles. In various embodiments, an interpreter converts received data profiles into non-contextual tokens and/or commands recognizable by an application adapted for external control. In various embodiments, a parser converts received non-contextual tokens into contextual tokens and/or commands recognizable by an application adapted for external control. In various embodiments, the system produces commands based upon the non-contextual and/or contextual tokens and provides the commands to the application. The application can be a video game, software operating on a computer, or a remote-controlled apparatus. In various aspects, the methods and systems transform motions and operation of a motion-capture device into useful commands which control an application adapted for external control.03-12-2009
20100013762USER DEVICE FOR GESTURE BASED EXCHANGE OF INFORMATION, METHODS FOR GESTURE BASED EXCHANGE OF INFORMATION BETWEEN A PLURALITY OF USER DEVICES, AND RELATED DEVICES AND SYSTEMS - A user device is disclosed comprising; 01-21-2010
20110080338A Display Device - The present invention relates to a display device comprising a battery (04-07-2011
20110080337IMAGE DISPLAY DEVICE AND DISPLAY CONTROL METHOD THEREOF - Disclosed is an image display device capable of recognizing a hand of a user and predefining an operating region. Image recognition means recognizes the position of a hand of the user. Operating region setup means predefines the operating region, on an imaging region plane of imaging means and around a position on which the user's hand is projected, for the purpose of enabling the user to issue instructions to the image display device. When the position on which the user's hand is projected moves and comes close to the periphery of the operating region, the operating region setup means moves the operating region in the direction of the movement of the position on which the user's hand is projected. Further, the image recognition means recognizes a hand-waving motion of the user, whereas the operating region setup means sets the size of the operating region in accordance with the magnitude of the user's hand-waving motion. Consequently, the image display device provides increased ease of operation and makes it possible to define the operating region in accordance with a user's intention without imposing a significant processing load on itself.04-07-2011
20110080336Human Tracking System - An image such as a depth image of a scene may be received, observed, or captured by a device. A grid of voxels may then be generated based on the depth image such that the depth image may be downsampled. A background included in the grid of voxels may also be removed to isolate one or more voxels associated with a foreground object such as a human target. A location or position of one or more extremities of the isolated human target may be determined and a model may be adjusted based on the location or position of the one or more extremities.04-07-2011
20110080335BEDROOM PROJECTOR WITH CLOCK AND AMBIENT LIGHT CONTROL FEATURES - The brightness of a bed-mounted projector that projects images on a wall or ceiling is established in proportion to the ambient light level in the room. A motion detector can be used to operate room lights in the room only at user-defined times. Keystone correction of the image can be made based on the projection angle as sensed by an orientation sensor. An input strip on the headboard centerline can be used as a control input device.04-07-2011
20120206343MULTI-SOURCE PROJECTION-TYPE DISPLAY - A display device capable of displaying a plurality of projection images is provided. The display device includes a light source within a base and a plurality of projection outputs. Each projection output comprises an optical modulation device and a projection lens system. The light source includes a switch and a plurality of light sources, such as lasers or LEDs, with different colors from one another. The switch receives and diverts light beams from the light sources in a predetermined sequential order to the plurality of projection outputs.08-16-2012
20120206338OPERATIONAL INPUT DEVICE - An operational input device that outputs a signal corresponding to a displacement amount of an operational input, includes a coil annularly extending from a first side toward a second side; a core configured to vary the inductance of the coil by being moved within the coil along an axis of the coil by the operational input applied from the first side toward the second side; and a yoke provided at an end surface of the coil at the second side and provided with an opening at a position facing an end surface of the core at the second side.08-16-2012
20120206332METHOD AND APPARATUS FOR ORIENTATION SENSITIVE BUTTON ASSIGNMENT - Methods and apparatus are provided for orientation sensitive configuration of a device. In one embodiment a method includes detecting a change in orientation of a device, and configuring a display of the device based on the change in orientation. The method may further include assigning operation of one or more buttons of the device based on the change in orientation, wherein a button of the device is assigned one or more functions not previously assigned to the button. The method may be performed by one or more of a portable electronic device, media player, eReader, personal communication device, handheld computing device, imaging device, tablet computing device, gaming device, and computing devices in general.08-16-2012
20120206331Methods and Systems for Supporting Gesture Recognition Applications across Devices - Embodiments herein disclose methods and systems for enabling gesture recognition capable applications to be supported across gesture recognition capable devices. Further disclosed are the marketplace and network mechanisms to deliver applications and advertisements to multiple devices. A middleware is provided for a gesture recognition device that provides the common API for a gesture recognition capable application. The same application may be used or played on any other device hosting the supported middleware specific to that device. Developers of applications are provided with a single SDK. The middleware also hosts a network application that enables a user to search for and download supported applications, provide feedback to the network, and provide configuration capabilities.08-16-2012
20120206329OPTICAL NAVIGATION MODULE WITH ALIGNMENT FEATURES - An optical navigation module comprising a rigid flange having a top surface. An optical navigation unit can be coupled to the top surface of the rigid flange, with an electrical connection electrically coupled to the optical navigation unit. An alignment flange can be coupled to the rigid flange, with the alignment flange including one or more alignment features. An alignment feature can be a hole adapted to receive an alignment pin to hold and align the optical navigation module.08-16-2012
20090135135RECORDING AND REPRODUCING APPARATUS - A recording and reproducing apparatus includes: a recording means for storing a plurality of images in groups; a display means for displaying images stored in the recording means; a detecting means for detecting a part of a human body or an object in a predetermined form; and a display switching means for switching images to be displayed on the display means in accordance with a form of a part of a human body or a form of an object detected by the detecting means.05-28-2009
20090135134EDUCATION METHOD AND SYSTEM INCLUDING AT LEAST ONE USER INTERFACE - An education method and system including a user interface. The user interface may illustrate a portion of the human anatomy that may be viewed in three dimensions. The user interface and method may also provide for selecting training modules; inputting student names and access codes; tracking pre- and/or post-training student knowledge via quizzes and tests; receiving curriculum concurrence from school or government administrators; receiving permission from parents or guardians for the actual training; and/or providing general visiting users with a holistic overview of the user interface and method of providing the education.05-28-2009
200901351333D Motion Control System and Method - There is disclosed a control system which may include a seat having a pelvis angle sensor to provide positional information indicative of an operator's pelvis angle in a frontal plane. The positional information provided by the pelvis angle sensor may be used to control, at least in part, a device.05-28-2009
20090135132Electronic display system to show large size visual content on a small size display device - A portable display system for an electronic device is disclosed, which consists of a portable display and a pair of eyeglasses. The size of the display is small for easy carrying; for example, the display can fit into a hand. The resolution of the display is changeable; for example, it can be lower than one hundred by one hundred for normal cell phone usage, or as high as thousands by thousands to show HDTV, movies, and web browsing. When the display is in its low resolution mode, the naked eye can see the display content without problem. When the display is in its high resolution mode, the pair of glasses, which enlarges the display content a few times, is needed. There may be, depending on the application, multiple resolution modes between the lowest and the highest.05-28-2009
20120306738IMAGE PROCESSING APPARATUS CAPABLE OF DISPLAYING OPERATION ITEM, METHOD OF CONTROLLING THE SAME, IMAGE PICKUP APPARATUS, AND STORAGE MEDIUM - An image processing apparatus which is capable of preventing the display of an operation item, such as an icon, from hindering the user's viewing of an image which the user desires to view. A distance and position-detecting section and a system controller detect a distance between a screen and an operation element. The system controller displays at least one operation candidate item on the screen when the detected distance becomes not larger than a first threshold distance. Further, when the detected distance becomes not larger than the first threshold distance, the system controller changes a display form of the operation candidate item depending on the detected distance and displays the operation candidate item as an operable item.12-06-2012
20120154270DISPLAY DEVICE - Disclosed is a display device provided with a display panel (06-21-2012
20120154271METHOD, MEDIUM AND APPARATUS FOR BROWSING IMAGES - A method, medium and apparatus browsing images is disclosed. The method browsing images includes sensing acceleration imparted to a portable digital device, and moving an image onto a display area in accordance with a tilt angle of the portable digital device if the sensed acceleration is greater than a first threshold value.06-21-2012
20120154264USER INTERFACES FOR DIALYSIS DEVICES - In general, a dialysis device includes a first processing device for monitoring dialysis functions of the dialysis device, a second processing device, a display device, and memory. The memory is configured to store instructions that, when executed, cause the dialysis device to provide, on the display device, a first display region and a second display region, where the first display region is associated with the first processing device and the second display region is associated with the second processing device. At least a portion of the first display region cannot be obscured by the second display region.06-21-2012
20120206333VIRTUAL TOUCH APPARATUS AND METHOD WITHOUT POINTER ON SCREEN - Provided is a virtual touch apparatus and method for remotely controlling electronic equipment having a display surface. The virtual touch apparatus includes a three-dimensional coordinate calculator and a controller. The three-dimensional coordinate calculator extracts a three-dimensional coordinate of a user's body. The controller includes a touch location calculation unit and a virtual touch processing unit. The touch location calculation unit calculates a contact point coordinate where a straight line connecting between a first spatial coordinate and a second spatial coordinate meets the display surface using the first and second spatial coordinates received from the three-dimensional coordinate calculator. The virtual touch processing unit creates a command code for performing an operation corresponding to the contact coordinate received from the touch location calculation unit and inputs the command code into a main controller of the electronic equipment.08-16-2012
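The contact-point computation above is a line-plane intersection: extend the straight line through the two body coordinates (for example, eye and fingertip) until it meets the display surface. A sketch assuming the display is modeled as the plane z = 0 (the function name and coordinate convention are assumptions):

```python
def contact_point(p1, p2):
    """Intersect the straight line through the first and second spatial
    coordinates with the display surface, modeled here as the plane
    z = 0. Points are (x, y, z) tuples; returns the (x, y) contact point."""
    (x1, y1, z1), (x2, y2, z2) = p1, p2
    if z1 == z2:
        return None  # line is parallel to the display plane
    t = z1 / (z1 - z2)  # parameter value at which z reaches 0
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))

# Eye at (0, 0, 2) m, fingertip at (0.1, 0.05, 1) m: the extended line
# meets the display at (0.2, 0.1).
```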
20120206339CONTROL USING MOVEMENTS - A movement of an object is recognised as a predetermined movement, by transmitting signals between transmitter-receiver pairs, which are reflected from the object. A first event is recorded for one of the transmitter-receiver pairs if a reflected signal meets a predetermined proximity criterion, and a second event is recorded for a second transmitter-receiver pair if after the first event, a subsequent reflected signal meets a predetermined proximity criterion. The first and second events are used to identify the movement.08-16-2012
20120062453GESTURE CONTROL SYSTEM - A method for controlling a device comprises: a) providing a mobile device comprising a camera; b) positioning said mobile device such that said camera acquires the image of an operator's hands; c) analyzing the movements of said operator's hands to derive a control command therefrom; and d) transmitting said control command to a controlled device.03-15-2012
20120062452METHOD AND APPARATUS FOR TEACHING A CHILD WITH AN ELECTRONIC DEVICE - A handheld electronic device that promotes learning by movement. The device includes software that guides the user through various activities via audio and visual cues. The user interacts with the handheld device with movement.03-15-2012
20120062455MOTION BASED DISPLAY MANAGEMENT - A display manager is configured to handle the drawing of windows on one or more displays for an application differently based on detected motion information that is associated with a device. The display manager may not display windows for some applications while motion is detected, while the display manager may display windows for other applications even when motion is detected. Motion enabled applications may interact with the display manager and motion information to determine how to display windows while motion is detected.03-15-2012
20120154272MOTION-BASED TRACKING - Techniques are disclosed for determining a user's motion in relation to displayed images. According to one general aspect, a first captured image is accessed. The first captured image includes (1) a first displayed image produced at a first point in time, and (2) a user. A second captured image is accessed. The second captured image includes (1) a second displayed image produced at a second point in time, and (2) the user. First information indicating motion associated with one or more objects in the first and second displayed images is accessed. Second information indicating both motion of the user and the motion associated with the one or more objects in the first and second displayed images is determined.06-21-2012
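One way to read the two pieces of information above: the second (combined) motion minus the first (object-only) motion leaves the motion attributable to the user. This vector subtraction is an illustrative assumption, not the application's stated method:

```python
def user_motion(combined, displayed):
    """Subtract the motion of the displayed objects (first information)
    from the combined user-plus-object motion (second information),
    leaving the user's own motion. Motions are (dx, dy) vectors."""
    return (combined[0] - displayed[0], combined[1] - displayed[1])

# Combined motion (5, -2) with object motion (3, -2) implies the user
# moved by (2, 0) between the two captured images.
```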
20120154267Sphere-Like Input Device - Embodiments of the invention provide a human interface device including an inner sphere, wherein the inner sphere has a center point. The human interface device can further include an outer sphere, and the outer sphere may be compressible. The human interface device may also include a plurality of pressure sensors between the inner sphere and the outer sphere for detecting localized compression of the outer sphere, a first three-axis-accelerometer located within the inner sphere, and a second three-axis-accelerometer located within the inner sphere, wherein the first three-axis-accelerometer and the second three-axis-accelerometer are each located at least a predetermined distance from the center point.06-21-2012
20120206340DISPLAY METHOD AND DISPLAY APPARATUS - To provide a display method capable of displaying, while a monitoring display of the power saving effect is being made, the original power saving effect that would be obtained if the monitoring display were not made. Provided is a display method including capturing an image in front of an image display surface included in a display apparatus that displays the image and detecting the presence of a moving body positioned in front of the image display surface; deciding a power saving amount of the display apparatus, including the power saving amount of the image display surface, by using an analysis result of the presence or absence of a human face and a detection result of the moving body in the captured image; and deriving and displaying, on the image display surface, the actual power saving amount that would be obtained when no display is made on the image display surface, while information about the power saving amount is displayed on the image display surface. Accordingly, while a monitoring display of the power saving effect is made, the original power saving effect that would be obtained without the monitoring display can be displayed.08-16-2012
20120206335AR GLASSES WITH EVENT, SENSOR, AND USER ACTION BASED DIRECT CONTROL OF EXTERNAL DEVICES WITH FEEDBACK - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content, wherein the eyepiece includes an event, sensor, and user action based direct control of external devices with feedback.08-16-2012
20120206334AR GLASSES WITH EVENT AND USER ACTION CAPTURE DEVICE CONTROL OF EXTERNAL APPLICATIONS - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content, wherein the eyepiece includes an event and user action capture device control of external applications.08-16-2012
20120154268REMOTE CONTROL SYSTEMS THAT CAN DISTINGUISH STRAY LIGHT SOURCES - Remote control systems that can distinguish predetermined light sources from stray light sources (e.g., environmental light sources and/or reflections) are provided. The predetermined light sources can be disposed in asymmetric substantially linear or two-dimensional patterns. The predetermined light sources also can be configured to exhibit signature characteristics. The predetermined light sources also can output light at different signature wavelengths. The predetermined light sources also can emit light polarized in one or more predetermined polarization axes. Remote control systems of the present invention also can include methods for adjusting an allocation of predetermined light sources and/or the technique used to distinguish the predetermined light sources from the stray light sources.06-21-2012
20120154269COORDINATE INFORMATION UPDATING DEVICE AND COORDINATE INFORMATION GENERATING DEVICE - An object can be displayed on a screen of a two-dimensional coordinate system based on xyz-coordinate values of the object in a three-dimensional coordinate system, operation information of a two-dimensional coordinate system with respect to the object can be received from an input device, and whether the operation information is in accordance with a predetermined rule or not is determined. If the operation information is not in accordance with the predetermined rule, xy-coordinate values of the object can be updated in accordance with the operation information. If the operation information is in accordance with the predetermined rule, a z-coordinate value of the object can be updated in accordance with the operation information.06-21-2012
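The branching update rule described in this abstract can be sketched as follows. This is an illustrative sketch only; the tuple representation of the object, the rule predicate being passed in as a flag, and the mapping of the 2D drag onto the z-axis are all assumptions, not taken from the application:

```python
def update_coordinates(obj, dx, dy, matches_rule):
    """Update an object's 3D coordinates from a 2D input operation.

    If the 2D operation is in accordance with the predetermined rule,
    the z-coordinate is updated; otherwise the xy-coordinates are
    updated in accordance with the operation.
    """
    x, y, z = obj
    if matches_rule:
        # Map the 2D operation onto depth (assumed: vertical drag -> z).
        return (x, y, z + dy)
    # Ordinary case: move the object in the screen plane.
    return (x + dx, y + dy, z)
```

For example, a plain drag moves the object in the plane (`update_coordinates((0, 0, 0), 2, 3, False)` gives `(2, 3, 0)`), while the same drag under the rule changes only depth.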
20120154265MOBILE TERMINAL AND METHOD OF CONTROLLING A MODE SCREEN DISPLAY THEREIN - A mobile terminal including a communication unit configured to communicate with at least one external terminal; a memory configured to store at least a first and second operating system including at least first and second modes, respectively; and a controller configured to execute the first operating system and to activate the first mode corresponding to the first operating system, to display a first information screen corresponding to the activated first mode on a display of the mobile terminal, to receive an event signal indicating an event related to the second mode has occurred on the mobile terminal, and to selectively display event information related to the event of the second mode on a display of the at least one external terminal.06-21-2012
20120154266APPARATUS AND METHOD FOR CONTROLLING DATA IN PORTABLE TERMINAL - Provided is an apparatus and method for controlling data in a portable terminal, which can control data in the portable terminal while minimizing the motion of both hands of a user of the portable terminal. An apparatus for controlling data in a portable terminal includes a plurality of sensors disposed at the sides of the portable terminal to detect data control actions, and a control unit for controlling the data according to the data control actions detected by the plurality of sensors.06-21-2012
20090027330DEVICE FOR USING VIRTUAL MOUSE AND GAMING MACHINE - A virtual mouse device is preferably installed in a gaming machine and serves as an input device. An image sensor unit is laminated on a specific area on a screen of a display unit, and detects fingers or a palm of a player that move on or over the specific area. The virtual mouse controller unit monitors the fingers or palm, and causes a virtual mouse to follow the fingers or palm within the specific area. If the fingers or palm move out of the specific area, the virtual mouse controller unit returns the virtual mouse to a default location. An input unit monitors the motion of the virtual mouse, and causes the display unit to move a mouse pointer on the screen depending on the amount and direction of travel of the virtual mouse.01-29-2009
20110074666REMOTE CONTROL SYSTEM FOR MULTI-SCREEN DISPLAY - A remote control system for a multi-screen display of an array of sub-displays tiled together contiguously includes a remote controller and a number of control boxes, each of which is installed on a corresponding sub-display. The remote controller is configured for receiving user input governing selection and control of the sub-displays, generating a control signal based upon the selection and control of the sub-displays, and transmitting the control signal. Each control box is configured for receiving the control signal and identifying whether the control signal is to be passed and transferred to a controller of the corresponding sub-display, according to the selection of the sub-displays.03-31-2011
20130009866INPUT PROCESSING APPARATUS - In a keyboard input device, a first input device and a second input device each including a stick pointer are arranged. An input control of gesture functions such as zoom-in, zoom-out, right rotation, left rotation, forward tracking, backward tracking, left tracking, right tracking, and the like may be made possible by a combination of operational directions of operation bodies of the stick pointer (SP01-10-2013
20120105312User Input Device - A user input device is described. In an embodiment the user input device is hand held and comprises a sensing strip to detect one-dimensional motion of a user's finger or thumb along the sensing strip and to detect position of a user's finger or thumb on the sensing strip. In an embodiment the sensed data is used for cursor movement and/or text input at a master device. In an example the user input device has an orientation sensor and orientation of the device influences orientation of a cursor. For example, a user may move the cursor in a straight line in the pointing direction of the cursor by sliding a finger or thumb along the sensing strip. In an example, an alphabetical scale is displayed and a user is able to zoom into the scale and select letters for text input using the sensing strip.05-03-2012
20120249417INPUT APPARATUS - An input apparatus includes a gesture sensing unit for sensing a hand gesture of a user; and an input signal generation unit for generating an input signal for control of a target electronic device based on the sensed hand gesture. The hand gesture of the user includes at least one of a pinching gesture with two or more fingers, a pointing gesture with one finger, and a scratching gesture with one finger to a camera.10-04-2012
20120249408OPTO-ELECTRONIC DISPLAY ASSEMBLY - An electronic display arrangement for taking light signals forming an image emitted from a miniature screen (10-04-2012
20110090146POSITION DETECTOR AND POSITION INDICATOR - A position detector includes a position indicator including a power source and configured to intermittently transmit a position indication signal to a tablet at a predetermined timing; and the tablet configured to detect a position on its surface pointed to by the position indicator by receiving the position indication signal. The position indicator further includes an information storing section configured to store multiple types of information; a control signal receiving circuitry configured to receive a control signal transmitted from the tablet; an information selecting circuitry configured to select one type of information from among the multiple types of information stored in the information storing section in accordance with a content of the control signal; and an information transmitting circuitry configured to transmit the one type of information selected by the information selecting circuitry to the tablet.04-21-2011
20110090145METHOD AND ELECTRONIC SYSTEM FOR MULTI-DIRECTIONAL INPUT - A multi-directional input method is applicable to an authentication process of electronic software run on an electronic system. According to the method, when the authentication process is initialized, an authentication module having a multi-directional input object and entry fields is loaded. The multi-directional input object has multiple input sides, which together constitute a polyhedron and each have multiple input units. Each input unit corresponds to at least one input option that can be selected via an input device of the electronic system to constitute the required authentication data for input into the entry fields. The authentication module verifies the validity of the input authentication data, and a next function of the electronic system is provided for use when the input authentication data is verified as valid. The multi-directional input method prevents authentication data from being illegally recorded, thereby providing enhanced security protection.04-21-2011
20110102315REMOTE CONTROL POINTING TECHNOLOGY - A pointing device (05-05-2011
20110102314Dual-screen electronic reader with tilt detection for page navigation - A dual screen electronic reader and method for navigating a document are disclosed. The electronic reader includes two display screens which can be angled to each other, like an open book, for viewing by a person reading the document. The electronic reader includes a tilt detection system for detecting tilting of the electronic reader that indicates the user has finished reading the page on the first screen and has pivoted the electronic reader to view the opposite screen. This causes the electronic reader to load a fresh page on the first screen, optionally after a short time delay, which allows counter-rotational tilting to be taken into consideration.05-05-2011
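The tilt-then-delay behavior described in this abstract (load a fresh page only after the tilt persists for a short delay, so a counter-rotational tilt can cancel the load) can be sketched as a small state machine. The angle threshold, delay value, and sample interface are assumptions for illustration:

```python
class TiltPager:
    """Sketch of tilt-driven page loading; thresholds are assumed values."""

    def __init__(self, threshold_deg=30.0, delay_s=0.5):
        self.threshold = threshold_deg
        self.delay = delay_s
        self.pending_since = None  # time the qualifying tilt was first seen

    def on_tilt(self, angle_deg, t):
        """Feed one tilt sample; return 'load_page' when a fresh page is due."""
        if angle_deg < self.threshold:
            # Counter-rotational tilting cancels the pending page load.
            self.pending_since = None
            return None
        if self.pending_since is None:
            self.pending_since = t         # start the short time delay
            return None
        if t - self.pending_since >= self.delay:
            self.pending_since = None
            return "load_page"             # pivot confirmed: load fresh page
        return None
```

A tilt that reverses within the delay window produces no page load; one that persists does.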
20110102317SEMICONDUCTOR INTEGRATED CIRCUIT DEVICE, FACILITY APPLIANCE CONTROL DEVICE, AND APPLIANCE STATE DISPLAY APPARATUS - An application program changes a property value of a graphic object arranged in an object database. An object manager reads out the property value from the object database and then issues a drawing command. A graphics engine executes the drawing command to configure a memory image of the graphic object on a VRAM to display the image on a liquid crystal display via an LCDC.05-05-2011
20110102316Extensible User Interface For Digital Display Devices - In one embodiment, a system to manage video content comprises an index file management module comprising logic to generate an index file to describe content in an associated video file, store the index file for a video file in a first memory location, separate from a second memory location in which the video file is stored, receive, from a requesting entity, a request for access to the index file, in response to the request, download the index file to the requesting entity, and download the video file to the requesting entity.05-05-2011
20120119985METHOD FOR USER GESTURE RECOGNITION IN MULTIMEDIA DEVICE AND MULTIMEDIA DEVICE THEREOF - A display device includes a first sensor to detect a first image of a person and a second sensor to detect a second image of the person. A storage device stores first information and second information, where the first information identifies a plurality of gestures mapped to respective ones of a plurality of functions identified by the second information. A processor recognizes a gesture of the person based on the first and second images, and performs a function corresponding to the recognized gesture based on the first and second information.05-17-2012
20120119988IMAGE RECOGNITION APPARATUS, OPERATION DETERMINING METHOD AND COMPUTER-READABLE MEDIUM - An accurate determination of an operation is made possible. Data photographed by a video camera is read by an image reading unit, and an image of an operator is extracted from the data by an image extracting unit. As a result of such preparation, a virtual operation screen and an operation region are created based upon the extracted image of the operator. For an adult operator, the operation region can be created in consideration of height (position of the sight line) and arm length; for a child, since the height is lower and the arms are shorter, the operation region can be set to match.05-17-2012
20120119987METHOD AND APPARATUS FOR PERFORMING GESTURE RECOGNITION USING OBJECT IN MULTIMEDIA DEVICES - According to an embodiment of the present invention, a gesture recognition method for use in a multimedia device includes capturing, via an image sensing unit of the multimedia device, a peripheral image, recognizing a first object contained in the captured peripheral image and a gesture made using the first object, mapping a multimedia device operation to the gesture, and entering into an input standby mode associated with the gesture.05-17-2012
20120119986THREE-DIMENSIONAL CONTROL APPARATUS OF COMPUTER INPUT DEVICE AND METHOD THEREOF - A three-dimensional (3D) control apparatus for a computer input device, used to sense 3D motion relations to output multiple 3D control signals, includes a first element comprising a plurality of sensing units and a second element comprising a plurality of transmitting coils. Each sensing unit of the first element has two adjacent receiving coils. The second element is capable of performing a 3D multi-axial motion with respect to the first element to generate multiple 3D control signals according to relative position relations between the plurality of transmitting coils and the plurality of receiving coils, so as to perform 3D control.05-17-2012
20120119984HAND POSE RECOGNITION - Hand pose recognition comprises determining an initial hand pose estimate for a captured input hand pose and performing iterations based upon hand pose estimates and residues between such estimates and hand pose estimates. One or more control signals are generated based upon the hand pose recognition.05-17-2012
20100245235ELECTRONIC DEVICE WITH VIRTUAL KEYBOARD FUNCTION - An electronic device includes a body, a rotatable member, a projecting unit configured for projecting an image, a sensing unit and a processing unit. The body includes a connecting end, and a plurality of conductive contacts is formed at the connecting end. The rotatable member is rotatably connected to the connecting end. The rotatable member includes a conductive ball configured for being electrically connected to one of the plurality of conductive contacts, and the plurality of conductive contacts are formed along a displacement path of the conductive ball. The sensing unit is configured for sensing interactions at specific locations corresponding to the projected image. The processing unit stores a plurality of images, and is configured for controlling the projecting unit to project a corresponding image according to the conductive contact which is electrically connected to the conductive ball.09-30-2010
20100245230HANDWRITING RECOGNITION IN ELECTRONIC DEVICES - The present invention provides a method of inputting characters into a handheld device, comprising steps of: reading handwriting information; recognizing said handwriting information in one active recognition mode and at least one inactive recognition mode; displaying at least one character candidate obtained in said active recognition mode and at least one character candidate obtained in said at least one inactive recognition mode; and inputting into said handheld device a desired character candidate selected by a user among said character candidates being displayed. The present invention also provides a corresponding apparatus for inputting characters into a handheld device, and a related handheld device. A user no longer needs to designate handwriting recognition modes, and recognition accuracy is greatly improved.09-30-2010
20100245232Handheld Computer Interface With Haptic Feedback - A handheld computer interface includes an enclosure, a mass coupled to the enclosure, and an actuator coupled to the mass to change a position of the mass relative to the enclosure. When the actuator receives a signal indicating a change in the center of mass of the interface, it changes the position of the mass.09-30-2010
20100289739STORAGE MEDIUM STORING INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD - A game apparatus saves, in accordance with an instruction of a user, a photographed image taken by an inward camera or an outward camera, or a handwriting image of a handwritten note inputted using a touch panel. Moreover, in accordance with an instruction of the user, position information is saved together with the photographed image or the handwriting image. That is, a photographing place or a creation place of the handwritten note is registered on a map. When the photographed image or the handwriting image is reproduced, an image of a landmark set near the position on the map indicated by the position information is reproduced before the photographed image or the handwriting image.11-18-2010
20100289738Stone, Portable Hand Held Device for Inputting Characters Into a Computer, Cell Phone, or any Programmable Device - This is an input device that allows any type of character to be quickly and easily entered into any computing device with one hand, while being held by the same hand. The characters are arranged in arrays that can be displayed in a format of rows and columns. The user maneuvers the cursor through the tables using wheel-type directional switches that can index continuously if desired. Normally, the thumb moves one wheel, or a ball, which can move the cursor over both rows and columns, and the index finger moves the other wheel or ball, which changes the characters in the rows or columns, allowing access to an unlimited number of characters. When the user arrives at the desired character, a switch controlled by one of the other fingers on the same hand enters that character into the device. Double clicking, or pressing the switch quickly twice, can make necessary adjustments to the character as desired, such as capitalizing a letter or adding an umlaut. This device can also be used for accessing window-style icons, allowing the user to navigate through the operating system.11-18-2010
20100289737PORTABLE ELECTRONIC APPARATUS, OPERATION DETECTING METHOD FOR THE PORTABLE ELECTRONIC APPARATUS, AND CONTROL METHOD FOR THE PORTABLE ELECTRONIC APPARATUS - A portable electronic apparatus includes; plural sensor elements L11-18-2010
20100245241Apparatus and method for controlling functions of mobile terminal - A portable terminal includes an apparatus for controlling an operation of a mobile terminal. More particularly, an apparatus and a method set a menu to control a mobile terminal in a specific region of an image stored in advance. A relevant function is performed by selecting the specific region of the image in which the menu has been set, thereby performing the relevant function quickly and conveniently without separately selecting (entering) a menu. The apparatus includes a controller. The controller selects a specific region from an output image, sets a menu for controlling a function of the mobile terminal at the specific region, and stores the image where the menu has been set.09-30-2010
20100245240Electronic Device with a Display Unit Being Movable in Relation to a Base Unit - The invention relates to an electronic device and a method for controlling functionality of such an electronic device in relation to a display unit. The electronic device includes a base unit (09-30-2010
20100245231INPUT DEVICE - Provided is an input device with excellent operability: the finger need not be released from a key during operations that require the finger to shift, and the finger is prevented from touching a plurality of keys at the same time. A key (09-30-2010
20100245239PRESSURE SENSING CONTROLLER - Embodiments of a pressure sensing controller implement grip and pressure sensing, as well as standard input control actuation, to provide control input by a user. The disclosed grip and pressure sensing control can be implemented in hand-held game controllers, control devices for appliances, cellular telephones, and any other type of devices that require control input. In the case of an existing control device with predefined control output, user programming of input settings to define command extensions allows extended gripping and pressure control input to be combined within the capable existing control outputs of the device.09-30-2010
20100245238INPUT DEVICE AND METHOD, INFORMATION PROCESSING DEVICE AND METHOD, INFORMATION PROCESSING SYSTEM, AND PROGRAM - An input device includes an operating unit that a user grasps and operates in a three-dimensional free space in order to remotely operate an information processing device; and a transmitting unit to transmit a signal for a first gesture in the free space of the operating unit to set a mode, and a signal for a second gesture in the free space of the operating unit which differs from the first gesture to execute processing in the mode set based on the first gesture.09-30-2010
20100245236COMPUTER-READABLE STORAGE MEDIUM AND INFORMATION PROCESSING APPARATUS - When a gravity center position enters a right input area, a motion of a character in a virtual game space pushing down the right pedal of a bicycle is started, and the bicycle is accelerated. Further, a reference position is moved so as to gradually approach the gravity center position, and the boundary between a non-input area and a left input area is also moved so as to gradually approach the gravity center position in conjunction with the reference position. When the gravity center position enters the left input area, a motion of the character in the virtual game space pushing down the left pedal of the bicycle is started, and the bicycle is accelerated. Further, the reference position is moved so as to gradually approach the gravity center position, and the boundary between the non-input area and the right input area is also moved so as to gradually approach the gravity center position in conjunction with the reference position.09-30-2010
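The moving-boundary behavior in the abstract above (the reference position, and with it the boundaries of the input areas, gradually approaches the gravity-center position after each pedal stroke) can be sketched in one dimension. The gap width and approach rate are assumed values, not from the application:

```python
def pedal_step(gx, ref, gap=0.2, approach=0.25):
    """One update of the gravity-center pedaling input.

    gx  -- current gravity-center position (1-D sketch)
    ref -- current reference position; the left/right input-area
           boundaries sit at ref -/+ gap

    Returns (pedal, new_ref), where pedal is 'right', 'left', or None.
    """
    if gx > ref + gap:
        pedal = "right"            # right input area: push right pedal
    elif gx < ref - gap:
        pedal = "left"             # left input area: push left pedal
    else:
        return (None, ref)         # non-input area: nothing happens
    # Move the reference (and hence the boundaries) gradually toward
    # the gravity-center position, as described in the abstract.
    ref = ref + approach * (gx - ref)
    return (pedal, ref)
```

Alternating weight shifts thus trigger alternating pedal strokes while the dead zone tracks the user's stance.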
20100245233MOVING AN OBJECT WITHIN A VIRTUAL ENVIRONMENT - A method of moving an object within a virtual environment, the method comprising: providing a reference line within the virtual environment; based on a position of the object relative to the reference line, determining a target position for the object within the virtual environment; and controlling the object so as to guide the object within the virtual environment towards the target position.09-30-2010
20120313850DISPLAY APPARATUS - A display apparatus includes: a light emitting unit which emits light; a light scanning unit which includes a light reflector and scans the light in first and second directions; an amplitude changing unit which changes an amplitude of the swing of the light reflector; and a light emitting control unit which adjusts an amount of light of the light emitting unit, wherein the amplitude changing unit allows switching between a first state where the light is scanned in a first region of the display surface and a second state where the light is scanned over the first region and a second region of the display surface, and wherein the light emitting control unit allows the amount of light per unit area of the first region in the first state and the amount of light per unit area of the first region in the second state to be equalized.12-13-2012
20120162064COORDINATE SENSOR AND DISPLAY DEVICE - The amount of light emitted from a light-emitting diode is changed from that of the initial condition so as to increase, among the light sensors that are not shielded from the light emitted from the light-emitting diode, the number of light sensors at which the difference (C−D) between the light detection amount when the light-emitting diode is ON and the light detection amount when it is OFF is equal to the difference (A−B) between the corresponding ON and OFF detection amounts in the initial condition, measured with no detection object present. As a result, regardless of the presence or absence of the detection object, the coordinate sensor and the display device disclosed can establish a threshold for determining the presence or absence of the detection object, and can determine the coordinates of the detection object in a stable manner regardless of changes in the ambient environmental light or temperature, fluctuations in the amount of light emitted from the light-emitting element disposed in the coordinate sensor, or changes in the sensitivity of the light-receiving elements.06-28-2012
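The calibration criterion in this abstract, matching the current on/off difference (C−D) against the initial-condition difference (A−B) at as many unshielded sensors as possible, can be sketched as a search over candidate LED drive levels. The measurement interface and tolerance are assumptions:

```python
def best_led_level(levels, measure, baseline, tol=1e-6):
    """Pick the LED drive level whose per-sensor on/off difference
    (C - D) matches the initial-condition difference (A - B) at the
    largest number of unshielded sensors.

    levels   -- candidate drive levels to try
    measure  -- measure(level) -> list of (on - off) differences, one
                per unshielded sensor (assumed callback)
    baseline -- list of initial-condition (A - B) differences
    """
    def matches(diffs):
        # Count sensors where C - D agrees with A - B within tolerance.
        return sum(abs(c - a) <= tol for c, a in zip(diffs, baseline))

    return max(levels, key=lambda lv: matches(measure(lv)))
```

With the differences restored to their baseline, the same detection threshold remains valid despite ambient-light or temperature drift.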
20120127072CONTROL METHOD USING VOICE AND GESTURE IN MULTIMEDIA DEVICE AND MULTIMEDIA DEVICE THEREOF - A multimedia device and a method for controlling the same are disclosed, in which the voice and gesture of a user are recognized by the multimedia device to allow the user to execute a desired operation. The method includes enabling input of a gesture and a voice through a remote controller; receiving the user's gesture and voice through the remote controller; identifying a first command associated with the received gesture; identifying a second command associated with the received voice; comparing the first command and the second command to each other; and performing a function associated with the first or second command when the comparing step indicates that the first command corresponds to the second command. The multimedia device thus executes the operation desired by the user.05-24-2012
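The agreement check between the gesture-derived and voice-derived commands can be sketched as below. The command maps and names are illustrative assumptions; the patent does not specify them:

```python
def resolve_command(gesture, voice, gesture_map, voice_map):
    """Return the command to execute only when the first command
    (from the gesture) and the second command (from the voice) agree;
    otherwise return None and perform nothing.
    """
    first = gesture_map.get(gesture)    # command associated with gesture
    second = voice_map.get(voice)       # command associated with voice
    if first is not None and first == second:
        return first
    return None
```

Requiring both modalities to agree acts as a simple confirmation step against misrecognition of either input alone.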
20120127071Haptic Feedback to Abnormal Computing Events - A computer-implemented tactile feedback method includes receiving user input on a computing device, identifying a term input by the user that does not match a term known to the device, accessing an auto-correction service in order to provide a replacement for the term, and energizing a haptic feedback device in response to identifying the term input by the user that does not match a known term.05-24-2012
20120127070CONTROL SIGNAL INPUT DEVICE AND METHOD USING POSTURE RECOGNITION - Provided are a control signal input device and method using posture recognition. More particularly, the present invention relates to a control signal input device including: a database unit storing predetermined system control commands corresponding to postures of combinations of one or more of an arm, a wrist, and fingers of a user; a sensing unit sensing a posture of a combination of the arm, wrist, and fingers of the user; and a control signal generating unit extracting a system control command corresponding to the sensed result of the sensing unit from the database unit and generating a control signal for controlling the system, and a control signal input method using the same.05-24-2012
20120127069Input Panel on a Display Device - A device including a sensor to detect a holding position of a user of the device, an orientation sensor to detect an orientation of the device, and a controller to render an input panel on at least one location of a display device based on the holding position of the user and the orientation of the device.05-24-2012
20120162065SKELETAL JOINT RECOGNITION AND TRACKING SYSTEM - A system and method are disclosed for recognizing and tracking a user's skeletal joints with a NUI system and further, for recognizing and tracking only some skeletal joints, such as for example a user's upper body. The system may include a limb identification engine which may use various methods to evaluate, identify and track positions of body parts of one or more users in a scene. In examples, further processing efficiency may be achieved by segmenting the field of view in smaller zones, and focusing on one zone at a time. Moreover, each zone may have its own set of predefined gestures which are recognized.06-28-2012
20120235898ELECTRONIC DEVICE AND COMMUNICATION DEVICE - There is provided an electronic device which includes a display device, a case configured to retain the display device, a stopper disposed between the retained display device and the case, and a component arranged in a space formed by disposing the stopper. Further, there is provided an electronic device which includes a display device, a case including a first space for retaining the display device, the display device being operable to be slid into the first space, a stopper disposed in a second space between the retained display device and the case, and an electronic device arranged in a third space formed by arranging the stopper.09-20-2012
20120162066METHODS AND SYSTEMS FOR PROVIDING SENSORY INFORMATION TO DEVICES AND PERIPHERALS - Peripherals and data processing systems are disclosed which can be configured to interact based upon sensor data. In at least certain embodiments, a method for sensing motion and orientation information for a device includes receiving a motion event from at least one sensor located in a device. The method further includes determining an orientation for a display of the device. The method further includes determining whether the device is currently moving. The method further includes determining whether the device moves within an angle with respect to a ground reference for a first time period. The method further includes switching the orientation of the display of the device if the device moves in excess of the angle.06-28-2012
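The orientation-switch condition in the abstract above (the device is moving and exceeds an angle with respect to the ground reference for a first time period) can be sketched over a stream of sensor samples. The sample format and threshold values are assumptions:

```python
def should_switch(samples, angle_threshold=45.0, min_duration=0.5):
    """Decide whether to switch the display orientation.

    samples -- iterable of (t, angle_deg, moving) tuples, where t is a
               timestamp, angle_deg the angle relative to the ground
               reference, and moving whether motion was sensed.

    Returns True once the device has been moving in excess of the
    angle for at least min_duration seconds.
    """
    start = None  # time the condition first became true
    for t, angle, moving in samples:
        if moving and angle > angle_threshold:
            if start is None:
                start = t
            if t - start >= min_duration:
                return True
        else:
            start = None  # condition broken: restart the timer
    return False
```

The duration requirement filters out brief transients so the display does not flip on every jostle.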
20120162063INFORMATION DISPLAY APPARATUS, INFORMATION DISPLAY METHOD, AND STORAGE MEDIUM STORING PROGRAM FOR DISPLAYING INFORMATION - An information display method includes inputting a designated point in each of the series of image data items of the image file, sequentially designated as the analysis target point by a user's manipulation, every time each of the series of image data items of the image file is sequentially displayed on the display screen; displaying each of the series of designated points, which are sequentially input, at each input position; sequentially displaying each of the series of image data items stored in the designated image file along with each of the series of designated points while each of the series of designated points is displayed; and displaying a designated point after a change in a display mode different from that of a designated point before the change, every time a transition direction of each of the series of designated points changes.06-28-2012
20120162062ELECTRONIC BOOK READING DEVICE WITH AIRFLOW SENSOR - An electronic book reading device includes a housing, an airflow sensor and a controller. The airflow sensor includes a turbine, a shaft fixed to the turbine and a tachometer. The turbine is configured for sensing an airflow and rotating with and about the shaft. The tachometer is connected to the shaft and configured for detecting a rotation speed of the turbine. The controller is configured for determining whether the detected rotation speed of the turbine falls in a predetermined range, and controlling the electronic book reading device to execute a predetermined operation if the detected rotation speed of the turbine falls in the predetermined range.06-28-2012
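The airflow trigger in this abstract (execute a predetermined operation when the turbine's detected rotation speed falls in a predetermined range) can be sketched by converting tachometer pulses to RPM and range-checking. The pulses-per-revolution figure and the RPM range are assumptions; the triggered operation (e.g., a page turn) is likewise assumed:

```python
def turbine_rpm(pulse_count, window_s, pulses_per_rev=1):
    """Estimate turbine rotation speed (RPM) from tachometer pulses
    counted over a measurement window of window_s seconds."""
    revs = pulse_count / pulses_per_rev
    return revs / window_s * 60.0

def airflow_triggers(pulse_count, window_s, rpm_range=(120.0, 600.0)):
    """Return True when the sensed airflow spins the turbine within
    the predetermined speed range, so the reader executes its
    predetermined operation."""
    lo, hi = rpm_range
    return lo <= turbine_rpm(pulse_count, window_s) <= hi
```

The lower bound rejects incidental drafts; the upper bound rejects implausibly fast readings such as sensor noise.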
20120162061ACTIVATION OBJECTS FOR INTERACTIVE SYSTEMS - An interactive system can include a display device, a control panel, a projector, a processing device, and an input device. The input device can detect indicia of its posture relative to the control panel or a display surface of the display device, so that the interactive system can recognize interactions between the input device and these components. Interactions between the input device and the display surface can be captured, analyzed by the processing device, and then represented in an image projected onto the display surface. The control panel can comprise a plurality of non-projected, tactile activation objects, each of which can correspond to an activity or function of the interactive system, such as, for example, powering on the projector. When the interactive system detects an interaction between the input device and an activation object, the activity corresponding to the selected activation object can be performed.06-28-2012
20120162060LASER NAVIGATION MODULE - Disclosed herein is a laser navigation module, including: a light source radiating a laser beam; a housing provided with a window that transmits and reflects the laser beam radiated from the light source while shielding visible rays, and provided with a transparent or translucent part that radiates light from the inside of the housing to the outside; a lighting device mounted in the housing; and a light diffusing member that transfers the light radiated from the lighting device and has a groove part formed at an edge on the side opposite the lighting device.06-28-2012
20120162058SYSTEMS AND METHODS FOR SHARED DISPLAY IN A HYBRID ENVIRONMENT - Embodiments for operating shared peripherals in a hybrid computing system are described. Embodiments control one or more shared peripheral devices variously between a primary system and a secondary system via one or more communication links, where the secondary system is detachable from the primary system, operating as an independent computing device in the disconnected state and as a display device in the connected state.06-28-2012
20110181504Electronic Device - Provided is an electronic device which easily selects and executes an application relating to characters inputted by a user. A cellular phone (07-28-2011
20110181503REPRODUCTION DEVICE, REPRODUCTION SYSTEM AND NON-TRANSITORY COMPUTER-READABLE STORAGE MEDIUM - A reproduction device comprises an operation input unit to which an operation command is input, a communication unit that connects with another reproduction device via a communication network, and a reproduction unit that reproduces data. Furthermore, when an operation command specifying the data to be coordinated and reproduced by each of the other reproduction device and the reproduction unit is input to the operation input unit, a control unit of the reproduction device transmits a coordinated reproduction command for coordinating and reproducing the specified data to the other reproduction device via the communication unit, acquires the specified data to be reproduced, and causes the reproduction unit to reproduce the acquired data.07-28-2011
20120313849DISPLAY APPARATUS AND METHOD FOR EXECUTING LINK AND METHOD FOR RECOGNIZING VOICE THEREOF - A display apparatus and a method for executing a link and a method for recognizing a voice thereof are provided. The method for executing a link of the display apparatus includes displaying a user interface, determining a text included in a link included in the user interface, displaying the text determined in the link so as to be distinguished from other texts, recognizing a voice input from a user, and, if the voice uttered by the user matches the text determined in the link, executing the link associated with the matching text. Accordingly, the possibility of misrecognizing the user's voice input is reduced, and the user can control the display apparatus through more accurate voice recognition.12-13-2012
20120313851INFORMATION PROCESSING APPARATUS AND PROGRAM - An information processing apparatus includes an imaging unit, a display, a detection unit, and an image generation unit. The imaging unit is configured to capture an image to acquire a captured image. The display has a display surface that faces in the same direction as an imaging direction of the imaging unit. The detection unit is configured to perform image processing on the captured image to detect a face region that is a region of a face of a user in the captured image. The image generation unit is configured to generate a display image displayed on the display based on a result of the detection by the detection unit.12-13-2012
20120313848Three Dimensional User Interface Session Control - A method, including receiving, by a computer executing a non-tactile three dimensional (3D) user interface, a set of multiple 3D coordinates representing a gesture by a hand positioned within a field of view of a sensing device coupled to the computer, the gesture including a first motion in a first direction along a selected axis in space, followed by a second motion in a second direction, opposite to the first direction, along the selected axis. Upon detecting completion of the gesture, the non-tactile 3D user interface is transitioned from a first state to a second state.12-13-2012
20120313847METHOD AND APPARATUS FOR CONTEXTUAL GESTURE RECOGNITION - Methods, apparatuses and computer program products are provided for facilitating interaction via motion gestures. A method may include receiving an indication of at least one motion gesture made with a device. The method may further include determining a contextual state of a device. The method may additionally include determining, by a processor, a relationship between the at least one motion gesture and each of a plurality of predefined motion gestures and causing, based at least in part on the determined relationship, the device to perform an action associated with a respective predefined gesture. Corresponding apparatuses and computer program products are also provided.12-13-2012
20120133579GESTURE RECOGNITION MANAGEMENT - A system and method for managing the recognition and processing of gestures. A system provides a mechanism to detect conflicts between gesture recognizers and resolve the conflicts. A runtime system receives notifications from gesture recognizers in the form of requests for resources or actions. A conflict detector determines whether a conflict with another gesture recognizer exists. If a conflict exists, a conflict resolver determines a resolution. This may include determining a winning gesture recognizer and deactivating the losing gesture recognizers. A design time system statically validates gesture recognizers based on static state machines corresponding to each gesture recognizer.05-31-2012
20120133581HUMAN-COMPUTER INTERACTION DEVICE AND AN APPARATUS AND METHOD FOR APPLYING THE DEVICE INTO A VIRTUAL WORLD - A human-computer interaction device and an apparatus and method for applying the device in a virtual world. The human-computer interaction device is disposed with a sensing device thereon, the sensing device including a manipulation part and a distance sensor. The manipulation part receives a manipulation action of a user's finger; the distance sensor senses a distance of the manipulation part relative to a fixed location and generates a distance signal characterizing the manipulation action. A virtual world assistant apparatus and a method corresponding to the assistant apparatus are also provided. With the invention, multiple manipulation signals can be sensed, and free control of an avatar's actions can be realized by using the multiple signals.05-31-2012
20120133580SYSTEM AND METHOD FOR GESTURE INTERFACE CONTROL - A method is provided in one example and includes generating a histogram associated with at least one object; receiving image data; comparing the image data to the histogram in order to determine if at least a portion of the image data corresponds to the histogram; identifying a pose associated with the object; and triggering an electronic command associated with the pose. In more particular embodiments, the image data is evaluated in order to analyze sequences of poses associated with a gesture that signals the electronic command to be performed.05-31-2012
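The "compare the image data to the histogram" step in the abstract above can be illustrated with a minimal sketch, assuming a simple fixed-bin intensity histogram and histogram-intersection similarity; the bin count, value range, threshold, and all names are hypothetical, not taken from the application.

```python
def normalized_histogram(values, bins=8, lo=0.0, hi=256.0):
    """Build a normalized fixed-bin histogram over [lo, hi)."""
    counts = [0] * bins
    width = (hi - lo) / bins
    for v in values:
        idx = min(int((v - lo) / width), bins - 1)
        counts[idx] += 1
    total = float(len(values)) or 1.0
    return [c / total for c in counts]

def histogram_intersection(h1, h2):
    """Similarity in [0, 1] between two normalized histograms."""
    return sum(min(a, b) for a, b in zip(h1, h2))

def matches(image_values, reference_hist, threshold=0.7):
    """True when the image data's histogram is close enough to the
    reference histogram associated with the tracked object."""
    h = normalized_histogram(image_values)
    return histogram_intersection(h, reference_hist) >= threshold
```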
20120162057SENSING USER INPUT USING THE BODY AS AN ANTENNA - A human input system is described herein that provides an interaction modality that utilizes the human body as an antenna to receive electromagnetic noise that exists in various environments. By observing the properties of the noise picked up by the body, the system can infer human input on and around existing surfaces and objects. Home power lines have been shown to be a relatively good transmitting antenna that creates a particularly noisy environment. The human input system leverages the body as a receiving antenna and electromagnetic noise modulation for gestural interaction. It is possible to robustly recognize touched locations on an uninstrumented home wall using no specialized sensors. The receiving device for which the human body is the antenna can be built into common, widely available electronics, such as mobile phones or other devices the user is likely to commonly carry.06-28-2012
20120299825MOBILE DEVICE AND DISPLAY CONTROL METHOD - According to an aspect, a mobile device includes a housing, a display unit, a storage unit, and a control unit. The housing is configured to be switched to an opened state or a closed state. The display unit displays a standby screen. The display unit is exposed in any state of the opened state and the closed state of the housing. The storage unit stores a first object and a second object to be displayed on the standby screen. The control unit causes the first object to be displayed in a first state where the standby screen is displayed in the closed state, and the second object to be displayed in a second state where the standby screen is displayed in the opened state.11-29-2012
20120299824INFORMATION PROCESSING DEVICE, PORTABLE DEVICE AND INFORMATION PROCESSING SYSTEM - To take security into account and increase user friendliness, an information processing device includes: an input unit to which information is input; an extracting unit extracting predetermined words from the information input to the input unit; a classifying unit classifying the words extracted by the extracting unit into first words and second words; and a converting unit converting the first words by a first conversion method and converting the second words by a second conversion method, the second conversion method being different from the first conversion method.11-29-2012
20120299823PROJECTION CONTROLLING APPARATUS AND PROJECTION CONTROLLING METHOD - According to an aspect, a projection controlling apparatus for causing a projector to project an image includes a detector, an extractor, and a projection unit. The detector detects a point directed to a first image projected by the projector. The extractor extracts a first object contained in the first image based on the point. The projection unit causes the projector to project a second image that is an image with a pointing index added to the first object in the first image.11-29-2012
20120299822Communication and Device Control System Based on Multi-Frequency, Multi-Phase Encoded Visual Evoked Brain Waves - A driving control system is actuated by visually evoked brain waves induced by a multi-frequency and multi-phase encoder. The driving control system includes an optical flash generating device, a brain wave signal measurement device, a signal processing and analyzing device, and a control device. The brain wave signal measurement device is configured for measuring a steady-state visual evoked response (SSVER) signal induced by a user gazing at the flash light source generated by the optical flash generating device. The signal processing and analyzing device is configured for calculating the frequency parameter and the phase parameter of the SSVER signal by a mathematical method, and for analyzing whether those parameters are the same as the optical flash generating device's parameters so as to generate a judgment result. The control device generates a control command according to the judgment result for controlling at least one peripheral device.11-29-2012
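The frequency/phase comparison described in the abstract above can be sketched with a single-frequency DFT: estimate the measured signal's amplitude and phase at the stimulus frequency, then test both against the flash source's parameters. The sample rate, thresholds, and function names are illustrative assumptions, not the patented method.

```python
import cmath
import math

def dft_bin(signal, fs, freq):
    """Complex amplitude of `signal` (sampled at `fs` Hz) at `freq` Hz."""
    n = len(signal)
    acc = sum(s * cmath.exp(-2j * math.pi * freq * k / fs)
              for k, s in enumerate(signal))
    return acc / n

def matches_stimulus(signal, fs, stim_freq, stim_phase,
                     amp_thresh=0.5, phase_tol=0.5):
    """True when the signal carries enough power at the stimulus
    frequency AND its phase agrees with the stimulus phase (radians)."""
    c = dft_bin(signal, fs, stim_freq)
    amp = 2 * abs(c)                    # peak amplitude of the sinusoid
    phase = cmath.phase(c)
    dphi = abs((phase - stim_phase + math.pi) % (2 * math.pi) - math.pi)
    return amp >= amp_thresh and dphi <= phase_tol
```

With a 12 Hz stimulus at phase 0.3 rad and a clean 12 Hz response, the check passes; a mismatched phase or frequency makes it fail.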
20120299821FLEXIBLE FINGERPRINT SENSOR - A flexible pressure sensor has a first set of substantially parallel conductors in the x direction, a second set of substantially parallel conductors in the y direction, and a composite material disposed between the first set and second set of conductors. The composite material is capable of returning to substantially its original dimensions on release of pressure. The composite material includes conductive particles, at least partially embedded in an elastomeric layer, that have no relative orientation and are disposed within the elastomeric layer for electrically connecting the first set and second set of conductors in the z direction under application of sufficient pressure therebetween.11-29-2012
20120299820TOUCHLESS INTERFACES - The shape or position of an object is estimated using a device comprising one or more transmitters and one or more receivers, forming a set of at least two transmitter-receiver combinations. Signals are transmitted from the transmitters, through air, to the object. They are reflected by the object and received by the receivers. A subset of the transmitter-receiver combinations which give rise to a received signal meeting a predetermined clarity criterion is determined. The positions of points on the object are estimated using substantially only signals from the subset of combinations.11-29-2012
20120299819SENSOR IMAGE DISPLAY DEVICE AND METHOD - This disclosure provides a sensor image display device, which includes a display unit configured to display a sensor image generated based on a signal acquired by a sensor, a selection user interface for allowing a user to select a partial area from an area where the sensor image is displayed, and a sensor image generator for generating the sensor image by performing different signal processing for the selected area and one or more areas other than the selected area.11-29-2012
20120299818MEDICAL INFORMATION DISPLAY APPARATUS, OPERATION METHOD OF THE SAME AND MEDICAL INFORMATION DISPLAY PROGRAM - One of hierarchical images constituted by an appearance image representing an appearance of a subject body and a plurality of gradually magnified anatomy images schematically representing anatomical structures present inside of the body is displayed on a display unit. If a structure present at a desired position in the displayed image on the display unit specified by the specification input corresponds to a target examined region, the displayed image is switched to the examination information associated with the target examined region to display the examination information.11-29-2012
20120299817Systems and Methods of Image Processing that Adjust for Viewer Position, Screen Size and Viewing Distance - Several embodiments of image processing systems and methods are disclosed herein whereby at least one characteristic of an image displayed on a target display is changed according to information regarding the viewer's position—e.g. distance to the target display, the visual angle the target display's subtends of the viewer's field of view. In one embodiment, luminance and/or contrast may be changed depending on the information regarding viewer's position relative to the target display.11-29-2012
20120299816HYBRID DISPLAY APPARATUS AND DISPLAY METHOD THEREOF - A hybrid display apparatus and a display method thereof are provided. In the apparatus, an emissive type display panel outputs an internal light to the outside and thereby displays data. A reflective type display panel passes the internal light to the outside, reflects an external light, and thereby displays the data. An intermediate film layer is interposed between the emissive type display panel and the reflective type display panel. The intermediate film layer passes the internal light from the emissive type display panel to the reflective type display panel, and also blocks the external light that passes through the reflective type display panel and is reflected at the emissive type display panel.11-29-2012
20120299815DISPLAY DEVICE AND METHOD FOR REMOTELY CONTROLLING DISPLAY DEVICE - A display device and a method for remotely controlling the display device are disclosed. A controller executes a first application and a second application. A display displays a main window, which displays an execution screen of the executed first application, and a first sub window which displays an execution screen of the executed second application, on a screen. A receiver receives a focus switching signal from a first remote controller and a coupling signal containing an identifier from a second remote controller. The controller switches a focused window based on the received focus switching signal and implements coupling between the focused window and the second remote controller in response to the received coupling signal.11-29-2012
20120299814MOBILE TERMINAL AND MODE CONTROLLING METHOD THEREIN - A mobile terminal including a communication unit; a memory configured to store at least first and second operating systems including at least first and second modes, respectively; and a controller configured to display, in a first display region of a display unit of the mobile terminal, a first application indicator corresponding to a first application executable in the first mode using the first operating system and that can be activated by selecting the first application indicator, to display, in a second display region, a second application indicator corresponding to a second application executable in the second mode using the second operating system and that can be activated by selecting the second application indicator. Further, the first and second application indicators indicate whether the applications are executable in the first mode or the second mode, or executable in both the first and second modes.11-29-2012
20120299813METHOD AND APPARATUS FOR INPUTTING USER COMMANDS USING RELATIVE MOVEMENTS OF DEVICE PANELS - A method and apparatus for inputting various operation instructions to a device including two movable panels. The method includes determining whether a relative angle between the first panel and the second panel is within an effective angle range; determining whether the relative angle within the effective angle range is maintained during an effective time; and inputting an operation instruction to the device based on whether the relative angle between the first panel and the second panel is within the effective angle range and whether the relative angle within the effective angle range is maintained during the effective time.11-29-2012
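The two-part condition in the abstract above (relative angle inside an effective range, and that angle maintained for an effective time) can be sketched as a small stateful detector fed with timestamped angle samples; the class name, bounds, and durations are hypothetical.

```python
class PanelGestureDetector:
    """Fires once the relative panel angle stays inside the effective
    range for at least `effective_time` (same units as timestamps)."""

    def __init__(self, angle_lo, angle_hi, effective_time):
        self.angle_lo = angle_lo
        self.angle_hi = angle_hi
        self.effective_time = effective_time
        self._entered_at = None  # when the angle last entered the range

    def update(self, angle, timestamp):
        """Feed one (angle, timestamp) sample; return True when the
        operation instruction should be input to the device."""
        if self.angle_lo <= angle <= self.angle_hi:
            if self._entered_at is None:
                self._entered_at = timestamp
            return timestamp - self._entered_at >= self.effective_time
        self._entered_at = None  # leaving the range resets the timer
        return False
```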
20120299812APPARATUS AND METHOD FOR CONTROLLING DATA OF EXTERNAL DEVICE IN PORTABLE TERMINAL - An apparatus and a method operate in a portable terminal to control data of an external device, for example, remotely controlling data output from the external device connected to the portable terminal without directly manipulating the portable terminal. The apparatus includes a camera for receiving an image of a hand and a controller for controlling data being output from the external device according to a gesture of the hand determined to be equivalent to a hand gesture image received through the camera while the controller is connected to and outputs data to the external device.11-29-2012
20120299811TRANSFERRING RUI FROM ONE DEVICE TO ANOTHER - A method of and system for transferring a remote user interface from one device to another device is described herein. A server stores state information and uses the information to transfer the RUI and/or other data from one device to the other device. This enables a user to transition from one device to another device seamlessly and without interruption of their user interface and/or programming.11-29-2012
20120249414Marking one or more items in response to determining device transfer - A computationally implemented method includes, but is not limited to: determining that a computing device that was presenting one or more portions of one or more items and that was in possession of a first user has been transferred from the first user to a second user; and marking, in response to said determining, the one or more portions of the one or more items to facilitate the computing device in returning to the one or more portions upon the computing device being at least transferred back to the first user. In addition to the foregoing, other method aspects are described in the claims, drawings, and text forming a part of the present disclosure.10-04-2012
20090058800INFORMATION PROCESSING DEVICE, PROGRAM, AND METHOD - According to one embodiment, an information processing device includes a position detecting section configured to detect a position of a hand from an input image of the hand; a memory section configured to store data of the position of the hand detected by the position detecting section; a rotation judging section configured to judge, assuming that records of the position data stored in the memory section show a rotary movement, whether a latest position of the hand falls in an angle range predicted for the rotary movement; and an executing section configured to, when the rotation judging section judges that the latest position of the hand falls in the angle range, obtain a rotational angle at the latest position of the hand and execute a process that corresponds to a predetermined rotary movement of the hand.03-05-2009
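One plausible reading of the angle-range judgment above can be sketched as follows, assuming the rotation center is known and the predicted angle is extrapolated from the mean angular step over the stored positions; the center, tolerance, and all names are illustrative assumptions, not the claimed method.

```python
import math

def angle_about(center, point):
    """Angle of `point` about `center`, in radians."""
    return math.atan2(point[1] - center[1], point[0] - center[0])

def in_predicted_range(history, latest, center, tolerance=0.5):
    """Predict the next angle by extrapolating the mean angular step
    over the recorded hand positions, then check whether the latest
    position falls within `tolerance` radians of that prediction."""
    angles = [angle_about(center, p) for p in history]
    # wrap each step into (-pi, pi] so crossings of +/-pi behave
    steps = [(b - a + math.pi) % (2 * math.pi) - math.pi
             for a, b in zip(angles, angles[1:])]
    mean_step = sum(steps) / len(steps)
    predicted = angles[-1] + mean_step
    actual = angle_about(center, latest)
    diff = abs((actual - predicted + math.pi) % (2 * math.pi) - math.pi)
    return diff <= tolerance
```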
20090058799INTERACTIVE POINTING DEVICE - An interactive pointing device having a pointing function in space and game control is provided in the present invention. The interactive pointing device comprises an accelerometer module and a gyroscope device. The accelerometer module senses the movement of the operator and generates at least one axis of acceleration signal corresponding to the sensed movement. The gyroscope device, disposed on a turning mechanism, senses the rotation status of the interactive pointing device about at least one axis and generates a corresponding rotation signal. The turning mechanism can be operated to adjust the axis of the gyroscope device so that the gyroscope device is capable of sensing rotation status about different axes. The at least one acceleration signal and the rotation signal are then processed for controlling cursor movement of the electrical device and interacting with multimedia gaming programs.03-05-2009
20120169589Sphere-Like Input Device - Embodiments of the invention provide a human interface device including an inner sphere, wherein the inner sphere has a center point. The human interface device can further include an outer sphere, and the outer sphere may be compressible. The human interface device may also include a plurality of pressure sensors between the inner sphere and the outer sphere for detecting localized compression of the outer sphere, a first three-axis-accelerometer located within the inner sphere, and a second three-axis-accelerometer located within the inner sphere, wherein the first three-axis-accelerometer and the second three-axis-accelerometer are each located at least a predetermined distance from the center point.07-05-2012
20120169586VIRTUAL INTERFACE - A virtual interface including a virtual screen generator configured to produce a virtual display screen, a display generator configured to project at least one virtual display element onto the virtual display screen, and at least one sensor configured to detect interaction with the at least one virtual display element.07-05-2012
20120212405SYSTEM AND METHOD FOR PRESENTING VIRTUAL AND AUGMENTED REALITY SCENES TO A USER - A method according to a preferred embodiment can include providing an embeddable interface for a virtual or augmented reality scene, determining a real orientation of a viewer representative of a viewing orientation relative to a projection matrix, and determining a user orientation of a viewer representative of a viewing orientation relative to a nodal point. The method of the preferred embodiment can further include orienting the scene within the embeddable interface and displaying the scene within the embeddable interface on a device.08-23-2012
20120212409COMMUNICATION APPARATUS AND CONTROL METHOD - A communication apparatus includes a display unit, a communication unit, and a control unit. The display unit displays video data. The communication unit communicates with an external apparatus. The control unit controls the communication apparatus in accordance with a command received by the communication unit. If the display unit is in a mute state and the communication unit is sending the external apparatus the data for placing the external apparatus in a mute state, the control unit determines not to control the communication apparatus in accordance with the command.08-23-2012
20120212407INFORMATION DISPLAY DEVICE AND SCROLL CONTROL METHOD - According to an aspect, an information display device includes a display unit, a visual line detector, and a control unit. The display unit displays information in a scroll region, which is provided in at least a part of a screen of the display unit, while scrolling the information. The visual line detector detects a visual line position of an operator with respect to the scroll region. The control unit controls a scroll speed, at which the information is scrolled in the scroll region, based on the visual line position detected by the visual line detector.08-23-2012
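The gaze-to-scroll-speed control described in the abstract above can be sketched with a simple linear mapping from the vertical gaze position within the scroll region; the mapping shape, coordinate convention, and names are hypothetical assumptions.

```python
def scroll_speed(gaze_y, region_top, region_bottom, max_speed=100.0):
    """Map the detected visual-line position inside the scroll region
    to a scroll speed: here, the closer the gaze is to the bottom edge,
    the faster the scroll. Outside the region, scrolling stops."""
    if not (region_top <= gaze_y <= region_bottom):
        return 0.0
    frac = (gaze_y - region_top) / (region_bottom - region_top)
    return max_speed * frac
```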
20120212406AR GLASSES WITH EVENT AND SENSOR TRIGGERED AR EYEPIECE COMMAND AND CONTROL FACILITY OF THE AR EYEPIECE - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content, wherein the eyepiece includes event and sensor triggered command and control facility.08-23-2012
20120212403HANDHELD ELECTRONIC DEVICE AND FUNCTION CONTROL METHOD THEREOF - A handheld electronic device is provided. The electronic device includes an electrode unit, a storage unit, and a processing unit. The electrode unit includes a main body defining an annular cavity, a plurality of electrode groups, and a conductive element arranged within the annular cavity, wherein each of the plurality of the electrode groups includes a pair of conductive sheets, which are partially received in the annular cavity and are spaced apart from each other. When the electronic device is rotated into a different orientation, the conductive element connects a different electrode group, and the conductive sheets of that electrode group are connected to each other via the conductive element. The processing unit determines the connected electrode group and executes a function corresponding to the determined electrode group. A function control method of the handheld electronic device is also provided.08-23-2012
20120212410OPERATION INPUT DEVICE - Provided is an operation input device capable of realizing operation input by changing the posture of the operation input device even without a posture determining sensor. The operation input device, for use by a user while held in his/her hand, obtains a captured image captured by an image capturing unit provided to the operation input device, and determines the posture of the operation input device based on the result of analyzing the obtained captured image.08-23-2012
20120212408IMAGE GENERATION DEVICE, PROJECTOR, AND IMAGE GENERATION METHOD - An image generation device includes a storage unit that stores input object data indicating content information of an input object which is input on a presentation screen, a determination unit that determines a display position located further on an upper side than an input position of the input object on the presentation screen, and an image generation unit that generates a presentation image which is an image displayed on the presentation screen and where the input object is displayed according to the display position determined by the determination unit.08-23-2012
20120176304PROJECTION DISPLAY APPARATUS - A projection display apparatus includes: an element controller that controls an imager to display a maximum light amount image and a minimum light amount image; a setting unit that sets a light amount threshold value used to detect pointing light emitted from a pointing device; and a specifying unit that specifies a light amount of the pointing light overlapping the minimum light amount image, based on a picked-up image, and specifies a light amount of the maximum light amount image, based on the picked-up image, wherein the setting unit sets the light amount threshold value based on the specified light amount of the pointing light and the specified light amount of the maximum light amount image.07-12-2012
20120176308METHOD FOR SUPPORTING MULTIPLE MENUS AND INTERACTIVE INPUT SYSTEM EMPLOYING SAME - A method comprises receiving an input event associated with a first user ID, the input event being a command for displaying a first menu on a display surface; identifying a second menu associated with the first user ID currently being displayed on the display surface; dismissing the second menu; and displaying the first menu.07-12-2012
20120176306SYSTEM AND METHOD FOR PROVIDING SUBSTANTIALLY STABLE HAPTICS - A system for providing substantially stable haptics includes at least one computer configured to identify a first subset and a second subset of haptic interaction geometric primitives for a virtual tool. The computer is configured to determine based on the first subset, haptic forces in a first subspace. The computer is also configured to determine based on the second subset, haptic forces in a second subspace different from the first subspace.07-12-2012
20120176305DISPLAY APPARATUS CONTROLLED BY A MOTION, AND MOTION CONTROL METHOD THEREOF - A display apparatus includes a motion recognition unit which recognizes a movement of an object located outside the display apparatus; a storage unit which stores therein information about the operation corresponding to each motion; and a control unit which divides a movement recognition period using a movement non-recognition period, determines a motion corresponding to a movement of the object within the movement recognition period, and performs an operation corresponding to the determined motion according to information stored in the storage unit.07-12-2012
20120176303GESTURE RECOGNITION APPARATUS AND METHOD OF GESTURE RECOGNITION - A gesture recognition apparatus (07-12-2012
20120176302OPTIMIZED ON-SCREEN KEYBOARD RESPONSIVE TO A USER'S THOUGHTS AND FACIAL EXPRESSIONS - A system and method that allows for control of an on-screen keyboard in response to brain activity input, even for those suffering from partial paralysis or locked-in syndrome. The system provides an on-screen keyboard arranged into blocks of keys for ease of navigation. In response to brain activity input, such as activity relating to thoughts or facial expressions, the system performs a variety of previously determined functions which control the on-screen keyboard. In response to the activity of the on-screen keyboard, the computer can be controlled in a manner similar to when using a standard hand-operated keyboard.07-12-2012
20120075179ELECTRONIC DEVICE - The electronic device has a displaying unit which displays information; an operating unit which accepts a contactless operation by a user; a detecting unit which detects whether or not an object exists inside the area where the contactless operation is possible; and a lighting unit which lights up when the detecting unit detects the existence of the object.03-29-2012
20120075178APPARATUS AND METHOD FOR GENERATING DYNAMIC RESPONSE - A dynamic response generating apparatus and method that may analyze an intention of a user based on user input information received from an inputting device, analyze at least one of first response information with respect to the analyzed intention of the user, context information associated with the user input information, user motion information, and environmental information, dynamically determine a modality with respect to the first response information, process the first response information, and dynamically generate second response information in a form corresponding to the determined modality.03-29-2012
20120075177LAPEL MICROPHONE MICRO-DISPLAY SYSTEM INCORPORATING MOBILE INFORMATION ACCESS - A shoulder mounted lapel microphone housing that encloses a microdisplay, a computer, and other communication system components. A microdisplay element is located on or in the microphone housing. Other electronic circuits, such as a microcomputer, one or more wired and wireless interfaces, associated memory or storage devices, auxiliary device mounts and the like are packaged in the microphone housing and/or in an optional pager sized gateway device having a belt clip. Motion, gesture, and/or audio processing circuits in the system provide a way for the user to input commands to the system without a keyboard or mouse. The system provides connectivity to other computing devices such as cellular phones, smartphones, laptop computers, or the like.03-29-2012
20120075175METHOD AND DEVICE FOR PROVIDING SYSTEM STATUS INFORMATION - The present disclosure provides a method and device for providing system status information. The method comprises: receiving, from an input mechanism associated with a communication device, a request to share system status information; and in response to receiving the request to share the system status information: (i) obtaining system status information associated with the communication device; and (ii) automatically populating one or more portions of an electronic message based on the system status information. The system status information comprises processor usage information.03-29-2012
20120075174INPUT APPARATUS AND ELECTRONIC SYSTEM USING SAME - An input apparatus and an electronic system using the same are provided. The input apparatus includes a casing, a sensor and a processing circuit. The casing has a first surface and a second surface arranged on opposite sides. The first surface has a first input interface and the second surface has a second input interface. The sensor is disposed in the casing and is configured for sensing whether one of the first and second surfaces faces a predetermined direction, so as to generate a sensing result. The processing circuit is arranged in the casing and is in communication with the first input interface, the second input interface and the sensor, to determine based on the sensing result whether to perform an operation according to a command inputted from the first input interface or to perform an operation according to a command inputted from the second input interface.03-29-2012
20100039375Signal Processing Method of Multi-Finger Touch Supported Touch Apparatus having Hidden Physical Button - A signal processing method of a multi-finger touch supported touch apparatus having a hidden physical button is applied to the multi-finger touch supported touch apparatus having at least one physical button element pair. The method includes the steps of scanning a touch sensor of the multi-finger touch supported touch apparatus continuously, judging whether the physical button element pair is pressed when the total number of fingers detected is larger than zero, and driving a corresponding touch application according to the total number of fingers of a gesture and the button. The stacked structure of the multi-finger touch supported touch apparatus includes a Mylar layer, a first adhesive layer, a printed circuit board having at least a first part of the physical button element pair and the touch sensor, a second adhesive layer, a first metallic layer for holding, and a second metallic layer having a second part of the physical button element pair.02-18-2010
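The scan-and-dispatch flow described in this abstract can be sketched as a small lookup, assuming hypothetical names: `dispatch()`, the gesture table, and the (finger count, button state) encoding are illustrative assumptions, not the patented implementation.

```python
# Hypothetical sketch: pick a touch application from the combination of the
# number of detected fingers and the hidden physical button's state.

def dispatch(finger_count, button_down, table):
    """Return the application bound to (finger count, button state)."""
    if finger_count == 0:
        return None  # nothing detected on this scan pass; keep scanning
    return table.get((finger_count, button_down))

# Example mapping of (fingers, button pressed?) -> application name (assumed)
GESTURE_TABLE = {
    (1, False): "cursor",
    (1, True): "left_click",
    (2, False): "scroll",
    (2, True): "zoom",
}

print(dispatch(2, True, GESTURE_TABLE))   # two fingers + button held
print(dispatch(0, True, GESTURE_TABLE))   # no fingers: no application driven
```

In a real driver the table lookup would run inside the continuous sensor-scan loop; the table itself is only an example binding.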
20100271299SELECTIVE INPUT SYSTEM AND PROCESS BASED ON TRACKING OF MOTION PARAMETERS OF AN INPUT OBJECT - A selective input system and associated method is provided which tracks the motion of a pointing device over a region or area. The pointing device can be a touchpad, a mouse, a pen, or any device capable of providing two or three-dimensional location. The region or area is preferably augmented with a printed or actual keyboard/pad. Alternatively, a representation of the location of the pointing device over a virtual keyboard/pad can be dynamically shown on an associated display. The system identifies selections of items or characters by detecting parameters of motion of the pointing device, such as length of motion, a change in direction, a change in velocity, and/or a lack of motion at locations that correspond to features on the keyboard/pad. The input system is preferably coupled to a text disambiguation system such as a T9® or Sloppytype™ system, to improve the accuracy and usability of the input system.10-28-2010
20100271298HAPTIC AUTOMATED COMMUNICATION SYSTEM - A haptic communication system having a range of sensors embedded with an operator's attire. Data collected by the sensors is processed by a computing device local to the operator and is communicated via a haptic modality in real-time to other team members and robotic assets in the system.10-28-2010
20100271297NON-CONTACT TOUCHPAD APPARATUS AND METHOD FOR OPERATING THE SAME - A non-contact touchpad method uses an image sensor (10-28-2010
20120249411SCREEN PROTECTION SYSTEM AND METHOD OF AN ELECTRONIC DEVICE - In a screen protection system and method, a first operation temperature of an electronic device is detected by each of one or more first temperature sensors, and a first ambient temperature is detected by a second temperature sensor, when a display screen is activated. Once there is no operation on the electronic device, a timer starts timing. Once the electronic device is operated, the elapsed duration is temporarily stored and the timer is reset. If the duration is equal to a screensaver time, a second operation temperature of the electronic device is detected by each of the first temperature sensors, and a second ambient temperature is detected by the second temperature sensor. The method further determines whether the electronic device is currently being held by a hand of a user, according to the above-mentioned temperatures. If the electronic device is not being held by a hand of the user, the display screen is controlled to be in an inactive state.10-04-2012
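The held-by-hand decision in this abstract compares the two temperature samples; a minimal sketch follows, assuming a hypothetical rule (device warming outpacing ambient warming implies a hand) and an assumed threshold:

```python
# Illustrative sketch only: the comparison rule and the 2.0 degree threshold
# are assumptions for the example, not the patented decision logic.

def is_held_by_hand(op1, amb1, op2, amb2, threshold=2.0):
    """Infer a hand on the device when its temperature rose noticeably
    more than the ambient temperature over the screensaver interval."""
    device_rise = op2 - op1
    ambient_rise = amb2 - amb1
    return (device_rise - ambient_rise) > threshold

def screen_state(op1, amb1, op2, amb2):
    # Keep the display active only when a hand apparently holds the device.
    return "active" if is_held_by_hand(op1, amb1, op2, amb2) else "inactive"

print(screen_state(28.0, 24.0, 33.0, 24.5))  # warmed by a hand
print(screen_state(28.0, 24.0, 28.2, 24.1))  # idle on a desk
```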
20120249413INPUT DEVICE AND IMAGE DISPLAY APPARATUS - An input device including: an operation device including: a flexible base member; a first detector configured to detect that the base member is being bent; and a second detector configured to detect that the base member is being nipped; and an output section connected to the first detector and the second detector, the output section being configured to output a first signal when the first detector has detected that the base member is being bent and configured to output a second signal when the second detector has detected that the base member is being nipped.10-04-2012
20120249410PROJECTION UNIT AND METHOD OF CONTROLLING THE SAME - A projection unit comprises a first light source outputting light that is used to project an image during normal projection unit use, and a second light source outputting light of a different intensity that is used to project an image outside of normal projection unit use.10-04-2012
20120249416MODULAR MOBILE CONNECTED PICO PROJECTORS FOR A LOCAL MULTI-USER COLLABORATION - The various embodiments include systems and methods for rendering images in a virtual or augmented reality system that may include capturing scene images of a scene in a vicinity of a first and a second projector, capturing spatial data with a sensor array in the vicinity of the first and second projectors, analyzing captured scene images to recognize body parts, and projecting images from each of the first and the second projectors with a shape and orientation determined based on the recognized body parts. Additional rendering operations may include tracking movements of the recognized body parts, applying a detection algorithm to the tracked movements to detect a predetermined gesture, applying a command corresponding to the detected predetermined gesture, and updating the projected images in response to the applied command.10-04-2012
20120249415SERVER, TERMINAL DEVICE, AND GROUPING METHOD - A server, which is connected to a plurality of terminal devices, includes a device ID (identification) storage unit configured to store device ID items for identifying the terminal devices; an acquiring unit configured to acquire coordinate information that is input at the terminal devices identified by the device ID items and time information that is input; a positional relationship determining unit configured to determine positional relationships between the terminal devices according to a sequence of coordinates that is input in a manner to cross over the terminal devices, based on the coordinate information and the time information; and a group determining unit configured to extract, from determination results of the positional relationship determining unit, the device ID items identifying the terminal devices that are to be grouped together into a group.10-04-2012
20120249418OPTICAL POSITION DETECTION DEVICE, OPTICAL POSITION DETECTION SYSTEM, AND DISPLAY SYSTEM WITH INPUT FUNCTION - In an optical position detection device, the XY coordinates of a target object are detected based on received light intensity of a light receiving section by forming a light intensity distribution, in which the intensity changes in a radiation angle range of detection light, with first and second light source modules. The first and second light source modules are separated from each other in the Z-axis direction, and the position of the target object in the Z-axis direction is detected based on the received light intensity of the light receiving section when forming the light intensity distribution in which the intensity is fixed in the radiation angle range of detection light.10-04-2012
20100007602IMAGE DISPLAY DEVICE - An image display device easily displays a stereoscopically viewed two-dimensional image, improving its direction effect and interactivity. A first display displays a first image on a first screen. An image transmission element in a light path for a display light component of the first image transmits the display light component of the first image, displaying a real image of the first image on an image forming surface positioned at a distance on a side opposite the first screen as a stray image. A second display displays a second image on a second screen as a directly visible image so the stray image is observable from an observation position. A position detecting element outputs a position signal corresponding to detected object position. A control element controls the first or second display according to the output position signal so the stray and/or directly visible image changes correspondingly with the position of the object.01-14-2010
20090309825USER INTERFACE, METHOD, AND COMPUTER PROGRAM FOR CONTROLLING APPARATUS, AND APPARATUS - A user interface for a portable apparatus is disclosed. The user interface comprises a sensor arranged to determine a spatial change, wherein said user interface is arranged to control at least one function, wherein the function is controlled by said determined spatial change; an actuator arrangement; and at least one mass, wherein the actuator arrangement is arranged to controllably actuate at least one of the at least one mass by acceleration, so that the inertia of the actuated mass provides a force on the portable apparatus. Further, an apparatus, a method, and a computer program for controlling a function are disclosed.12-17-2009
20120188155METHOD AND APPARATUS FOR CONTROLLING DEVICE - A method of controlling a device is provided including identifying a registered device from a screen input by a camera, receiving a user input for the identified device, and transmitting a control command corresponding to the input to the identified device.07-26-2012
20120256820Methods and Systems for Ergonomic Feedback Using an Image Analysis Module - A display device can be used with an ergonomic sensor comprising an imaging device interfaced to processing hardware to obtain and analyze image data depicting a user of the display device. The ergonomic sensor can be preconfigured with data indicating ergonomic uses of the display device so that the image of the user can be analyzed with minimal or no user calibration or setup. Instead, the ergonomic sensor can analyze the image data to provide real-time feedback, such as warnings or suggestions when the user's behavior falls outside an ergonomic use range for the display device. In some implementations, the ergonomic sensor is integrated with the display device, though in other implementations a separate element or preexisting imaging device can be used.10-11-2012
20120256825OPTICAL POSITION DETECTION DEVICE, LIGHT RECEIVING UNIT, AND DISPLAY SYSTEM WITH INPUT FUNCTION - An optical position detection device includes a light receiving section receiving detection light reflected from a target object located in a detectable space through which detection light is radially emitted along an XY plane. The light receiving section includes a light receiving element and a concave mirror. A first cross section (XY cross section) of the reflective surface of the concave mirror is an arc, and a second cross section (YZ cross section) perpendicular to the first cross section is a quadratic curve. Therefore, in the in-plane direction of the XY plane, even light incident from an oblique direction with respect to the light receiving section is reflected by the concave mirror to the light receiving element. In the in-plane direction of the YZ plane, however, the range where the light reaches the light receiving element is limited via the concave mirror.10-11-2012
20120188158WEARABLE ELECTROMYOGRAPHY-BASED HUMAN-COMPUTER INTERFACE - A “Wearable Electromyography-Based Controller” includes a plurality of Electromyography (EMG) sensors and provides a wired or wireless human-computer interface (HCI) for interacting with computing systems and attached devices via electrical signals generated by specific movement of the user's muscles. Following initial automated self-calibration and positional localization processes, measurement and interpretation of muscle generated electrical signals is accomplished by sampling signals from the EMG sensors of the Wearable Electromyography-Based Controller. In operation, the Wearable Electromyography-Based Controller is donned by the user and placed into a coarsely approximate position on the surface of the user's skin. Automated cues or instructions are then provided to the user for fine-tuning placement of the Wearable Electromyography-Based Controller. Examples of Wearable Electromyography-Based Controllers include articles of manufacture, such as an armband, wristwatch, or article of clothing having a plurality of integrated EMG-based sensor nodes and associated electronics.07-26-2012
20120188156OPERATION MEMBER PROVIDED IN ELECTRONIC DEVICE, AND ELECTRONIC DEVICE - An operation member and an electronic device capable of maintaining operability while enhancing cushioning properties provided in the outer surface of an operation member are provided. An operation stick has a cushion portion and a base portion on which the cushion portion is placed. The base portion is supported to be movable. The base portion has a frame portion surrounding the outer periphery of the cushion portion. The base portion and the frame portion are formed of a material having a higher rigidity than that of the material of the cushion portion.07-26-2012
20120188157METHOD AND APPARATUS FOR EVOKING PERCEPTIONS OF AFFORDANCES IN VIRTUAL ENVIRONMENTS - Methods and apparatus are provided for evoking perceptions of affordances in a user/virtual environment interface. The method involves recognizing the absence or inadequacy of certain sensory stimuli in the user/virtual environment interface, and then creating sensory stimuli in the virtual environment to substitute for the recognized absent or inadequate sensory stimuli. The substitute sensory stimuli are typically communicated to the user (e.g., visually and/or audibly) as properties and behavior of objects in the virtual environment. Appropriately designed substitute sensory stimuli can evoke perceptions of affordances for the recognized absent or inadequate sensory stimuli in the user/virtual environment interface.07-26-2012
20120188153MULTI-BEND DISPLAY ACTIVATION ADAPTATION - Systems and methods determine likely unintended flexing of a flexible display and exclude the determined unintended flexings from user input processing. Unintended flexings include placing or removing the flexible display into or out of a compact storage configuration, folds that are outside the user's visible area, folds that are near edges and boundaries, flexing with a specified degree of bending or orientation, folds that do not intersect with other folds, folds that are near known unintended folds, folds that have a motion or other variation with time, and folds that are not in proximity to a selectable user interface element. Unintended flexings are adaptively identified by determining that a bend that is not in proximity to a selectable user interface element reoccurs at times when icons at different locations are presented on the flexible display.07-26-2012
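A few of the exclusion rules listed in this abstract can be sketched as a simple filter; the `Fold` record, the edge margin, and the bend-angle cutoff below are all illustrative assumptions, not the patented rule set.

```python
# Minimal sketch: treat a fold as input only if it avoids the unintended cases
# (near an edge, folded sharply for storage, or away from any UI element).
from dataclasses import dataclass

@dataclass
class Fold:
    x: float            # fold position, normalised 0..1 across the display
    y: float
    angle: float        # bend angle in degrees
    near_ui_element: bool

def is_intended(fold, edge_margin=0.05, max_angle=150.0):
    near_edge = (fold.x < edge_margin or fold.x > 1 - edge_margin or
                 fold.y < edge_margin or fold.y > 1 - edge_margin)
    too_sharp = fold.angle > max_angle   # e.g. folding shut for storage
    return fold.near_ui_element and not near_edge and not too_sharp

folds = [
    Fold(0.5, 0.5, 30.0, True),    # mid-screen, gentle, over a button
    Fold(0.01, 0.5, 30.0, True),   # at the edge: excluded
    Fold(0.5, 0.5, 170.0, True),   # folded shut: excluded
]
print([is_intended(f) for f in folds])
```

The adaptive part of the abstract (learning recurring unintended bends over time) would sit on top of such a static filter.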
20120188154METHOD AND APPARATUS FOR CHANGING A PAGE IN E-BOOK TERMINAL - A method and apparatus for changing a page when an e-book terminal is inclined are provided. The method includes sensing that the e-book terminal is inclined to a left or right side, then changing a current page to a next page when the e-book terminal is inclined to the left side or changing the current page to a previous page when the e-book terminal is inclined to the right side.07-26-2012
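The tilt-to-turn behaviour described here reduces to a sign test on a roll angle; a hedged sketch follows, with the threshold, sign convention, and function name all assumed for illustration:

```python
# Illustrative sketch: tilt left -> next page, tilt right -> previous page,
# clamped to the bounds of the book. Threshold and sign are assumptions.

def next_page_index(current, total, roll_degrees, threshold=20.0):
    if roll_degrees <= -threshold:        # inclined to the left side
        return min(current + 1, total - 1)
    if roll_degrees >= threshold:         # inclined to the right side
        return max(current - 1, 0)
    return current                        # not tilted far enough; no change

print(next_page_index(4, 100, -35.0))   # tilt left: advance a page
print(next_page_index(4, 100, 35.0))    # tilt right: go back a page
print(next_page_index(0, 100, 35.0))    # already at the first page
```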
20090021477METHOD FOR MANAGING INFORMATION - The invention relates to a method of transferring data from a drawing device, which while utilizing a position-coding pattern, printed on a physical page, digitally records handwritten information, to an application in a computer system. The drawing device transfers recorded data to a memory in the computer system. A registering unit in the system determines the page from which the recorded data originates and activates, on the basis thereof, one or more applications which are registered as “subscribers” to data from this page. When an application is activated and thus informed of the existence of new data relevant to the application, the application fetches this data. The fetching of data can be made on the basis of the contents of a page description which defines the layout of the physical page.01-22-2009
20090021475METHOD FOR DISPLAYING AND/OR PROCESSING IMAGE DATA OF MEDICAL ORIGIN USING GESTURE RECOGNITION - A method for processing and/or displaying medical image data sets in or on a display device having a screen with a surface, including: detecting gestures performed on or in front of the screen surface; correlating the gestures to predetermined instructional inputs; and manipulating, generating, or retrieving, via computer support, the medical image data sets in response to the instructional inputs.01-22-2009
20090021474SYSTEM AND METHOD FOR DISPLAYING STATUS INFORMATION OF A MULTIMEDIA BROADCAST RECEIVER ON AN AMBIENT DEVICE - There is provided a system for providing status information associated with viewing behavior of media broadcasting. The system comprises a client device that includes a receiver to receive presence data from a remote device and a processor to generate an ambient command based on the presence data. The presence data is associated with broadcast programs of the remote device, and the ambient command represents viewing information of the broadcast program at the remote device. The system also comprises an ambient component that provides an ambient representation of the viewing information based on the ambient command. The ambient component may be an integral part of the client device or a separate component that communicates with the client device via wired or wireless connection.01-22-2009
20090021473Haptic Communication Devices - Embodiments of the invention relate to methods and systems for providing customized “haptic messaging” to users of handheld communication devices in a variety of applications. In one embodiment, a method of using haptic effects to relate location information includes: receiving an input signal associated with a position of a handheld communication device (01-22-2009
20120256826PORTABLE TELEPHONE - In a portable telephone according to the present invention, a display displays a block indicative of an operator, predetermined information and a pointer; the operator can be operated in directions opposite to each other; and the controller controls the display so as to shift the pointer to a desirable position within the predetermined information on a screen of the display in accordance with an operation of the operator, and also display a mark, indicative of a direction to which the pointer can be shifted and in which the predetermined information exists, adjacently to the block along the shift direction of the operator.10-11-2012
20120319941INTERACTIVE INPUT SYSTEM AND METHOD OF OPERATING THE SAME - A method of operating an interactive input system comprises capturing images of a region of interest at a first frame rate; processing a first pixel subset of images captured at the first frame rate to detect the presence of an object; and if an object is detected, capturing images of the region of interest at a second frame rate.12-20-2012
20120256823TRANSPARENT DISPLAY APPARATUS AND METHOD FOR OPERATING THE SAME - A transparent display apparatus and a method for operating the same are disclosed. The method may include detecting an object located proximate to the transparent image display apparatus; determining a position of the detected object relative to the transparent image display apparatus; and selecting, from among multiple, different augmented object displays associated with the detected object, an augmented object display based on the determined position of the detected object relative to the transparent image display apparatus. A first augmented object display may be selected based on the determined position being a first position relative to the transparent image display apparatus, and a second augmented object display may be selected based on the determined position being a second position relative to the transparent image display apparatus. Display of the selected augmented object display is controlled on the transparent image display apparatus.10-11-2012
20120256824PROJECTION DEVICE, PROJECTION METHOD AND PROJECTION PROGRAM - According to an illustrative embodiment, an information processing apparatus is provided. The apparatus is used for processing a first image projected toward a target. The apparatus includes a processing unit for detecting that an object exists between a projector unit and the target, wherein when an object exists between the projector unit and the target, the apparatus determines an area of the object and generates a modified first image, based on the area of the object, for projection toward the target.10-11-2012
20120256821USER INTERFACE DEVICES, APPARATUS, AND METHODS - User interface devices using magnetic sensing to provide output signals associated with motion and/or deformation of an actuator element of the interface devices are described. The output signals may be provided to an electronic computing system to provide commands, controls, and/or other data or information. In one embodiment, a user interface device may include a plurality of permanent magnets and a plurality of multi-axis magnetic sensors to generate motion and/or deformation signals to be provided to a processing element to generate the output signals.10-11-2012
20120280907Portable Information Processing Device and Media Data Replay System - A portable information processing device (11-08-2012
20120081282ACCESS OF AN APPLICATION OF AN ELECTRONIC DEVICE BASED ON A FACIAL GESTURE - A method of accessing an application of an electronic device based on a facial gesture is disclosed. In one aspect, a method of an electronic device includes capturing an image of a face of a user through a camera of the electronic device such that an application of the electronic device is accessible through the electronic device based on the image of the face of the user. The facial gesture of the image of the face of the user of the electronic device is determined to be associated with a user-defined facial gesture. The facial gesture of the image of the face of the user is compared with a designated security facial gesture. An access of the application of the electronic device is permitted when the facial gesture of the image of the face of the user of the electronic device matches the designated security facial gesture.04-05-2012
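The comparison step in this abstract can be sketched as a gate on a recognised gesture; in the sketch below the reduction of a gesture to a label plus a similarity score, and the score threshold, are purely assumptions (a real system would compare feature vectors from a face model):

```python
# Illustrative sketch only: grant access when the detected facial gesture
# matches the user-defined security gesture with sufficient confidence.

def permit_access(detected_gesture, score, security_gesture, min_score=0.8):
    return detected_gesture == security_gesture and score >= min_score

print(permit_access("wink_left", 0.93, "wink_left"))   # correct gesture
print(permit_access("smile", 0.95, "wink_left"))       # wrong gesture
print(permit_access("wink_left", 0.40, "wink_left"))   # too uncertain
```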
20120081281INFORMATION DISPLAY APPARATUS FOR MAP DISPLAY - An information display apparatus, including a nonvolatile database memory 04-05-2012
20120081280SINGLE-SCREEN VIEW IN RESPONSE TO ROTATION - A multi-screen user device and methods for controlling data displayed thereby are disclosed. Specifically, a gesture sequence is disclosed which enables a user to toggle or shift through applications that are displayed by the multi-screen user device. The gesture sequence may correspond to various rotations or partial rotations of the multi-screen user device.04-05-2012
20120081279Dynamic Display Adjustment Based on Ambient Conditions - The techniques disclosed herein use a display device, in conjunction with various optical sensors, e.g., an ambient light sensor or image sensors, to collect information about the ambient conditions in the environment of a viewer of the display device. Use of these optical sensors, in conjunction with knowledge regarding characteristics of the display device, can provide more detailed information about the effects the ambient conditions in the viewer's environment may have on the viewing experience. A processor in communication with the display device may create an ambient model based at least in part on the predicted effects of the ambient environmental conditions on the viewing experience. The ambient model may be used to adjust the gamma, black point, white point, or a combination thereof, of the display device's tone response curve, such that the viewer's perception remains relatively independent of the ambient conditions in which the display is being viewed.04-05-2012
20120081278USER INTERFACE WITH SCREEN SPANNING ICON MORPHING - Methods and apparatus for indicating a status of an application that is displayable on one or more displays of a handheld computing device. An icon may be provided that indicates the status and/or potential statuses of the application (e.g., whether the application is expandable and/or expanded). The icon may be changeable between a first state and a second state depending on the status of the application. The change in the icon from the first state to the second state may be animated along with an animated change of the application between display states. As such, a user may observe the icon to determine the status of the application with respect to the one or more displays (e.g., whether the application is expandable, expanded, or expanding).04-05-2012
20120081277MULTI-SCREEN USER INTERFACE WITH ORIENTATION BASED CONTROL - Control of a plurality of displays of a computing device in response to the change in orientation of the computing device. The computing device may be a handheld computing device with a plurality of displays that are concurrently visible by a user. The displays may be capable of displaying a graphical user interface (GUI). The plurality of displays may be modified in response to a change in orientation of the handheld computing device. The modification may include expanding a GUI that is displayed in a single display when in a first orientation to occupy at least two of the plurality of displays in response to the change in orientation.04-05-2012
20120081276PHYSICAL MODEL BASED GESTURE RECOGNITION - A gesture recognition system for recognizing gestures on a mobile device receives sensor data in response to a sensed gesture on the mobile device. The sensor data includes a force or impulse. The force or impulse is applied to a simulated physical object and the state of the simulated physical object is then observed. Input is provided to an application based at least on the observed state of the simulated physical object.04-05-2012
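The simulated-physical-object idea in this abstract can be sketched with a damped one-dimensional mass: a sensed impulse changes the object's velocity, and the application reads the object's state rather than the raw sensor data. All constants, names, and the classification rule below are illustrative assumptions.

```python
# Minimal sketch of physical-model-based gesture recognition (assumed model).

class SimulatedObject:
    def __init__(self, mass=1.0, damping=0.9):
        self.mass = mass
        self.damping = damping
        self.velocity = 0.0
        self.position = 0.0

    def apply_impulse(self, impulse):
        self.velocity += impulse / self.mass     # J = m * delta-v

    def step(self, dt=0.1):
        self.position += self.velocity * dt
        self.velocity *= self.damping            # friction-like decay

def classify(obj, strong=1.0):
    # The application observes the simulated object's state, not the sensor.
    return "flick" if abs(obj.velocity) > strong else "nudge"

obj = SimulatedObject()
obj.apply_impulse(2.5)    # e.g. a sharp tap measured by the accelerometer
obj.step()
print(classify(obj))
```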
20120081275Media Display Device - A media display device is described. In an embodiment the media display device comprises a display screen and at least one loudspeaker held in a housing rotatably mounted on a lid. For example, in a one handed operation a user is able to rotate the housing to open the device and reveal the display screen which is held upwards using the lid as a stand. For example, the action of opening the device is detected by a sensor and triggers the device to randomly select an item of media content and to display that. For example, images, audio clips, contacts or other items that a user has not opened for some time are presented. The device may randomly select the media type in some embodiments. In an example the sensor is provided by a rotary encoder which also provides part of a hinge for mounting the housing and lid.04-05-2012
20120262368INFORMATION PROCESSING APPARATUS, CONTROL METHOD OF INFORMATION PROCESSING APPARATUS, AND PROGRAM - A control method enables a user to easily confirm whether or not a process corresponding to a key was correctly performed. The method for controlling an information processing apparatus comprises: controlling, in a case where a key in a screen displayed on a display unit is depressed and then released inside the display area of the key, to perform the process corresponding to the key, and, in a case where the key is depressed and then released outside the display area of the key, to not perform the process corresponding to the key; and notifying the user, in the case where the key is released outside the display area of the key, that the process corresponding to the key is not performed.10-18-2012
20120262367FLEXIBLE ELECTRONIC DEVICE - A flexible electronic device can be operated in multiple operation modes. The flexible electronic device mainly includes a substrate, a flexible display module, at least one folding device, a button module, and a processing unit. The flexible display module is located above the substrate, and the at least one folding device is used for folding the display. When the display is unfolded, a main display is launched on the display. When the display is folded, the main display can be divided into a plurality of sub-display zones such that images or graphics can be displayed on one of the sub-display zones. The button module is located on one of the sub-display zones. The processing unit is electronically connected to the button module in order to operate the flexible electronic device.10-18-2012
20120229371Screen Rotation Lock Methods and Systems - Screen rotation lock methods and systems are provided. First, an angle between a specific plane of an electronic device and an absolute horizontal plane is detected using at least one sensor, and it is determined whether the angle equals a specific angle. When the angle equals the specific angle, a screen auto-rotation function of the electronic device is disabled.09-13-2012
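The check described here reduces to comparing a sensed angle against a lock angle; a hedged sketch follows, in which the tolerance band, the flat-on-a-table lock angle of 0 degrees, and the function name are assumptions:

```python
# Illustrative sketch: disable auto-rotation when the device plane is within
# a tolerance of the lock angle (e.g. lying flat). Values are assumptions.

def auto_rotation_enabled(angle_to_horizontal, lock_angle=0.0, tolerance=5.0):
    return abs(angle_to_horizontal - lock_angle) > tolerance

print(auto_rotation_enabled(2.0))    # lying flat on a table: locked
print(auto_rotation_enabled(70.0))   # held upright: rotation allowed
```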
20120229370System, Method and Device for Presenting Different Functional Displays When Orientation of the Device Changes - Different functional views for a mobile device are provided depending on orientation of the device. The mobile device includes an enclosure and a display disposed within the enclosure, wherein the display presents a functional view to a user when the device is positioned in a first orientation and a second functional view when the display is rotated to a second orientation.09-13-2012
20120229374ELECTRONIC DEVICE - An electronic device and methods are disclosed. The electronic device comprises a plurality of display screens, an operation module, and a control module. The operation module is operable by a user, and the control module is electrically coupled to the display screens and the operation module and comprises an operation display module. The operation display module is operable to display an active application, from among a plurality of running applications, on a first display screen from among the display screens. The operation display module is further operable to display operable information related to the active application on a second display screen when the operation module is operated by the user.09-13-2012
20120229376INPUT DEVICE - An input device displays, on a display device, a screen including character input buttons to each of which characters are assigned; each button specifies one of its assigned characters as the character to be inputted according to the number of times a selection operation is performed on it. A display area B displays, in order, the characters specified through selection operations on the character input buttons. From the character strings of words that each include the character string being inputted, the device determines candidate characters that can be inputted to follow that string; when a character specified to follow the string is not among the candidate characters, the device tones down that character when displaying it immediately after the string.09-13-2012
20120229372STORAGE MEDIUM HAVING INFORMATION PROCESSING PROGRAM STORED THEREON, INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING SYSTEM, AND INFORMATION PROCESSING METHOD - Data output from an input device on which the body of a user can ride, based on a load applied to the input device, is made usable. Data acquired from the input device is processed as follows: a center-of-gravity position of the load applied to the input device is repeatedly acquired based on the data output from the input device; a user direction of a stepping action made on the input device is calculated using the center-of-gravity position; and predetermined processing is performed based on the user direction.09-13-2012
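The center-of-gravity-to-direction step can be illustrated as follows; treating the user direction as the angle of the net center-of-gravity displacement is an assumption for illustration, not the patent's stated calculation.

```python
import math

def step_direction(cog_positions):
    """Estimate a user direction from successive center-of-gravity
    positions sampled from a load-sensing board: take the displacement
    from the first to the latest sample and return its angle in
    degrees (0 = +x axis, counterclockwise)."""
    (x0, y0) = cog_positions[0]
    (x1, y1) = cog_positions[-1]
    return math.degrees(math.atan2(y1 - y0, x1 - x0))
```

A shift of the center of gravity straight along +x yields 0 degrees, and straight along +y yields 90 degrees.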
20110122060OPTICAL NAVIGATION DEVICE - An optical navigation device that can sense the movement of an object, such as a user's finger, so that the movement can control a feature of a consumer digital device such as a cursor on a display screen. The device includes a substrate to which an LED, reflector, and image sensor are attached. Light from the LED is directed by the elliptical reflector toward and through a window that is transparent to the light from the LED and then is reflected off of the user's finger back through the window, through a lens, and onto the image sensor. The reflector is positioned to direct light toward the window at an oblique angle, in the range of 65 to 70 degrees from the normal to the window. Further, the reflector is curved to gather light across a large solid angle in the vicinity of the LED. The curved shape of the reflector may be a portion of an ellipsoid, and the LED may be located at one of the foci of the ellipsoid, with the window located at the other focus of the ellipsoid.05-26-2011
20110122059FINGER SENSING APPARATUS WITH SELECTIVELY OPERABLE TRANSMITTING/RECEIVING PIXELS AND ASSOCIATED METHODS - A finger sensing device may include an integrated circuit (IC) substrate and an array of pixels on the IC substrate. Each pixel may be selectively operable in at least a receiving mode for receiving radiation from an adjacent finger, or a transmitting mode for transmitting radiation into the adjacent finger. The finger sensing device may also include a controller coupled to the array of pixels for selectively operating at least one first pixel in the receiving mode while selectively operating at least one second pixel in the transmitting mode. Each pixel may also be selectively operable in a mask mode for neither receiving nor transmitting radiation. The controller may also selectively operate at least one third pixel in the mask mode while selectively operating the at least one first and second pixels in the receiving and transmitting modes.05-26-2011
20100328202INFORMATION DISPLAY DEVICE, INFORMATION DISPLAY METHOD, AND PROGRAM - An information display device, an information display method, and a program display, when showing definition information of an instance method selected by a user, the definition information that corresponds to the contents of processing of the instance method during runtime. The information display device displays definition information of an instance method selected by a user, wherein the instance method is included in a type that can be referenced during runtime by a variable that references a receiver object of the instance method.12-30-2010
20100328201Gesture Based User Interface Supporting Preexisting Symbols - A motion controlled handheld device includes a display having a viewable surface and operable to generate an image and a gesture database maintaining a plurality of gestures. Each gesture is defined by a motion of the device with respect to a first position of the device. The gestures comprise symbol gestures each corresponding to a character from a preexisting character set. The device includes an application database maintaining at least one application and a gesture mapping database comprising a gesture input map for the application. The gesture input map comprises mappings of the symbol gestures to corresponding inputs for the application. The device includes a motion detection module operable to detect motion of the handheld device within three dimensions and to identify components of the motion in relation to the viewable surface. The device also includes a control module operable to load the application, to track movement of the handheld device using the motion detection module, to compare the tracked movement against the symbol gestures to identify a matching symbol gesture, to identify, using the gesture input map, the corresponding input mapped to the matching symbol gesture, and to provide the corresponding input to the application.12-30-2010
20100328200Device and related method for converting display screen into touch panel screen - A device for converting display screen into touch panel screen includes a projection screen; an image projection unit for projecting an image on the projection screen; a light pointer for emitting a light spot signal of a specific wavelength on the projection screen; an image detecting unit, which includes a fisheye lens for receiving the image on the projection screen to generate a fisheye distorted image; and an optical filter for filtering out the optical energy except the light spot signal of the specific wavelength; an image processing unit, coupled to the image detecting unit, for calculating a first position of the light spot on the projection screen according to the fisheye distorted image; and a message transmitting unit, coupled to the image processing unit, for outputting a touch panel signal according to the calculating result of the image processing unit.12-30-2010
20080297471GESTURE RECOGNITION METHOD AND TOUCH SYSTEM INCORPORATING THE SAME - A gesture recognition method includes detecting multiple pointers in close proximity to a touch surface to determine if the multiple pointers are being used to perform a known gesture. When the multiple pointers are being used to perform a known gesture, a command associated with the gesture is executed. A touch system incorporating the gesture recognition method is also provided.12-04-2008
20080297470ELECTRONIC DOCUMENT READERS AND READING DEVICES - This invention generally relates to electronic document readers and reading devices, that is, to devices that present a document to a user on a display to enable the user to read the document. An electronic document reading device configured for one hand operation, the device including: a housing; an electroactive display mounted in said housing; control electronics coupled to said display; at least one user control coupled to said control electronics for operating said device; and a rechargeable power source configured to power said control electronics and said display; and wherein said housing has a width to fit at least on the palm of an adult human hand, said width being less than approximately 120 mm, and wherein said housing has a length of at least twice said width, and wherein said control electronics and said rechargeable power source are disposed within said housing so as to provide a centre of mass of said device which, when a lower part of said device is held in a said palm, is located at a distance of no greater than 50% of said length from a lower end of said device.12-04-2008
20080291161FORCE REFLECTING HAPTIC INTERFACE - A multi-function force reflecting haptic interface including various sub-assemblies is disclosed. The sub-assemblies include multiple function user interfaces, a user interface docking station for setting the interface to a home position, temperature monitoring and control systems, and various kinematic cable drive systems.11-27-2008
20100053079REMOTE CONTROL SYSTEM INCLUDING A DISPLAY PANEL AND A TERMINAL FOR REMOTE-CONTROLLING THE DISPLAY PANEL, AND REMOTE CONTROL METHOD IN THE REMOTE CONTROL SYSTEM - Disclosed are a remote control method in a remote control system including a display panel and a terminal for remote-controlling the display panel, and the remote control system itself. The remote control system includes a terminal for transmitting a light signal to a point on the screen of a display panel according to user input, and a display panel for executing an operation corresponding to the point at which the light signal is received.03-04-2010
20120326971PORTABLE INFORMATION TERMINAL AND SCREEN DISPLAY CONTROL METHOD - A portable information terminal includes: a first housing including a display section; a second housing connected to the first housing, an open angle formed by the second housing and the plane of the display section being changeable; an angle detection section whose output value changes according to an operation in which the first housing and the second housing are opened or closed; and a control section that causes the display section to execute a first display process if the output value of the angle detection section is equal to or greater than a threshold, and to execute a second display process if the output value is less than the threshold. One of the first display process and the second display process includes a process that causes information displayed on the display section to become less legible than information displayed under the other display process.12-27-2012
20120326972MULTI-TASK INTERACTIVE WIRELESS TELECOMMUNICATIONS DEVICE - A portable wireless telecommunications device has an electronic computer visual display that includes at least one central display screen and at least two additional screens respectively disposed foldably to the right and left of the central screen, for displaying simultaneous, multiple images to a user in a super video graphics array (SVGA). The foldable display has at least two mutually connected foldable sub-display units, and includes a user-attachable and -detachable connector for user assembly of the sub-display units into mutual connection with each other. Each of the sub-display units has user-deployable supports (e.g., a rigid angular support member) for maintaining the sub-display units in an upwardly projecting disposition during use.12-27-2012
20120326976Directed Performance In Motion Capture System - Techniques for enhancing the use of a motion capture system are provided. A motion capture system tracks movement and audio inputs from a person in a physical space, and provides the inputs to an application, which displays a virtual space on a display. Bodily movements can be used to define traits of an avatar in the virtual space. The person can be directed to perform the movements by a coaching avatar, or visual or audio cues in the virtual space. The application can respond to the detected movements and voice commands or voice volume of the person to define avatar traits and initiate pre-scripted audio-visual events in the virtual space to provide an entertaining experience. A performance in the virtual space can be captured and played back with automatic modifications, such as alterations to the avatar's voice or appearance, or modifications made by another person.12-27-2012
20120326970ELECTRONIC DEVICE AND METHOD FOR CONTROLLING DISPLAY OF ELECTRONIC FILES - An electronic device displays electronic files on a display device. When a user views the electronic device, a video camera captures a real-time video of the user consisting of a plurality of frames. The electronic device recognizes a face region in each frame and a lip outline in the face region, and generates a lip shape variation video of the user's lips according to the lip outline in each frame and the capturing time of each frame. Furthermore, the electronic device searches the preset lip-language videos pre-stored in a storage device for one that matches the lip shape variation video, and controls display of the electronic files by executing a voice command associated with the matched preset lip-language video.12-27-2012
20120326969IMAGE SLIDESHOW BASED ON GAZE OF A USER - Provided is a method to enable an image slideshow based on gaze of a user. The method displays an image to a user for viewing, wherein, invisible to the user, each image is divided into a grid of tiles. While the user is viewing the image, the method detects the gaze of the user to identify regions of the image that are of interest to the user. The identified regions of interest are mapped to the grid of tiles on the image, to recognize tiles of interest to the user. The tiles of interest for the image are calculated and another image is presented to the user for viewing when the number of tiles of interest exceeds a threshold value previously computed for the user.12-27-2012
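The tile-mapping and threshold test described in the abstract above might look like this; the grid size, coordinate convention, and function names are assumptions for illustration.

```python
def gaze_to_tiles(gaze_points, image_w, image_h, grid=(4, 4)):
    """Map gaze fixation points onto a grid of tiles laid over the
    image, returning the set of (col, row) tiles the user looked at."""
    cols, rows = grid
    tiles = set()
    for x, y in gaze_points:
        col = min(int(x * cols / image_w), cols - 1)
        row = min(int(y * rows / image_h), rows - 1)
        tiles.add((col, row))
    return tiles

def should_advance(gaze_points, image_w, image_h, threshold, grid=(4, 4)):
    """Advance the slideshow once the number of distinct tiles of
    interest exceeds the per-user threshold."""
    return len(gaze_to_tiles(gaze_points, image_w, image_h, grid)) > threshold
```

On a 100x100 image with a 2x2 grid, fixations at (10, 10), (90, 90), and (90, 10) cover three distinct tiles, so a threshold of 2 triggers the advance and a threshold of 3 does not.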
20120326968CONTROL APPARATUS AND METHOD, RECORDING MEDIUM AND PROGRAM - The present invention relates to a control apparatus and method, a recording medium, and a program, which enable at least one first device to be controlled more efficiently and quickly through the use of a second device. The control apparatus detects the at least one first device; requests, from the second device, at least one piece of first operation panel information corresponding to the at least one first device; displays the at least one first operation panel; and controls the at least one first device.12-27-2012
20120326966GESTURE-CONTROLLED TECHNIQUE TO EXPAND INTERACTION RADIUS IN COMPUTER VISION APPLICATIONS - The invention describes a method and apparatus to expand a radius of the interaction in computer vision applications with the real world within the field of view of a display unit of a device. The radius of interaction is expanded using a gesture in front of an input sensory unit, such as a camera that signals the device to allow a user the capability of extending and interacting further into the real and augmented world with finer granularity. In one embodiment, the device electronically detects the gesture generated by the user's extremity as obtained by the camera coupled to the device. In response to detecting the gesture, the device changes the shape of a visual cue on the display unit coupled to the device, and updates the visual cue displayed on the display unit.12-27-2012
20120326967VEHICULAR GLANCE LIGHTING APPARATUS AND A METHOD FOR CONTROLLING THE SAME - The present invention provides a vehicular glance lighting apparatus, and a method for controlling the same, that present vehicle-running information to a driver as the light of a background screen. The apparatus includes a driver-attention function based on prompt information delivery: the driver visually recognizes, as a background screen, the vehicle-running information essentially or minimally needed for running the vehicle, without the information demanding the driver's attention or dissipating the driver's sight, thereby preventing the driver's driving attention from being diverted. Because the vehicle-running information is output in the form of light as a background screen, the driver can perceive it without distraction or obstruction, enhancing the driver's ability to cope with travel risks while the vehicle is moving.12-27-2012
20120326964INPUT DEVICE AND COMPUTER-READABLE RECORDING MEDIUM CONTAINING PROGRAM EXECUTED BY THE INPUT DEVICE - A character input device, including: a display control section to display, in a first region, an operational-element group composed of operational elements corresponding to characters and to display, in a second region, another operational-element group composed of operational elements corresponding to characters, the characters corresponding to the respective operational-element groups displayed in the first and the second regions being different in type; a first input processing section to perform, upon detection of an operation on the first region, input processing of a character specified by the operation, among the characters to which the operational elements of the operational-element group displayed in the first region correspond; and a second input processing section to perform, upon detection of an operation on the second region, input processing of a character specified by the operation, among the characters to which the operational elements of the operational-element group displayed in the second region correspond.12-27-2012
20120326965METHODS AND APPARATUS FOR PROCESSING COMBINATIONS OF KINEMATICAL INPUTS - Methods and apparatus for processing combinations of force and velocity data generated over a given period of time. In one embodiment, an input device comprising one or more force sensors and one or more motion sensors is manipulated relative to a surface. A receiving system is adapted to receive input sequences or “gestures” which are triggered upon the occurrence of one or more conditions detected by the input device. An application executing in the receiving system may be implemented such that the system responds differently to each specific gesture provided by the user.12-27-2012
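The abstract above describes input "gestures" triggered when force and motion conditions hold over a period of time; a minimal sketch under assumed threshold rules (the abstract does not specify the conditions) follows.

```python
def match_gesture(samples, gestures):
    """Given a sequence of (force, velocity) samples collected over a
    period of time, return the name of the first gesture whose trigger
    condition holds for every sample, or None if no gesture matches.
    The conditions themselves are caller-supplied predicates."""
    for name, condition in gestures:
        if samples and all(condition(f, v) for f, v in samples):
            return name
    return None
```

For example, with an assumed "hard-press" (high force, nearly still) and "swipe" (high velocity) rule set, high-force low-velocity samples match the former and fast samples match the latter.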
20120326963FAST FINGERTIP DETECTION FOR INITIALIZING A VISION-BASED HAND TRACKER - Systems and methods for initializing real-time, vision-based hand tracking systems are described. The systems and methods for initializing the vision-based hand tracking systems image a body and receive gesture data that is absolute three-space data of an instantaneous state of the body at a point in time and space, and at least one of determine an orientation of the body using an appendage of the body and track the body using at least one of the orientation and the gesture data.12-27-2012
20120326960SCANNING TECHNOLOGY - Scanning technology, in which one or more electronic communications that designate a scan area of an object are received from an input apparatus. A preview scan area is displayed based on the received one or more electronic communications that designate the scan area of the object.12-27-2012
20120326958DISPLAY AND USER INTERFACE - A touchless method for registering commands from a display (e.g. reconfigurable display) may include any of various components. The method may use a light sensor in front of or behind the display to detect light reflected by a user's finger approaching a control option displayed on the display. Light used to display images may be provided at a frequency and/or time that can be identified by a processor connected to the light sensor, or can possess some other unique property (e.g. color) which may be distinguished by the processor.12-27-2012
20110037692APPARATUS FOR DISPLAYING AN IMAGE AND SENSING AN OBJECT IMAGE, METHOD FOR CONTROLLING THE SAME, PROGRAM FOR CONTROLLING THE SAME, AND COMPUTER-READABLE STORAGE MEDIUM STORING THE PROGRAM - The present invention includes a display/optical sensor section.02-17-2011
20110037691THREE-DIMENSIONAL OBJECT DISPLAY CONTROL SYSTEM AND METHOD THEREOF - Provided is a three-dimensional object display control system that enables a three-dimensional object displayed on a screen to be freely rotated by means of intuitive operation.02-17-2011
20110037690ELECTRONIC ALBUM - An electronic album for presenting media content comprises a cover member and a frame member. The cover member and the frame member are hingedly coupled to each other to facilitate opening and closing of the electronic album. The electronic album further comprises a data input port, a user interface unit, a processor, and a display screen. The data input port, the user interface unit and the display screen are integrated in the frame member and are electrically connected to the processor. The data input port is capable of receiving data associated with the media content. Further, the user interface unit is capable of receiving at least one user input thereon and the processor is configured to process the data based on the at least one user input. The display screen is configured to present the media content based on the processed data.02-17-2011
20120268360User Identified to a Controller - Methods, systems, and computer programs for configuring a computer program based on a user are provided. One method includes an operation for detecting, by a controller, an object carried by a user, where the object includes a parameter value, e.g., a radio-frequency identification (RFID) tag, that uniquely identifies the object from a plurality of objects. The parameter value is transmitted to a computing device executing the computer program, and the computer program determines whether it has user information associated with the transmitted parameter value. The computer program is configured utilizing the user information when the computer program has the user information for the parameter value.10-25-2012
20120268368Method, Apparatus, and Article for Force Feedback Based on Tension Control and Tracking Through Cables - A haptic interface system includes a cable based haptic interface device and a controller. The controller receives information related to movement of a grip in real-space and generates a stereoscopic output for a display device. The stereoscopic output includes images of a virtual reality tool whose motions mimic motions of the real-space grip.10-25-2012
20120268359CONTROL OF ELECTRONIC DEVICE USING NERVE ANALYSIS - An electronic device may be controlled using nerve analysis by measuring a nerve activity level for one or more body parts of a user of the device using one or more nerve sensors associated with the electronic device. A relationship can be determined between the user's one or more body parts and an intended interaction by the user with one or more components of the electronic device using each nerve activity level determined. A control input or reduced set of likely actions can be established for the electronic device based on the relationship determined.10-25-2012
20120268365METHOD, SYSTEM, AND APPARATUS FOR CONTROLLING LIGHT - Methods, systems, and apparatuses for controlling light are provided. A method of controlling light includes: displaying, on a display unit, at least one light property representing a property of the light of a lighting apparatus; if a user command for selecting a property value of the light property is input, displaying a lighting state corresponding to the selected property value; and if a user command indicating that selection is complete is input, setting the displayed lighting state as the lighting state of the lighting apparatus.10-25-2012
20120268366Method and Device for Visual Compensation10-25-2012
20120326962Data Processing Device - When a first device and a first attribute are specified, a data processing device registers the first device in association with the first attribute and transmits, to the first device, an instruction to perform a process relating to the first attribute. If the first device possesses a second attribute, the data processing device registers the first device in association with the second attribute. The data processing device displays, on a display unit, a first image representing identification data of the first device when the second attribute is specified and the first device is registered in association with the second attribute. When the first image is selected, the data processing device transmits an instruction to perform a process relating to the second attribute to the first device corresponding to the selected first image.12-27-2012
20120326959REGION OF INTEREST SEGMENTATION - A sensor manager provides dynamic input fusion using thermal imaging to identify and segment a region of interest. A thermal overlay is used to focus heterogeneous sensors on regions of interest according to optimal sensor ranges and to reduce ambiguity of objects of interest. In one implementation, a thermal imaging sensor locates a region of interest that includes an object of interest within predetermined wavelengths. Based on the thermal imaging sensor input, the region on which each of the plurality of sensors is focused, and the parameters each sensor employs to capture data from a region of interest, are dynamically adjusted. The thermal imaging sensor input may be used during data pre-processing to dynamically eliminate or reduce unnecessary data and to dynamically focus data processing on sensor input corresponding to a region of interest.12-27-2012
20120326975INPUT DEVICE AND INPUT METHOD - The present invention discloses an input device and an input method. The input device comprises: an image acquiring device for acquiring a sequence of images as an object moves within the field of view of the image acquiring device; and a processor for generating a rotation signal according to the rotation status of the object.12-27-2012
20120326973DISPLAY DEVICE - A display device such as a portable phone is disclosed.12-27-2012
20120319937PRESSURE DETECTING USER INPUT DEVICE - A pressure-sensitive device, and operating method thereof, that detects the location and magnitude of an applied pressure. The device has a base and a deformable surface that enclose a deformable material having an electrical impedance. Impedance measurements are made at multiple points along the base and the deformable surface. A contour of the deformable surface, caused by the applied pressure, is estimated. The position of the pressure relative to the center of the device is estimated based on the contour and is interpreted as a user input specifying a direction. The amount of applied pressure is also estimated based on the contour and is interpreted as a user input specifying a magnitude. The user input specifying direction and magnitude is provided for processing by any application, such as controlling a user interface.12-20-2012
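One way to turn point-wise deformation estimates into the direction and magnitude outputs described above is a deformation-weighted centroid; this is purely illustrative and not the patent's estimation method, and the input representation is assumed.

```python
def interpret_pressure(deformation_map, center):
    """From per-point deformation estimates {(x, y): deformation},
    estimate the contact point as the deformation-weighted centroid,
    then report a direction (vector from the device center to the
    contact point) and a magnitude (the peak deformation)."""
    total = sum(deformation_map.values())
    cx = sum(x * d for (x, y), d in deformation_map.items()) / total
    cy = sum(y * d for (x, y), d in deformation_map.items()) / total
    direction = (cx - center[0], cy - center[1])
    magnitude = max(deformation_map.values())
    return direction, magnitude
```

Two sample points at (0, 0) and (2, 0) with deformations 1.0 and 3.0 give a centroid at (1.5, 0), so relative to a center at (1, 0) the direction is (0.5, 0.0) and the magnitude is 3.0.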
20100026623VELOCITY STABILIZATION FOR ACCELEROMETER BASED INPUT DEVICES - A method and apparatus for reducing or eliminating tracking errors associated with accelerometer-based input devices. The method and apparatus calculate a velocity of the input device, determine whether the calculated velocity indicates a motion tracking error, and ignore the calculated velocity if a motion tracking error is indicated.02-04-2010
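The calculate-check-ignore sequence described above can be sketched as follows; the plausibility bound used to flag a tracking error is an assumed heuristic, not the patent's detection criterion.

```python
def stabilized_velocity(prev_velocity, accel, dt, max_speed=10.0):
    """Integrate an acceleration sample into a velocity estimate, but
    treat an implausibly large result as a motion tracking error and
    fall back to the previous velocity instead of the bad sample."""
    v = prev_velocity + accel * dt
    if abs(v) > max_speed:  # tracking error indicated: ignore this sample
        return prev_velocity
    return v
```

A normal sample (accel 2.0 over 0.5 s from velocity 1.0) yields 2.0; an outlier (accel 1000.0) is rejected and the previous velocity 1.0 is kept.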
20100194678DIAGONAL MOVEMENT OF A TRACKBALL FOR OPTIMIZED NAVIGATION - A handheld communication device with a trackball-based cursor navigation tool includes a display screen and a plurality of trackball roll-direction detectors, each engaging the trackball and primarily actuated by one of a vertical, horizontal, or diagonal rotation of the trackball relative to the display screen. The roll-direction detectors can each have a roller in contact with the trackball and rotatable about a corresponding rotational axis, and can each also have a sensor for sensing the rotation of the trackball and outputting a signal representative of the amount of rotation. A microprocessor receives the signals output from the roll-direction detectors, processes them to determine whether primarily vertical, horizontal, or diagonal movement has been detected, and outputs corresponding control signals to a screen cursor control to affect the movement of a cursor on the display screen.08-05-2010
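The vertical/horizontal/diagonal decision from the roller signals could be sketched as a ratio test; the decision rule, threshold, and names here are assumptions for illustration, not the patent's method.

```python
def classify_roll(vertical_counts, horizontal_counts, diagonal_ratio=0.5):
    """Classify a trackball movement as vertical, horizontal, or
    diagonal by comparing the rotation counts sensed on each roller:
    when the smaller component is at least diagonal_ratio of the
    larger one, the movement is treated as primarily diagonal."""
    v, h = abs(vertical_counts), abs(horizontal_counts)
    if v == 0 and h == 0:
        return "none"
    if min(v, h) >= diagonal_ratio * max(v, h):
        return "diagonal"
    return "vertical" if v > h else "horizontal"
```

Counts of (10, 1) classify as vertical, (1, 10) as horizontal, and (8, 7) as diagonal.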
20120319942DISPLAY APPARATUS FOR SETTING REMOTE CONTROLLER DEVICE AND DISPLAYING METHOD THEREOF - A display apparatus, a displaying method thereof, and a remote controller device are provided. The displaying method, for a display apparatus which is connected to at least one external device and which communicates with a remote controller device, includes: acquiring selection conditions for the at least one external device connected to the display apparatus; and, if an external device from among the at least one external device is selected according to the selection conditions, displaying a first user interface (UI) comprising a setting code for setting the selected external device as a control object of the remote controller device, wherein the setting code comprises code values corresponding to interface types of the display apparatus and the selected external device.12-20-2012
20120319944CONTROL SYSTEM EQUIPPED WITH PROGRAMMABLE DISPLAY, PROGRAMMABLE DISPLAY, AND DRAWING DATA GENERATION MEANS - A control system including a programmable display and an external device that is connected to the programmable display, wherein the external device stores specific display control information into a first device for each piece of display designation information in correspondence with that display designation information; and wherein the programmable display includes a first memory block that stores the display designation information and all display-specific communication setting information, and a control block that accesses the first device of the external device according to the display designation information and the all display-specific communication setting information to acquire the specific display control information corresponding to the display designation information, stores the acquired specific display control information into a second memory block, and controls the programmable display according to the specific display control information stored in the second memory block.12-20-2012
20100231508Systems and Methods for Using Multiple Actuators to Realize Textures - Systems and methods for using multiple actuators to realize textures are disclosed. For example, one disclosed system includes: a first actuator configured to receive a first haptic signal and output a first haptic effect based at least in part on the first haptic signal; a second actuator configured to receive a second haptic signal and output a second haptic effect based at least in part on the second haptic signal; and a processor configured to determine the first haptic effect and the second haptic effect, the first and second haptic effects being configured, when combined, to output a texture, and to transmit the first haptic signal to the first actuator and the second haptic signal to the second actuator.09-16-2010
20100231507METHOD AND APPARATUS FOR PROVIDING CONTENT AND METHOD AND APPARATUS FOR DISPLAYING CONTENT - Provided are a method and apparatus for providing digital content and a method and apparatus for displaying digital content. In the method of displaying content in a terminal, situational information including at least one of information regarding a user of the terminal and information regarding an external environment is collected; whether the collected situational information conforms to display conditions of the content is determined; and the content is then selectively displayed based on the result of the determination.09-16-2010
20100231506CONTROL OF APPLIANCES, KITCHEN AND HOME - The disclosed invention is generally in the field of control of appliances in the home, and in their networking and connectivity, also with audio systems and internet sources, and the integration of these elements in a connected manner. Preferred apparatus generally employs a video projection system and one or more TV cameras. Embodiments of the invention may be used to enhance the social interaction and enjoyment of persons in the kitchen and reduce the work of food preparation. The invention may be used in many rooms of the house, and contribute to the well-being of seniors and others living therein.09-16-2010
20100231510DUAL FILM LIGHT GUIDE FOR ILLUMINATING DISPLAYS - A front light guide panel including a plurality of embedded surface features is provided. The front light panel is configured to deliver uniform illumination from an artificial light source disposed at one side of the front light panel to an array of display elements located behind the front light guide, while allowing for the option of illumination from ambient lighting transmitted through the light guide panel. The embedded surface relief features create air pockets within the light guide panel. Light incident on the side surface of the light guide propagates through the light guide until it strikes an air/light-guide material interface at one of the air pockets. The light is then turned by total internal reflection through a large angle such that it exits an output face disposed in front of the array of display elements.09-16-2010
20100231509Sterile Networked Interface for Medical Systems - One embodiment of a sterile networked interface system is provided comprising a hand-held surgical tool and a data processing system. The surgical tool includes a sensor for sensing a physical variable related to the surgery, a wireless communication unit to transmit the physical variable to the data processing system, and a battery for powering the hand-held surgical tool. The surgical tool sends the physical variable and orientation information responsive to a touchless gesture control and predetermined orientation of the surgical tool. Other embodiments are disclosed.09-16-2010
20100231505INPUT DEVICE USING SENSORS MOUNTED ON FINGER TIPS - In order to allow key input without operating a keyboard, or to allow click and drag operations by the fingertips of the hands, the present invention is primarily characterized in that a user wears a glove (09-16-2010
20100231503CHARACTER INPUT SYSTEM, CHARACTER INPUT METHOD AND CHARACTER INPUT PROGRAM - The grouping unit assigns an m-digit value expressed in n-ary notation to each input candidate character, one-for-one, to classify the input candidate characters into n groups on the basis of each of the m digits. The group displaying unit causes the display device to display, together on a group basis, the input candidate characters classified on a digit basis. The input device has n selection keys corresponding to the respective groups; when a user operates one of them, the input device outputs information indicating which selection key was operated to the character structuring unit. The character structuring unit determines a character according to the information input from the input device.09-16-2010
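The m-digit n-ary grouping in 20100231503 above can be sketched as follows; the enumeration order used to assign codes is a hypothetical choice, since the abstract does not specify one:

```python
def assign_codes(chars, n):
    """Assign each candidate character an m-digit base-n code,
    where m is the smallest value with n**m >= len(chars)."""
    m = 1
    while n ** m < len(chars):
        m += 1
    codes = {}
    for i, ch in enumerate(chars):
        digits, v = [], i
        for _ in range(m):
            digits.append(v % n)
            v //= n
        codes[ch] = tuple(reversed(digits))
    return codes, m

def group(codes, digit_index, key):
    """Characters whose code has value `key` at `digit_index` --
    the group displayed together when that selection key applies."""
    return [ch for ch, code in codes.items() if code[digit_index] == key]
```

A user then narrows down to a single character by pressing one of the n selection keys once per digit, m times in total.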
20110248910METHOD FOR COMPARING TWO CONTINUOUS COMPUTER GENERATED LINES GENERATED BY THE MOVEMENTS OF A COMPUTER MOUSE OR A DIGITIZER TABLET - A method is provided for comparing two continuous computer generated lines by using a combination of time, area, velocity and angular velocity and to use a weighted distribution of 10 for time, 40 for area, 40 for velocity and 10 for angular velocity. The method returns true if the sum of the product of the weighted distribution with the percentage difference for time, area, velocity, and angular velocity is greater than or equal to 80, otherwise, the method returns false.10-13-2011
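The weighted comparison in 20110248910 above can be written directly from the abstract; a minimal sketch, assuming "percentage difference" means a per-feature similarity score in [0, 100] (the abstract does not define the per-feature computation):

```python
# Weighted distribution from the abstract: 10/40/40/10.
WEIGHTS = {"time": 10, "area": 40, "velocity": 40, "angular_velocity": 10}

def similarity(a, b):
    """Hypothetical per-feature similarity in percent."""
    if a == b:
        return 100.0
    return max(0.0, 100.0 * (1 - abs(a - b) / max(abs(a), abs(b))))

def lines_match(line1, line2, threshold=80.0):
    """True when the weight-averaged similarity of the two lines'
    time/area/velocity/angular-velocity features reaches 80."""
    score = sum(w / 100.0 * similarity(line1[k], line2[k])
                for k, w in WEIGHTS.items())
    return score >= threshold
```

Two identical measurement sets score 100 and match; a large velocity mismatch alone can already pull the score below 80 because velocity carries 40% of the weight.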
20120319945SYSTEM AND METHOD FOR REPORTING DATA IN A COMPUTER VISION SYSTEM - Embodiments of the present invention disclose a system and method for reporting data in a computer vision system. According to one embodiment, the presence of an object is detected within a display area of a display panel via at least one three-dimensional optical sensor. Measurement data associated with the object is received, and a processor extracts at least one set of at least seven three-dimensional target coordinates from the measurement data. Furthermore, a control operation for the computer vision system is determined based on the at least one set of target coordinates.12-20-2012
20120319940Wearable Digital Input Device for Multipoint Free Space Data Collection and Analysis - A new wearable computer input device, referred to as Imagine, may be used to control electronic devices in a natural, intuitive, convenient and comfortable manner, with a form factor that does not impede normal daily or business activities. For example, an Imagine may serve as an alternative to input devices such as a mouse, keyboard, or game controller. An Imagine device is able to recognize complex gestures, such as a person signing American Sign Language. An Imagine device may include a plurality of motion sensors affixed to a user's fingers and a plurality of motion sensors affixed to a user's wrists, a processing component, and a communication component designed to communicate with a second electronic device.12-20-2012
20120319938HAPTIC THEME FRAMEWORK - A haptic theme system is provided that can create a haptic theme, where a haptic theme is an installable package that includes one or more haptic effects, and a mapping of the one or more haptic effects to one or more user interface (“UI”) events of a device. The haptic theme can be installed on the device, and the device can then dynamically load and play a haptic theme in real-time. The haptic theme system can display one or more haptic themes within a user interface. Upon receiving a selection, the haptic theme system can generate haptic feedback based on the haptic effect that is mapped to a received user interface event within the mapping.12-20-2012
20120319943INFORMATION PROCESSING DEVICE, DISPLAY CONTROL METHOD, PROGRAM AND RECORDING MEDIUM - [Issue] An aim of the present invention is to provide an information processing apparatus, a display control method, a program and a computer-readable recording medium that improve operability in accordance with the open/close angle of the device.12-20-2012
20120326961GESTURE BASED USER INTERFACE FOR AUGMENTED REALITY - Technologies are generally described for systems and methods effective to provide a gesture keyboard that can be utilized with a virtual display. In an example, the method includes receiving sensory information associated with an object in proximity to, or in contact with, an input device including receiving at least one level of interaction differentiation detected from at least three levels of interaction differentiation, interpreting a command from the sensory information as a function of the at least one level of interaction differentiation, and outputting an action indication based on the command.12-27-2012
20120326974TERMINAL DEVICE, METHOD FOR SETTING SAME, AND COMMUNICATION SYSTEM - A terminal device includes reading means and setting means. The reading means reads setup information added to image data from the image data. The setting means implements a setting on the basis of the setup information read by the reading means.12-27-2012
20120326977Hybrid Control Of Haptic Feedback For Host Computer And Interface Device - A hybrid haptic feedback system in which a host computer and haptic feedback device share processing loads to various degrees in the output of haptic sensations, and features for efficient output of haptic sensations in such a system. A haptic feedback interface device in communication with a host computer includes a device microcontroller outputting force values to the actuator to control output forces. In various embodiments, the microcontroller can determine force values for one type of force effect while receiving force values computed by the host computer for a different type of force effect. For example, the microcontroller can determine closed loop effect values and receive computed open loop effect values from the host; or the microcontroller can determine high frequency open loop effect values and receive low frequency open loop effect values from the host. Various features allow the host to efficiently stream computed force values to the device.12-27-2012
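The division of labour in 20120326977 above can be illustrated with a toy force computation; the spring/damper form of the closed-loop term and its gains are illustrative assumptions, not the patent's actual control law:

```python
def device_force(position, velocity, host_open_loop_value):
    """Closed-loop term computed on the device microcontroller,
    summed with an open-loop force value streamed from the host."""
    k, b = 2.0, 0.5          # spring and damping gains (assumed)
    closed_loop = -k * position - b * velocity
    return closed_loop + host_open_loop_value
```

The microcontroller can evaluate the position/velocity-dependent part at a high local rate, while the host streams the slower, precomputed open-loop component.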
20120319939PARAMETER INPUT APPARATUS AND IMAGE FORMING APPARATUS - A parameter input apparatus for inputting a parameter, including: a control portion configured to display, in a display portion, an operational screen for inputting the parameter; and an operating portion for operating the operational screen, wherein the control portion is configured to display, in the display portion, the operational screen that includes a target area which is a target of an operation by the operating portion and is configured such that a view of the target area and a view of the operating portion are made uniform when the operational screen is displayed.12-20-2012
20120092251OPERATION SYSTEM FOR VEHICLE - An operation system for a vehicle includes: a display unit; a proximity detecting unit that detects that a part of a human body approaches the display unit; a position specifying unit that specifies positions on the display unit where the part of the human body approached, on the basis of a detection result of the proximity detecting unit; and a display controlling unit that causes an icon display screen, which displays icons corresponding to respective operation items of a plurality of in-vehicle devices, and a selection screen, which selects set values or operation modes of the operation items corresponding to the icons displayed at the positions when the positions are specified by the position specifying unit, to be displayed on the display unit; wherein the icons are formed of at least characters or figures representing the set values or the operation modes that are currently selected, and wherein the display controlling unit causes the icon display screen, on which all operable icons are displayed, to be displayed on one screen of the display unit.04-19-2012
20120092250FINGER-OPERATED INPUT DEVICE - A finger-operated input device connectable to a digital device is disclosed. The input device comprises (a) a sensing pad responsive to displacement of the finger substantially parallel to the pad within its reach range, transmitting electrical signals in response thereto; and (b) an analyzing unit arranged to receive said electrical signals, calculate data defining the trajectory of the finger, and transmit the data to the digital device. The sensing pad comprises at least three sensors perceptive to motion of the finger relative to the sensors.04-19-2012
20120092249Accessing Accelerometer Data - Systems and processes for accessing acceleration data may include an accelerometer coupled to a nonvolatile memory. The nonvolatile memory may be coupled to a processor. Acceleration data may be obtained from the accelerometer via a bus coupling the nonvolatile memory to the accelerometer. Acceleration data may be sent from the nonvolatile memory to a processor. One or more operations may be performed based on the acceleration data.04-19-2012
20120092248METHOD, APPARATUS, AND SYSTEM FOR ENERGY EFFICIENCY AND ENERGY CONSERVATION INCLUDING DYNAMIC USER INTERFACE BASED ON VIEWING CONDITIONS - In general, in one aspect, a viewing configuration detector collects data for a viewing area of a consumer electronics device and determines a viewing configuration based on the collected data. A dynamic user interface controller determines a perceived appropriate configuration for the content presented on the consumer electronics device based on the viewing configuration and, if necessary, modifies the configuration of the content. The modification is done without receiving input from a user and may be done to conserve power or enhance the user experience. The viewing configuration detector may include a light transmitter to transmit light in the direction of the user and a light receiver to receive reflected light and determine the distance and/or location of the user based thereon. The viewing configuration detector may include a camera to capture images of the viewing area and image recognition functionality to detect the user and different attributes associated therewith.04-19-2012
20120092247COMPUTER INPUT DEVICE - Computer input devices are described herein for use in manipulating digital images on a display apparatus.04-19-2012
20120092246ELECTRONIC APPARATUS AND CONTROL METHOD THEREOF, AND EXTERNAL APPARATUS AND CONTROL METHOD THEREOF - An electronic apparatus and a control method thereof, and an external apparatus and a control method thereof. The electronic apparatus includes a communication unit which communicates with an external apparatus, and a controller which performs an operation corresponding to a key signal received through the communication unit, receives a setup signal through the communication unit to reset up an operation corresponding to a first key signal, and performs a second operation based on a result of the reset up when the first key signal is received through the communication unit. The setup signal sets up the first operation corresponding to the first key signal to be replaced by the second operation different from the first operation.04-19-2012
20120092245MOBILE DEVICE WITH ROTATABLE PORTION FOR INPUT MECHANISM - Disclosed herein is a mobile device comprising a base providing a display and a rotatable portion providing an input mechanism. The rotatable portion is rotatable about a coupling between at least a first position in which the rotatable portion covers a part of the display and a second position in which the rotatable portion exposes more of the display while leaving the input mechanism accessible to a user. In at least one variant embodiment, the rotatable portion may be rotatable to a third position “behind” the base with the input mechanism remaining accessible to the user.04-19-2012
20120092244ELECTRONIC CONTROL MODULE INTERFACE SYSTEM FOR A MOTOR VEHICLE - An electronic control module interface system may include a sensing surface having a symbol representative of a function of a motor vehicle, a sensing device coupled to the sensing surface, configured to sense a proximity between a physical pointer and the sensing surface and to generate a browsing signal when the proximity is less than a predefined proximity threshold value, and configured to generate a selection signal when the physical pointer contacts or depresses the sensing surface, and an output device coupled to the sensing device, and configured to produce an output representative of the function of the motor vehicle upon receiving the browsing signal.04-19-2012
20120287038Body Scan - A depth image of a scene may be received, observed, or captured by a device. The depth image may then be analyzed to determine whether the depth image includes a human target. For example, the depth image may include one or more targets including a human target and non-human targets. Each of the targets may be flood filled and compared to a pattern to determine whether the target may be a human target. If one or more of the targets in the depth image includes a human target, the human target may be scanned. A skeletal model of the human target may then be generated based on the scan.11-15-2012
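The flood-fill step in 20120287038 above can be sketched as a breadth-first grouping of depth pixels; the depth tolerance value is an assumed parameter:

```python
from collections import deque

def flood_fill(depth, start, tol=50):
    """Group connected pixels whose depth differs from a neighbour
    by at most `tol`, isolating one candidate target in the scene."""
    h, w = len(depth), len(depth[0])
    seen, queue = {start}, deque([start])
    while queue:
        y, x = queue.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if (0 <= ny < h and 0 <= nx < w and (ny, nx) not in seen
                    and abs(depth[ny][nx] - depth[y][x]) <= tol):
                seen.add((ny, nx))
                queue.append((ny, nx))
    return seen
```

Each filled region would then be compared against a body pattern to decide whether it is the human target.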
20120287039USER INTERFACE FOR APPLICATION SELECTION AND ACTION CONTROL - Example embodiments disclosed herein relate to a computing device including a processor and a machine-readable storage medium, which may include instructions for displaying a first interface area in a user interface, the first interface area including a plurality of application selection controls, each corresponding to an application accessible to the computing device. The storage medium may further include instructions for displaying a second interface area in the user interface, the second interface area including a plurality of action controls, wherein each action control is associated with a function of the application corresponding to a currently-selected application selection control. Finally, the storage medium may include instructions for displaying a third interface area in the user interface, the third interface area comprising an interface of the application corresponding to the currently-selected application selection control. Example methods and machine-readable storage media are also disclosed.11-15-2012
20090109173Multi-function computer pointing device - A first aspect of the present invention includes an N-persistent-mode pointing device and a 2N-mixed-mode pointing device. A second aspect of the present invention is an unconventional method for generating target-motion signals with common motion controls. This method utilizes a qualification-rule-based dynamic mapping of the present invention. A third aspect of the present invention combines the multi-mode designs and the dynamic mapping.04-30-2009
20120287031PRESENCE SENSING - One embodiment may take the form of a method of operating a computing device to provide presence based functionality. The method may include operating the computing device in a reduced power state and collecting a first set of data from a first sensor. Based on the first set of data, the computing device determines if an object is within a threshold distance of the computing device and, if the object is within the threshold distance, the device activates a secondary sensor to collect a second set of data. Based on the second set of data, the device determines if the object is a person. If the object is a person, the device determines a position of the person relative to the computing device and executes a change of state in the computing device based on the position of the person relative to the computing device. If the object is not a person, the computing device remains in a reduced power state.11-15-2012
20120287035Presence Sensing - One embodiment may take the form of a method of operating a computing device in a reduced power state and collecting a first set of data from at least one sensor. Based on the first set of data, the computing device determines a probability that an object is within a threshold distance of the computing device and, if so, the device activates at least one secondary sensor to collect a second set of data. Based on the second set of data, the device determines if the object is a person. If it is a person, a position of the person relative to the computing device is determined and the computing device changes its state based on the position of the person. If the object is not a person, the computing device remains in a reduced power state.11-15-2012
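The two-stage wake-up described in the two presence-sensing abstracts above can be sketched as a small state machine; the distance threshold and the classifier interface are assumptions for illustration:

```python
REDUCED, ACTIVE = "reduced-power", "active"

class PresenceDevice:
    def __init__(self, threshold=1.0):          # metres (assumed)
        self.state = REDUCED
        self.threshold = threshold

    def tick(self, proximity, frame=None, classify=None):
        # Stage 1: a cheap always-on sensor decides whether anything is near.
        if proximity >= self.threshold:
            self.state = REDUCED
            return
        # Stage 2: only now is the secondary sensor consulted,
        # so the expensive classifier runs only when something is close.
        if classify is not None and classify(frame) == "person":
            self.state = ACTIVE
        else:
            self.state = REDUCED
```

The power saving comes from gating the second sensor on the first: the device stays in the reduced-power state unless the cheap proximity check fires.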
20120287036PORTABLE TERMINAL DEVICE HAVING AN ENLARGED-DISPLAY FUNCTION, METHOD FOR CONTROLLING ENLARGED DISPLAY, AND COMPUTER-READ-ENABLED RECORDING MEDIUM - The disclosed portable terminal device is provided with: a first display unit (11-15-2012
20120287034METHOD AND APPARATUS FOR SHARING DATA BETWEEN DIFFERENT NETWORK DEVICES - Disclosed are a user interface for a data sharing function according to a network connection between network electronic devices, and a user device for operating a data sharing function using the same. The method for sharing data between network electronic devices includes: searching for network electronic devices located at the periphery of a user device when an input for performing a data sharing function is sensed; classifying the found network electronic devices into transmission-side and reception-side network electronic devices; allotting the found network electronic devices to a first region for receiving data and a second region for transmitting data, respectively; and configuring and displaying a user interface for the data sharing function based on the network electronic devices allotted to the first and second regions.11-15-2012
20120287032SLIM PROFILE MAGNETIC USER INTERFACE DEVICES - Slim profile magnetic user interface devices (slim UIDs) are disclosed. A slim UID may include a slim profile housing, a movable actuator assembly having user contact surfaces on opposite sides, along with a magnet, magnetic sensor, restoration element, and processing element. User mechanical interaction with the actuator element may be sensed by the magnetic sensor and processed to generate output signals usable by a coupled electronic computing system.11-15-2012
20120287033OPERATION INPUT DEVICE - An operation input device includes: an operation body having a handle portion, tilting around a rotation center point when a user tilts the operation axis line of the handle portion, and tilting in a predetermined number of tilting directions; multiple detection portions, the number of the detection portions being less than the predetermined number of tilting directions, each detection portion outputting a first output value when the operation body tilts in a direction corresponding to the detection portion and outputting a second output value when the operation body tilts in a direction not corresponding to the detection portion; and a determination device that determines a tilting direction of the operation body based on information on the number of first output values and information on a part of the detection portions that have outputted the first output values.11-15-2012
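The decoding step in 20120287033 above works because, with fewer detectors than tilting directions, each direction still yields a distinct pattern of first/second output values. The table below is purely hypothetical (eight directions from four detectors with overlapping arcs); the patent does not publish its mapping:

```python
# Hypothetical mapping: 1 = first output value, 0 = second output value.
PATTERNS = {
    (1, 0, 0, 0): "N",  (1, 1, 0, 0): "NE",
    (0, 1, 0, 0): "E",  (0, 1, 1, 0): "SE",
    (0, 0, 1, 0): "S",  (0, 0, 1, 1): "SW",
    (0, 0, 0, 1): "W",  (1, 0, 0, 1): "NW",
}

def tilt_direction(detector_outputs):
    """Decode one of eight tilt directions from four detector outputs;
    returns None for a pattern outside the table."""
    return PATTERNS.get(tuple(detector_outputs))
```

Diagonal tilts activate two adjacent detectors at once, which is how four detectors distinguish eight directions.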
20120287037LIGHT-EMITTING DEVICE, AND LIQUID CRYSTAL DISPLAY DEVICE AND IMAGE DISPLAY DEVICE THAT USE THE SAME - A light-emitting device used in a liquid crystal display device is provided with a planar illuminator that focuses emitted light on a predetermined light focus point, and an optical deflector that two dimensionally deflects the light from the planar illuminator. The planar illuminator switches an emission direction of the light to enable alternate switching between a first light focus state in which the predetermined light focus point is the position of a right eye of a viewer and a second light focus state in which the predetermined light focus point is the position of a left eye of the viewer. The optical deflector can modulate each of the predetermined light focus point in the first light focus state and the predetermined light focus point in the second light focus state according to movement of the viewer.11-15-2012
20100214212Display Device, Controlling Method and Display System Thereof - A display device is provided. The display device includes a command processing unit, a command converting unit, a universal serial bus (USB) interface and a display module. The command processing unit processes a remote-control command from a remote-controller. The command converting unit generates a human interface device (HID) command corresponding to the remote-control command. The USB interface outputs the HID command to an external host which generates an image in response to the HID command. Then the display module displays the image.08-26-2010
20100207878ILLUMINATION DEVICE, METHOD FOR FABRICATING THE SAME, AND SYSTEM FOR DISPLAYING IMAGES UTILIZING THE SAME - The invention provides an illumination device, method for fabricating the same, and system for displaying images utilizing the same. The illumination device includes a substrate, a first electrode, an illumination layer, and a second electrode. The substrate has a plurality of illumination regions. The first electrode overlies the substrate and has a first bump disposed in a first illumination region of the plurality of the illumination regions. The illumination layer overlies the first electrode. The second electrode overlies the illumination layer.08-19-2010
20100207877Image Generation System - An image generation system for generating an image on a display screen. The image generation system includes an eye-tracking system capable of determining a user's eye orientation and outputting a signal indicative of same. The image generation system also includes a bio-feedback sensor capable of detecting activity of one or more physiological functions of the user and outputting a signal indicative of the level of activity. A processor is included and is adapted to receive and process the output signals from the eye-tracking system and bio-feedback sensor. The processor determines an image to be generated on the display screen indicative of the signals from the eye-tracking system and bio-feedback sensor.08-19-2010
20100207875COMMAND CONTROL SYSTEM AND METHOD THEREOF - The invention discloses a command control system including a light emitting unit, an image capturing unit, a storage unit, and a processing unit. The processing unit is coupled with the image capturing unit and the storage unit. The light emitting unit emits light to form an illumination area. The image capturing unit captures a plurality of pieces of image information in the illumination area. The storage unit stores different commands corresponding to the image information. The processing unit performs functions according to the commands corresponding to the image information.08-19-2010
20100207874Interactive Display System With Collaborative Gesture Detection - An interactive content delivery system (08-19-2010
20100207873METHOD, SYSTEM AND SERVER PLAYING MEDIA USING USER EQUIPMENT WITH MOTION SENSOR - Disclosed is a method for receiving playable media stored in a server by user equipment having a motion sensor and playing the media according to the motion of the user equipment. The method includes the steps of (a) designating a part of the media as a first receiving area by the server and transmitting it to the user equipment; (b) changing a playback area, in which the media are played, according to the motion of the user equipment and playing a part of the media; and (c) requesting by the user equipment that the server provide a second receiving area, which is a part of the media including the playback area, when the playback area is outside the first receiving area of the media, receiving the second receiving area, and playing the media. When media stored in the server are to be played initially, it is unnecessary to download the entire media, so that the user equipment can play the media more efficiently.08-19-2010
20100207872OPTICAL DISPLACEMENT DETECTING DEVICE AND OPERATING METHOD THEREOF - An operating method of an optical displacement detecting device includes the steps of: capturing a plurality of images; obtaining a displacement according to the images; obtaining a characteristic variation of the images; and suppressing the output of the displacement when the characteristic variation matches a predetermined rule. The present invention further provides an optical displacement detecting device.08-19-2010
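The suppression rule in 20100207872 above can be sketched as a gate on the reported displacement; the variation metric and its threshold here are assumptions, since the abstract only says the output is suppressed when the characteristic variation matches a predetermined rule:

```python
def report_displacement(dx, dy, variation, max_variation=0.5):
    """Suppress the computed displacement when the characteristic
    variation between captured images matches the predetermined rule
    (modelled here as exceeding a threshold, e.g. sensor lift-off)."""
    if variation > max_variation:
        return (0, 0)
    return (dx, dy)
```

This keeps spurious cursor motion from reaching the host while the image content is unreliable.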
20100207871METHOD AND PORTABLE APPARATUS - A method for a portable apparatus includes detecting a movement of the portable apparatus and determining that the movement is associated with a user input for retrieving a value of a status of the portable apparatus; determining the value of the status; and presenting the value to the user. Corresponding portable apparatuses and a computer program product are also presented.08-19-2010
20100207870DEVICE AND METHOD FOR INPUTTING SPECIAL SYMBOL IN APPARATUS HAVING TOUCH SCREEN - A device to input a special symbol in an apparatus having a touch screen includes a touch sensor to detect a user touch at a first position, a text input controller to switch from a general character input mode to a special symbol input mode if the user touch is maintained for at least a predetermined time, and a special symbol display unit to display a special symbol at the first position. A method for inputting a special symbol includes a touch detecting step of detecting the user touch at the first position, switching to the special symbol input mode if the user touch is maintained for at least a predetermined time, and displaying a special symbol at the first position in the special symbol input mode.08-19-2010
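The mode switch in 20100207870 above reduces to a duration check at the touch position; the hold threshold and the return values are illustrative assumptions:

```python
GENERAL, SPECIAL = "general", "special-symbol"

class TextInputController:
    def __init__(self, hold_threshold=0.8):     # seconds (assumed)
        self.mode = GENERAL
        self.hold_threshold = hold_threshold

    def on_touch(self, position, duration):
        """A touch held at least `hold_threshold` switches to
        special-symbol mode and shows a symbol at that position;
        a shorter touch stays in general character input mode."""
        if duration >= self.hold_threshold:
            self.mode = SPECIAL
            return ("show_symbol_at", position)
        return ("insert_char_at", position)
```

The key property is that the same touch position drives both behaviours; only the hold duration selects between them.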
20100207869SECURE MAN-MACHINE INTERFACE FOR MANAGING GRAPHICAL OBJECTS ON A VIEWING SCREEN - The field of the invention is that of man-machine interface devices for viewing screen (08-19-2010
20120139829DISPLAY DEVICE - A technique that can simply determine the pass/fail status of signal interconnects with respect to chipping or cracking of a substrate, without increasing the number of manufacturing steps.06-07-2012
20120139827METHOD AND APPARATUS FOR INTERACTING WITH PROJECTED DISPLAYS USING SHADOWS - A method, computer readable medium and apparatus for interacting with a projected image using a shadow are disclosed. For example, the method projects an image of a processing device to create the projected image, and detects a shadow on the projected image. The method interprets the shadow as a display formatting manipulation command and sends the display formatting manipulation command to an application of the processing device to manipulate a display format of the projected image.06-07-2012
20120139826User Control of the Display of Matrix Codes - An electronic device determines to transmit an image including a matrix code to a display, receives input specifying to alter the matrix code, generates an updated image according to the input, and transmits the updated image to the display. The device may alter a size and/or position of the matrix code, a display duration and/or complexity of the matrix code, and so on. The device may generate the matrix code and modify it in response to input, receive different matrix code versions and select a different version in response to input, receive the image including the matrix code and generate a replacement to overlay on the image, and so on. Additionally, independent of input, the device may receive an image, detect an included first matrix code, generate a second matrix code based on the first, and generate an updated image by adding the second matrix code to the image.06-07-2012
20130009860INFORMATION DISPLAY APPARATUS - An information display apparatus includes: an outputting device capable of (i) displaying a first image in which data values corresponding to at least a first portion out of a plurality of data values included in one dataset of a plurality of datasets are indicated by characters and in which at least one portion of the data values can be changed and (ii) displaying a second image in which data values corresponding to at least a second portion out of the plurality of data values included in the one dataset are graphically indicated and in which at least one portion of the data values can be changed; and a controlling device for changing at least one portion of the data values included in the one dataset in accordance with a received change instruction if the instruction for one of the first image and second image is received by an inputting device.01-10-2013
20130009865USER-CENTRIC THREE-DIMENSIONAL INTERACTIVE CONTROL ENVIRONMENT - A computer-implemented method and system for controlling various electronic devices by recognition of gestures made by a user within a particular space defined in front of the user are provided. An example method may comprise generating a depth map of a physical scene, determining that a head of the user is directed towards a predetermined direction, establishing a virtual sensing zone defined between the user and a predetermined location, identifying a particular gesture made by the user within the virtual sensing zone, and selectively providing to the electronic device a control command associated with the particular gesture. The particular gesture may be performed by one or more characteristic forms provided by the user within the virtual sensing zone being in an active state. The characteristic forms are forms reliably distinguishable from casual forms by means of computer vision and having certain attributes, which can reliably reflect user intent.01-10-2013
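The virtual sensing zone of 20130009865 above amounts to a 3-D membership test in the space between the user and a predetermined location; all bounds below are assumed values for illustration:

```python
def in_sensing_zone(point, near=0.2, far=0.6, half_w=0.4, half_h=0.3):
    """True when a 3-D point (metres, head-relative coordinates) lies
    inside the box-shaped zone defined in front of the user."""
    x, y, z = point
    return near <= z <= far and abs(x) <= half_w and abs(y) <= half_h
```

Gestures are then only interpreted while the characteristic forms stay inside this zone, which filters out casual hand movement elsewhere in the scene.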
20130009863DISPLAY CONTROL APPARATUS, DISPLAY CONTROL METHOD, AND PROGRAM - A display control apparatus controls display of a transparent display which includes a screen configured to transmit light arriving from an object located on a side of a second surface so that the object is viewable from a viewpoint located on a side of a first surface which is an opposite surface to the second surface. The display control apparatus includes: an acquisition unit that acquires position information indicating relative positional relations between the transparent display and the viewpoint and between the transparent display and the object; and a display control unit that controls the display of the transparent display based on the position information.01-10-2013
20130009862DISPLAY APPARATUS - A display apparatus including an image generator, a projection lens set, a depth detecting module detecting the position of the user, and a control unit is provided, wherein the control unit is electrically connected to the image generator, the projection lens set and the depth detecting module. An image displayed by the image generator is projected through the projection lens set and generates a floating real image between the projection lens set and the user. Each beam forming the floating real image has a light-cone angle θ. The image generator and the projection lens adjust the position of the floating real image according to the position of the user. The size of the floating real image is L, the distance between the two eyes of the user is W, the distance between the user and the floating real image is D, and the light-cone angle θ satisfies the formula of01-10-2013
20130009858SYSTEMS AND METHODS FOR LOCKING AN ELECTRONIC DEVICE - Systems and methods for locking an input device of an electronic device are described herein. An example method includes detecting a moving action of a housing of the electronic device from an open position to a closed position. The method includes detecting at least a first condition of the electronic device after detection of the moving action from the open position to the closed position and locking the input device upon detection of the first condition within a first time interval based on the electronic device being moved to the closed position.01-10-2013
20130009861METHODS AND SYSTEMS FOR CONTROLLING DEVICES USING GESTURES AND RELATED 3D SENSOR - Provided are computer-implemented methods and systems for controlling devices using gestures and a 3D sensor that enables implementing the above. In one embodiment, the method proposed herein may be based on defining at least one sensor area within the space surrounding a user of a controlled device; associating this sensor area with at least one user gesture; associating the combination of the user gesture and sensor area with an actionable command; identifying the direction of the line of sight of the user and a focal point of the line of sight of the user; and, if the line of sight of the user is directed a sensor area, issuing an actionable command corresponding to the combination of the sensor area and the gesture that the user makes while looking at this sensor area.01-10-2013
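The (sensor area, gesture) → command association described in 20130009861 above can be sketched as a lookup gated by the user's line-of-sight focal point. This is a minimal illustration, not the application's implementation; the names (`SensorArea`, `dispatch`) and the example bindings are invented:

```python
from dataclasses import dataclass

@dataclass
class SensorArea:
    """A hypothetical axis-aligned region in the space around the user."""
    name: str
    x_min: float
    x_max: float
    y_min: float
    y_max: float

    def contains(self, point):
        x, y = point
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

def dispatch(areas, bindings, gaze_focal_point, gesture):
    """Issue the command bound to (area, gesture), but only if the user's
    line-of-sight focal point falls inside that sensor area."""
    for area in areas:
        if area.contains(gaze_focal_point):
            return bindings.get((area.name, gesture))
    return None

volume = SensorArea("volume", 0.0, 1.0, 0.0, 1.0)
bindings = {("volume", "swipe_up"): "VOLUME_UP"}
print(dispatch([volume], bindings, (0.5, 0.5), "swipe_up"))  # VOLUME_UP
print(dispatch([volume], bindings, (2.0, 0.5), "swipe_up"))  # None: user not looking at the area
```

Keying the binding on the combination of area and gesture, rather than on the gesture alone, is what lets the same gesture mean different commands in different sensor areas.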
20130009868DISPLAY DEVICE AND DISPLAY METHOD - The present invention provides a display apparatus and a display method that control display operations in a way that precisely reflects the user's status, i.e., the user's intentions, visual state and physical condition. Worn as an eyeglass-like or head-mounted wearable unit, for example, the display apparatus enables the user to visibly recognize various images on a display unit positioned in front of the user's eyes, thereby presenting picked-up images, reproduced images, and received images. To control various display operations, such as switching between the display state and the see-through state, changing the display operation mode and selecting sources, the display apparatus acquires information about the behavior or physical status of the user, determines the intention or status of the user in accordance with the acquired information, and controls the display operation appropriately on the basis of the determination result.01-10-2013
20100164861IMAGE SYSTEM CAPABLE OF SWITCHING PROGRAMS CORRESPONDING TO A PLURALITY OF FRAMES PROJECTED FROM A MULTIPLE VIEW DISPLAY AND METHOD THEREOF - An image system includes a multiple view display for displaying a plurality of frames simultaneously. The plurality of frames can only be viewed at different visible ranges respectively. The image system further includes an executing program means for inputting an executing program command, and a control means coupled to the executing program means for executing a program corresponding to a frame of the plurality of frames according to the executing program command transmitted from the executing program means.07-01-2010
20130009870INPUT DEVICE, INPUT METHOD, AND PROGRAM - An input device comprises: a detection unit that detects a body movement generated by tapping a user body as detection data; and an input information determination unit that refers to the detection data, determines a tap position based on a fact that the detection data varies depending on the tap position, and outputs an operation command associated with the determined tap position.01-10-2013
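The tap-position determination in 20130009870 above rests on the detection data varying with the tap position. A minimal sketch of that idea is a nearest-template match over feature vectors; the templates, feature values, and command names below are invented for illustration:

```python
def classify_tap(sample, templates):
    """Return the tap position whose stored template is closest to the
    observed detection data (squared Euclidean distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(templates, key=lambda pos: dist(sample, templates[pos]))

# Hypothetical per-position vibration signatures and command mapping.
templates = {
    "wrist":   [0.9, 0.1, 0.2],
    "forearm": [0.3, 0.8, 0.4],
}
commands = {"wrist": "PLAY_PAUSE", "forearm": "NEXT_TRACK"}

position = classify_tap([0.85, 0.15, 0.25], templates)
print(position, "->", commands[position])  # wrist -> PLAY_PAUSE
```

A real input device would extract the feature vector from accelerometer or vibration-sensor data; the classifier stage stays the same.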
20130016038MOTION DETECTION METHOD AND DISPLAY DEVICE (Inventors: Yu, Shu-Han, Taoyuan County, TW; Lin, Chia-Ho, Hsinchu County, TW) - A motion detection method for a display device includes the steps of capturing images from a position on the display device to generate a plurality of capture images, calculating a moving status of the display device according to the plurality of capture images, and adjusting a display range of the display device relative to an image data according to the moving status.01-17-2013
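The moving-status calculation in 20130016038 above can be approximated by estimating the offset that best aligns two successive captured images and panning the display range by that amount. The 1-D brute-force alignment below (all data invented) stands in for real 2-D image matching:

```python
def estimate_shift(prev, curr, max_shift=3):
    """Offset (in samples) minimizing mean absolute difference between
    the previous and current capture, i.e. the estimated device motion."""
    best_off, best_err = 0, float("inf")
    n = len(prev)
    for off in range(-max_shift, max_shift + 1):
        pairs = [(prev[i], curr[i + off])
                 for i in range(n) if 0 <= i + off < n]
        err = sum(abs(a - b) for a, b in pairs) / len(pairs)
        if err < best_err:
            best_off, best_err = off, err
    return best_off

prev = [0, 0, 5, 9, 5, 0, 0, 0]
curr = [0, 0, 0, 5, 9, 5, 0, 0]   # scene feature shifted by one sample
print(estimate_shift(prev, curr))  # 1
```

The display range would then be adjusted by the estimated shift so the displayed portion of the image data tracks the device's motion.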
20130016037REDUCING OR ELIMINATING THE BLACK MASK IN AN OPTICAL STACK (Inventor: Wang, Hung-Jen, Longtan Township, TW) - A reflective subpixel array may be formed in which an absorption layer is formed on a back substrate, which may obviate the need for a black mask on a front substrate upon which the reflective subpixel array is formed. In some implementations, the black mask layer may be formed only in post areas on the front substrate. The absorption layer may absorb light that enters between subpixel rows and/or columns. The absorption layer may include at least one highly conductive layer that can form part of the signal routing for the display. Conductive spacers may be formed to connect the conductive absorption layer to a conductive layer of the subpixel array.01-17-2013
20110141007DATA PROCESSING APPARATUS, DATA PROCESSING SYSTEM, AND DISPLAY CONTROL METHOD FOR CONTROLLING DISPLAY IN DATA PROCESSING APPARATUS - A data processing apparatus includes a control apparatus configured to control the data processing apparatus and a display apparatus configured to display information, wherein the control apparatus includes a first display control unit configured to control display in the display apparatus and a transfer unit configured to transfer data to be displayed by the display apparatus to the display apparatus, wherein the display apparatus includes a second display control unit configured to control display in the display apparatus and a switching unit configured to switch between the first display control unit and the second display control unit, and wherein, after the switching unit switches from the first display control unit to the second display control unit, the second display control unit controls display in the display apparatus based on the data transferred by the transfer unit.06-16-2011
20110157007OPTICAL PEN AND OPERATING METHOD OF THE SAME - An optical pen comprises a housing, a pen body, a pen tip, a switch and a control unit. The pen body, disposed inside the housing, has a hinge mechanism allowing the pivoting of the pen body. The pen tip is disposed at a front end of the pen body and protrudes from the housing. When the pen tip touches a projection surface, the pen body is rotated through the hinge mechanism to trigger the switch. The switch disposed inside the housing is electrically connected to the control unit. When the switch is triggered, the control unit outputs an electrical signal.06-30-2011
20110148753Identifying a characteristic of an individual utilizing facial recognition and providing a display for the individual - A method may include automatically remotely identifying at least one characteristic of an individual via facial recognition; and providing a display for the individual, the display having a content at least partially based on the identified at least one characteristic of the individual. A system may include a facial recognition module configured for automatically remotely identifying at least one characteristic of an individual via facial recognition; and a display module coupled with the facial recognition module, the display module configured for providing a display for the individual, the display having a content at least partially based on the identified at least one characteristic of the individual.06-23-2011
20110157010ELECTROMECHANICAL DISPLAY DEVICES AND METHODS OF FABRICATING THE SAME - MEMS devices include materials which are used in LCD or OLED fabrication to facilitate fabrication on the same manufacturing systems. Where possible, the same or similar materials are used for multiple layers in the MEMS device, and use of transparent conductors for partially transparent electrodes can be avoided to minimize the number of materials needed and minimize fabrication costs. Certain layers comprise alloys selected to achieve desired properties. Intermediate treatment of deposited layers during the manufacturing process can be used to provide layers having desired properties.06-30-2011
20110157009DISPLAY DEVICE AND CONTROL METHOD THEREOF - A display device and a control method thereof are provided. The display device comprises a camera obtaining an image including a gesture of a user and a controller extracting the image of the gesture from the obtained image and executing a function mapped to the extracted gesture when the range of the gesture exceeds a threshold corresponding to the user. The method of controlling the display device executes the function corresponding to a user's specific gesture when that gesture exceeds the threshold corresponding to that user, so that the display device operates in a manner most suitable for the range of each user's gestures.06-30-2011
20110157008DEVICE-CONTROL SYSTEM, TERMINAL APPARATUS, AND DEVICE - A device-control system includes a device operated by a remote controller and a terminal apparatus displaying an image. The device-control system controls the device via a network. The terminal apparatus includes an output section, a terminal's receiver for receiving an image similar to the appearance of the remote controller for operating the device, a terminal's output processor for controlling the output section to output the image received by the terminal's receiver, and a terminal's transmitter for transmitting operational information generated by an operation performed by use of the terminal apparatus with respect to the image outputted by the output section. The device includes a device's processor for performing processing in accordance with the operational information.06-30-2011
20110157006INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM - An information processing apparatus includes a detection section configured to detect an operating body in proximity to a display screen, an identification section configured to identify whether the operating body detected by the detection section is in proximity to a selection item included in the display screen, and a display control section configured to temporarily superimpose and display a region including selection item content of the selection item over a previously displayed display region when the identification section identifies the operating body as being in proximity to the selection item and configured to end a display of the region including the selection item content of the selection item when the detection section no longer detects the operating body as being in proximity to the display screen.06-30-2011
20110157005HEAD-MOUNTED DISPLAY - A head-mounted display includes detection means that is mounted on the body of a user for detecting coordinates of input, which are coordinates written/drawn by the user into a two-dimensional detection area, coordinates conversion means for detecting start position coordinates of the input by the user in the two-dimensional detection area based on the coordinates of input, converting the start position coordinates into initial position coordinates in a display area of an image display, and determining trajectory coordinates in the display area by using the initial position coordinates and a positional relationship between the coordinates of input and the start position coordinates, and trajectory image generation means for generating a trajectory image of the input based on the trajectory coordinates and outputting it to the image display.06-30-2011
20110157004Information processing apparatus, information processing method, program, control target device, and information processing system - There is provided a remote commander including an acquisition section which acquires layout information having one or a plurality of pieces of object information from a control target device via a communication section, the object information being formed by associating command identification information for identifying a command with respect to the control target device, object identification information for identifying an object, and coordinate information indicating a position of the object on a screen with each other, and a display control section which causes the object to be displayed at the position on the screen indicated by the coordinate information, the object being identified by the object identification information with respect to each of the pieces of object information that the layout information acquired by the acquisition section has.06-30-2011
20110157003HAPTIC FEEDBACK DEVICE - A haptic feedback apparatus includes a screen and a piezoelectric vibrator partially connected with the screen. The screen comprises a display viewing area and a frame surrounding the display viewing area. The piezoelectric vibrator engaged with the screen defines a moving portion having a projecting portion extending from one distal end of the moving portion toward the screen. The screen defines a cavity, and the projecting portion of the piezoelectric vibrator is at least partially accommodated in the cavity. The manufacture of the haptic feedback apparatus is simple and low-cost.06-30-2011
20110157002HAPTIC FEEDBACK DEVICE - A haptic feedback device includes a screen and a piezoelectric vibrator partially connected with the screen. The screen includes a display viewing area and a frame surrounding the display viewing area. The piezoelectric vibrator engaged with the screen defines a moving portion having a projecting portion extending toward the screen. The screen defines a cavity and the projecting portion of the piezoelectric vibrator is at least partially accommodated in the cavity.06-30-2011
20110157000PROJECTION SYSTEM AND METHOD - A projection system includes first and second cameras, a projector, and a processing unit. The first camera captures an object to obtain an image of the object. The projector projects the image of the object and a number of symbols to a projection region. The symbols in the projection region form controlling symbols. The second camera captures the projection region to obtain an image of the projection region. The image of the projection region is detected to determine whether one controlling symbol is selected. If a controlling symbol is selected, the processing unit controls the first camera, the second camera, or the projector correspondingly.06-30-2011
20110156999GESTURE RECOGNITION METHODS AND SYSTEMS - Gesture recognition methods and systems are provided. First, a plurality of gesture templates are provided, wherein each gesture template defines a first gesture characteristic and a corresponding specific gesture. Then, a plurality of images is obtained, and a multi-background model is generated accordingly. At least one object image is obtained according to the multi-background model, wherein the object image includes at least an object having a plurality of edges. The included angles of any two adjacent edges of the object image are gathered as statistics to obtain a second gesture characteristic corresponding to the object image. The second gesture characteristic of the object image is compared with the first gesture characteristic of each gesture template. The specific gesture corresponding to the first gesture characteristic is obtained, when the second gesture characteristic is similar to the first gesture characteristic.06-30-2011
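The edge-angle statistic in 20110156999 above can be sketched as gathering the included angles of adjacent edges into a histogram and comparing it against each gesture template. The bin count, similarity measure, threshold, and example gestures are illustrative choices, not taken from the application:

```python
def angle_histogram(angles_deg, bins=6):
    """Histogram over included angles in [0, 180) degrees, normalized to sum to 1."""
    hist = [0] * bins
    for a in angles_deg:
        idx = min(int(a / (180 / bins)), bins - 1)
        hist[idx] += 1
    total = sum(hist) or 1
    return [h / total for h in hist]

def match_gesture(observed_angles, templates, threshold=0.9):
    """Return the template gesture most similar to the observation, or None."""
    obs = angle_histogram(observed_angles)
    best, best_sim = None, 0.0
    for name, template_angles in templates.items():
        tmpl = angle_histogram(template_angles)
        # Histogram-intersection similarity in [0, 1].
        sim = sum(min(o, t) for o, t in zip(obs, tmpl))
        if sim > best_sim:
            best, best_sim = name, sim
    return best if best_sim >= threshold else None

templates = {"open_hand": [20, 25, 30, 28, 22], "fist": [150, 160, 155]}
print(match_gesture([20, 24, 29, 31, 27], templates))  # open_hand
```

In the application, the edge angles would come from the object image extracted via the multi-background model; here they are passed in directly.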
20110156998METHOD FOR SWITCHING TO DISPLAY THREE-DIMENSIONAL IMAGES AND DIGITAL DISPLAY SYSTEM - A method for switching to display three-dimensional (3D) images and a digital display system are provided. In the method, a digital display is activated to receive image data comprising at least one image and to detect a triggering signal, which is generated by detecting whether 3D glasses are being used through a software means or a hardware means. According to the triggering signal, the digital display switches between formats of two-dimensional (2D) image and 3D image, so as to display the images.06-30-2011
20130021239DISPLAY DEVICE - A display device includes a display panel including a plurality of pixels, a shutter panel including a driver circuit, a liquid crystal, and light-transmitting electrodes provided in a striped manner, and a positional data detector configured to detect a positional data of a viewer. The shutter panel is provided over a display surface side of the display panel, a width of one of the light-transmitting electrodes in the shutter panel is smaller than that of one of the plurality of pixels, and the driver circuit in the shutter panel is configured to selectively output signals for forming a parallax barrier to the light-transmitting electrodes. The parallax barrier is capable of changing its shape in accordance with the detected positional data.01-24-2013
20130021233APPARATUS, SYSTEM, AND METHOD FOR PROVIDING FEEDBACK SENSATIONS OF TEMPERATURE AND HARDNESS-SOFTNESS TO A CONTROLLER - Described herein are a hand-held controller, system, and method for providing real-time sensations of temperature and/or hardness-softness to the hand-held controller. The hand-held controller comprises a first region configured to be touched by a user and to provide a real-time computer programmable hardness-softness sensation to the user in response to a first trigger signal generated by an interactive program; and a first mechanism, coupled to the first region, to cause the first region to harden relative to a first state, and to cause the first region to soften relative to a second state.01-24-2013
20130021234APPARATUS, SYSTEM, AND METHOD FOR PROVIDING FEEDBACK SENSATIONS OF TEMPERATURE AND TEXTURE TO A CONTROLLER - Described herein are a hand-held controller, system, and method for providing real-time sensations of temperature and texture to a user of the hand-held controller. The hand-held controller comprises a first region to be touched by a user and to provide a real-time computer programmable texture sensation to a user in response to a first trigger signal generated by an interactive program; and a first mechanism, coupled to the first region, to cause the first region to roughen relative to a first state, and to cause the first region to smooth relative to a second state.01-24-2013
20130021232APPARATUS, SYSTEM, AND METHOD FOR PROVIDING FEEDBACK SENSATIONS OF TEMPERATURE, TEXTURE, AND HARDNESS-SOFTNESS TO A CONTROLLER - Described herein are a hand-held controller, system, and method for providing real-time programmable sensations of texture, hardness-softness, and temperature (thermal) to the hand-held controller. The hand-held controller comprises a first region configured to be touched by a user and to provide a real-time programmable hardness-softness sensation to the user in response to a first trigger signal generated by an interactive program; a second region to be touched by the user and to provide real-time programmable hardness-softness sensations to the user in response to a second trigger signal generated by the interactive program; and a third region to be touched by the user and to provide real-time programmable thermal sensations to the user in response to a third trigger signal generated by the interactive program.01-24-2013
20130021241Portable Information Display Terminal - Disclosed is a portable information display terminal which can start an application program corresponding to its use state even when it is used in a placed state. The portable information display terminal (01-24-2013
20130021240METHOD AND DEVICE FOR CONTROLLING AN APPARATUS AS A FUNCTION OF DETECTING PERSONS IN THE VICINITY OF THE APPARATUS - A method for controlling an electronic apparatus, includes steps of: acquiring an image of the environment of the apparatus, detecting the presence of human faces in the image acquired, estimating a respective position of each face detected in relation to the apparatus, and sending a signal to the apparatus to enable a function of the apparatus if a condition is met relating to a number of faces detected in the image and/or the estimated position of each detected face.01-24-2013
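The enabling condition in 20130021240 above (a function is enabled only when the detected faces satisfy a condition on their number and/or positions) can be sketched as a simple predicate. The face-list format and threshold values below are assumptions made for illustration:

```python
def should_enable(faces, min_faces=1, max_distance_m=2.0):
    """Enable the apparatus function only if at least min_faces detected
    faces are estimated to be within max_distance_m of the apparatus.

    faces: list of (x, y, distance_m) position estimates per detected face.
    """
    near = [f for f in faces if f[2] <= max_distance_m]
    return len(near) >= min_faces

print(should_enable([(0.1, 0.0, 1.2), (0.4, 0.1, 3.5)]))  # True: one face is near enough
print(should_enable([(0.4, 0.1, 3.5)]))                   # False: only a distant face
```

The face detection and position estimation themselves would be supplied by an image-processing stage; this sketch covers only the final gating decision.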
20130021235APPARATUS, SYSTEM, AND METHOD FOR PROVIDING FEEDBACK SENSATIONS OF TEXTURE AND HARDNESS-SOFTNESS TO A CONTROLLER - Described herein are a hand-held controller, system, and method for providing real-time sensations of texture and hardness-softness to a user of the hand-held controller. The hand-held controller comprises a first region to be touched by a user and to provide real-time computer programmable texture sensations to the user in response to a first trigger signal generated by an interactive program; and a second region to be touched by the user and to provide real-time computer programmable hardness-softness sensations to the user in response to a second trigger signal generated by the interactive program.01-24-2013
20120242573Vector-Specific Haptic Feedback - In one or more embodiments, vector-specific movement can be imparted to a user interface device (UID) to provide vector-specific haptic feedback. In at least some embodiments, this vectored movement can be based on input received by the UID. The input can include information associated with the user's interaction with an associated device integrated with or communicatively linked with the UID, and/or with an application implemented on the associated device. In at least some embodiments, the UID can be configured with a controller, a microprocessor(s), and a vector-specific actuator that includes an electrically-deformable material.09-27-2012
20080252595Method and Device for Virtual Navigation and Voice Processing - An apparatus for virtual navigation and voice processing is provided. A system that incorporates teachings of the present disclosure may include, for example, a computer readable storage medium having computer instructions for processing voice signals captured from a microphone array, detecting a location of an object in a touchless sensory field of the microphone array, and receiving information from a user interface in accordance with the location and voice signals.10-16-2008
20080252594Vibration Actuator with a Unidirectional Drive - A haptic feedback generation system includes a linear resonant actuator and a drive circuit. The drive circuit is adapted to output a unidirectional signal that is applied to the linear resonant actuator. In response, the linear resonant actuator generates haptic vibrations.10-16-2008
20090009466Force-sensing orthotic electric device controller - The invention is directed to devices and methods for assisting individuals with poor extremity function (i.e., upper extremity function) to control devices. In one embodiment, a powered device is provided that comprises a controller and an orthosis communicably connected to the controller. An orthosis includes a harness worn by a user on a body part and a force sensing transducer positioned between the harness and the body part, wherein force applied to the transducer by the body part is communicated to the controller for controlling movement of the powered device. A method for controlling a device having a controller includes attaching a harness to a user's body part and applying a force by the body part onto a force sensing transducer positioned between the harness and the body part, wherein the force is communicated to the controller for controlling the device.01-08-2009
20080231595Remote control apparatus and method of interacting with a multimedia timeline user interface - Remote control devices and methods of interacting with a multimedia timeline user interface are disclosed. A remote control apparatus may include a first selector to make a selection of a graphical user interface associated with a multimedia timeline. The remote control apparatus may include a transmitter to transmit the selection to a multimedia device configured to provide the graphical user interface to a display device. The remote control apparatus may include at least one date range selector to modify a date range displayed on the graphical user interface.09-25-2008
20130169526SYSTEMS AND METHODS FOR MOBILE DEVICE PAIRING - Tools (systems, apparatuses, methodologies, computer program products, etc.) for pairing electronic devices including touchscreen-enabled electronic devices, and for facilitating communication between paired electronic devices.07-04-2013
20130169527INTERACTIVE VIDEO BASED GAMES USING OBJECTS SENSED BY TV CAMERAS - A method and apparatus for interactive TV camera based games in which position or orientation of points on a player or of an object held by a player are determined and used to control a video display. Both single camera and stereo camera pair based embodiments are disclosed, preferably using stereo photogrammetry where multi-degree of freedom information is desired. Large video displays, preferably life-size may be used where utmost realism of the game experience is desired.07-04-2013
20130169529ADJUSTING AN OPTICAL GUIDE OF A THREE-DIMENSIONAL DISPLAY TO REDUCE PSEUDO-STEREOSCOPIC EFFECT - A device may include sensors, a display, an optical guide, and one or more processors. The sensors may obtain tracking information associated with a user. The display may include pixels for displaying images. The optical guide may include optical elements, each of the optical elements directing light rays from one or more of the pixels. In addition, the one or more processors may be configured to determine a relative location of the user based on the tracking information obtained by the sensors, obtain values for control variables that are associated with the optical elements based on the relative location of the user, display a stereoscopic image via the display, and control the optical elements based on the values to direct the stereoscopic image to the relative location and to prevent a pseudo-stereoscopic image from forming at the relative location.07-04-2013
20130169519METHODS AND APPARATUS TO PERFORM A DETECTION OPERATION - A method and apparatus determine a difference value, the determined difference value reflecting a difference between a plurality of presence values. In an embodiment, the method and apparatus perform an operation associated with the plurality of presence values, based on the determined difference value.07-04-2013
20130169520BENDING THRESHOLD AND RELEASE FOR A FLEXIBLE DISPLAY DEVICE - A flexible display device and a method for accurately recognizing a user's flex-input bending of the flexible display device are described. The present invention is able to discard unintentional flexing of the flexible display device while accurately recognizing a user's intended flex input command based on a number of bending degree thresholds. A first bending threshold must be overcome in order for a user's flex input to be initially recognized as a valid flex input command. The user's flex input must then fall below a second bending threshold in order to cease the recognition of the user's flex input as a valid flex input.07-04-2013
20130169523ELECTRONIC DEVICE AND METHOD FOR RELIEVING VISUAL FATIGUE USING THE ELECTRONIC DEVICE - An electronic device includes a visual sensor and a display screen. The visual sensor senses whether a user is looking at the display screen. If the user is looking at the display screen, the electronic device adjusts a font size of a font being displayed on the display screen. If the user looks at the display screen for not less than a first predefined time, the electronic device prompts the user to have a rest and turn off the display screen. After the electronic device has been turned off for more than a second predefined time, the display screen is turned on again automatically.07-04-2013
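The rest-reminder logic in 20130169523 above is essentially a pair of timers: viewing beyond a first limit turns the screen off, and the screen turns back on once it has been off for a second limit. A minimal sketch, with invented class and parameter names and times in seconds:

```python
class FatigueGuard:
    def __init__(self, first_limit, second_limit):
        self.first_limit = first_limit    # max continuous viewing time
        self.second_limit = second_limit  # required rest (screen-off) time
        self.look_start = None
        self.off_since = None
        self.screen_on = True

    def update(self, now, user_looking):
        """Feed one visual-sensor reading; returns whether the screen is on."""
        if not self.screen_on:
            # Screen turns back on automatically after the rest period.
            if now - self.off_since >= self.second_limit:
                self.screen_on = True
                self.look_start = None
            return self.screen_on
        if user_looking:
            if self.look_start is None:
                self.look_start = now
            elif now - self.look_start >= self.first_limit:
                self.screen_on = False        # prompt a rest, turn screen off
                self.off_since = now
        else:
            self.look_start = None            # gaze broken; restart the timer
        return self.screen_on

g = FatigueGuard(first_limit=1800, second_limit=300)
g.update(0, True)
print(g.update(1800, True))   # False: viewing limit reached, screen off
print(g.update(2100, False))  # True: rest complete, screen back on
```

The font-size adjustment described in the abstract would hook into the same `user_looking` signal; it is omitted here to keep the sketch to the timer logic.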
20130169524ELECTRONIC APPARATUS AND METHOD FOR CONTROLLING THE SAME - An electronic apparatus and a control method thereof are provided. In response to receiving a voice start command by the electronic apparatus, the control method of the electronic apparatus changes mode of the electronic apparatus to a voice task mode in which the electronic apparatus is controlled in accordance with user's voice, displays voice guide information including a plurality of voice commands for commanding the electronic apparatus to perform tasks in the voice task mode on the electronic apparatus, and changes at least one voice command from among the plurality of voice commands with a different voice command. Accordingly, the user is able to control the electronic apparatus more conveniently and efficiently.07-04-2013
20080224994Slide Operation Apparatus - A slide operation apparatus capable of preventing a movable unit from being unintentionally moved in a box body, while being easy to assemble and simple in construction. The movable unit includes a gondola to which an operating element is fixed. The gondola is adapted to be movable relative to upper and lower guide bars. A sliding contact assembly includes a plate spring to which an insulation sheet is assembled. The sliding contact assembly in a curved state is mounted to a fixture portion of the operating element by having pawls of the plate spring engaged with notches formed in the fixture portion. During the entire movement process of the movable unit, the curved convex portion of the sliding contact assembly is in sliding contact with the lower guide bar.09-18-2008
20110260962INTERFACING WITH A COMPUTING APPLICATION USING A MULTI-DIGIT SENSOR - A technology is described for interfacing with a computing application using a multi-digit sensor. A method may include obtaining an initial stroke using a single digit of a user on the multi-digit sensor. A direction change point for the initial stroke can be identified. At the direction change point for the initial stroke, a number of additional digits can be presented by the user to the multi-digit sensor. Then a completion stroke can be identified as being made with the number of additional digits. A user interface signal can be sent to the computing application based on the number of additional digits used in the completion touch stroke. In another configuration of the technology, the touch stroke or gesture may include a single stroke where user interface items can be selected when additional digits are presented at the end of a gesture.10-27-2011
20110273368Extending Digital Artifacts Through An Interactive Surface - A unique system and method that facilitates extending input/output capabilities for resource deficient mobile devices and interactions between multiple heterogeneous devices is provided. The system and method involve an interactive surface to which the desired mobile devices can be connected. The interactive surface can provide an enhanced display space and customization controls for mobile devices that lack adequate displays and input capabilities. In addition, the interactive surface can be employed to permit communication and interaction between multiple mobile devices that otherwise are unable to interact with each other. When connected to the interactive surface, the mobile devices can share information, view information from their respective devices, and store information to the interactive surface. Furthermore, the interactive surface can resume activity states of mobile devices that were previously communicating upon re-connection to the surface.11-10-2011
20130176201Corner Control - Methods, program products, and systems for corner control are described. Each of the four corners of a rectangular display field can be individually configured to be a rounded corner or an angled corner. In some implementations, a method can include providing a user interface item for display. The user interface item can include four control elements. Each of the control elements can correspond to a corner of a display field. Each of the control elements can individually and independently control a shape of the corresponding corner of the display field. The display field can have one or more corners in rounded shape and one or more corners in angled shape, according to user input received through the user interface item.07-11-2013
20130176208ELECTRONIC EQUIPMENT - A mobile phone which is an example of electronic equipment includes an infrared camera and an infrared LED. The infrared camera is arranged above a display and the infrared LED is arranged below the display. A user, by an eye-controlled input, designates a button image or a predetermined region on a screen. When a line of sight is to be detected, an infrared ray (infrared light) emitted from the infrared LED arranged below the display is directed to a lower portion of the pupil. Accordingly, even in a state in which the user slightly closes his/her eyelid, the pupil and a reflection of the infrared light can be imaged.07-11-2013
20130169525ELECTRONIC APPARATUS AND METHOD FOR CONTROLLING THE SAME - An electronic apparatus and a control method thereof are provided, which displays first voice guide information indicating voice commands available to control the electronic apparatus and, if a command to control an external device connected to the electronic apparatus is received, changes the first voice guide information and displays second voice guide information indicating voice commands available to control the external device.07-04-2013
20130169528DISPLAY DEVICE - A display device including a display panel to display images, a bezel provided at an outside of the display panel, and a signal input unit including a camera and a microphone, wherein the signal input unit is disposed at an upper end of the bezel, is disclosed. The camera and the microphone are integrated into the display device, thereby improving installation efficiency and external appearance quality of the display device. The microphone is distant from the speaker, thereby improving performance of the microphone.07-04-2013
20130176202MENU SELECTION USING TANGIBLE INTERACTION WITH MOBILE DEVICES - An electronic device (such as a mobile device) displays on a screen of the device, a live video captured by a camera in the device. While the live video is being displayed, the device checks if a first predetermined condition is satisfied. When the first predetermined condition is satisfied the device displays a menu on the screen. The menu includes multiple menu areas, one of which is to be selected. While the menu is being displayed on the screen, the device checks if a second predetermined condition is satisfied, e.g. by a movement of a predetermined object in real world outside the device. When the second predetermined condition is satisfied, the device displays on the screen at least an indication of a menu area as being selected from among multiple menu areas in the displayed menu.07-11-2013
20130176203DISPLAY APPARATUS AND METHOD OF DISPLAYING THREE-DIMENSIONAL IMAGE USING THE SAME - A display apparatus includes a display panel and a liquid crystal lens. The display panel includes a first pixel displaying an N-th portion of a left image corresponding to an N-th position in the left image and a second pixel displaying an N-th portion of a right image corresponding to the N-th position in the right image. The second pixel is spaced apart from the first pixel with additional pixels disposed between the second and first pixels. The liquid crystal lens includes a unit lens directing the N-th left image to a left eye of an observer and the N-th right image to a right eye of the observer where N is a positive integer.07-11-2013
20130176205ELECTRONIC APPARATUS AND CONTROLLING METHOD FOR ELECTRONIC APPARATUS - According to one embodiment, an electronic apparatus is configured to control an external apparatus. The electronic apparatus includes a communicator configured to communicate with the external apparatus, a recognition module configured to recognize a connection apparatus connected to the external apparatus, an operation menu generator configured to generate an operation menu which is used to control the external apparatus or the connection apparatus, and a controller configured to control in such a manner that a divided operation screen including display regions which are used to display the operation menus is displayed on a display and a signal is transmitted to the external apparatus in accordance with an operation in the divided operation screen.07-11-2013
20130176206CAMERA BASED SENSING IN HANDHELD, MOBILE, GAMING, OR OTHER DEVICES - A method and apparatus are disclosed that enable rapid TV camera and computer based sensing in many practical applications, including, but not limited to, handheld devices, cars, and video games. Several unique forms of social video games are disclosed.07-11-2013
20130176207IMPLANTED DEVICES AND RELATED USER INTERFACES - Embodiments of the invention generally relate to electronic devices capable of being implanted beneath the skin of a human user. The electronic devices include input devices for receiving input from a user, and output devices for outputting signals or information to a user. The electronic devices may optionally include one or more sensors, batteries, memory units, and processors. The electronic devices are protected by a protective packaging to reduce contact with bodily fluids and to mitigate physiological responses to the implanted devices.07-11-2013
20130176209INTEGRATION SYSTEMS AND METHODS FOR VEHICLES AND OTHER ENVIRONMENTS - Described herein are integration systems and methods for providing a cohesive user experience within an environment of a plurality of products. An exemplary integration system may comprise a gateway for bidirectional communication with the products. The integration system may also comprise a central or remote database defining the user interface design for a plurality of products, and a database that provides new features for a plurality of extensible products. An input to an interaction point on a first of the products may be transmitted to the gateway, which may interpret the input and transmit output instructions to the first product or to another product in the environment. Audio, visual, and tactile inputs and outputs may be provided through the gateway. In some embodiments, the database may provide a user interface skin or theme to the products, so that a standard look and feel is provided across the plurality of products.07-11-2013
20130176210IMAGE VALIDATION SYSTEM FOR REMOTE DISPLAYS - Systems, methods and devices are disclosed that provide remote video feedback. The disclosure herein includes a transmitting computing device that is configured to produce an image frame encoded with metadata and to transmit the encoded image frame over a network to a remote display device. The remote display device includes a viewing screen contained in a chassis with a bezel. The viewing screen includes an image validation pixel, with the chassis extending over and covering the image validation pixel. The chassis also includes a photo sensor positioned on the underside of the bezel so as to be in operational view of the image validation pixel. The system also includes a sensor electronics module that is configured to detect the metadata when the metadata is displayed by the image validation pixel and is received by the photo sensor. The sensor electronics module also generates image frame feedback information based upon the detecting. A remote computing device is included that is configured to receive the encoded image frame from the transmitting computer, render the encoded image frame upon the viewing screen and re-transmit the image frame feedback information to the transmitting computer from the sensor electronics module.07-11-2013
20130176211IMAGE PROCESSING DEVICE FOR DISPLAYING MOVING IMAGE AND IMAGE PROCESSING METHOD THEREOF - Data of a moving image has a hierarchical structure comprising a 0-th layer, a first layer, a second layer, and a third layer in a z axis direction. Each layer is composed of moving image data of a single moving image expressed in different resolutions. Both the coordinates of a viewpoint at the time of the display of a moving image and a corresponding display area are determined in a virtual three-dimensional space formed by an x axis representing the horizontal direction of the image, a y axis representing the vertical direction of the image, and a z axis representing the resolution. By providing a switching boundary for layers with respect to the z axis, the layers of the moving image data used for frame rendering are switched in accordance with the value of z of the frame coordinates.07-11-2013
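The layer-switching scheme in the abstract above, where the moving-image layer used for frame rendering is chosen by comparing the viewpoint's z coordinate (the resolution axis) against fixed switching boundaries, can be sketched as follows. The boundary values, layer count, and function name are illustrative assumptions, not taken from the application.

```python
# Hypothetical sketch: moving-image data is stored in four layers (0-3),
# each a different resolution, and the renderer picks a layer by comparing
# the viewpoint's z coordinate against fixed switching boundaries.

# Switching boundaries along the z axis (assumed values for illustration).
Z_BOUNDARIES = [1.0, 2.0, 4.0]  # z < 1.0 -> layer 0, ..., z >= 4.0 -> layer 3

def select_layer(z: float) -> int:
    """Return the index of the moving-image layer to use for viewpoint z."""
    layer = 0
    for boundary in Z_BOUNDARIES:
        if z >= boundary:
            layer += 1
    return layer
```

A zoomed-out viewpoint (small z) would thus render from the coarsest layer, and crossing a boundary while zooming switches the data source for subsequent frames.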
20120249412SUSPENDED INPUT SYSTEM - Provided herein are input devices, systems, and methods. Some of the embodiments provided herein employ magnetic levitation of a controller of the input system so as to allow various benefits to a user's experience of the input system.10-04-2012
20120249409METHOD AND APPARATUS FOR PROVIDING USER INTERFACES - An apparatus, method, and computer program product are provided for generating a projected user interface. The apparatus may include at least one processor and at least one memory including computer program code. The at least one memory and the computer program code may be configured, with the processor, to cause the apparatus to receive information regarding a detected position of the user's body and to determine whether the detected position is an activation position, in which case the projection of a user interface may be provided. The user interface may be projected on an area on the user's body, such as a hand or a forearm, or on the surface of an object. The activation position may thus be a predefined position of the user's body in which effective projection of the user interface onto the surface and user interaction with the user interface is facilitated.10-04-2012
20130135189GESTURE-RESPONSIVE USER INTERFACE FOR AN ELECTRONIC DEVICE HAVING A COLOR CODED 3D SPACE - This disclosure provides systems, methods and apparatus, including computer programs encoded on computer storage media, for three dimensional position determination of an object. In one aspect, a first electromagnetic (EM) radiation and a second EM radiation is emitted, toward a position-sensing volume, the first and second EM radiation each having a respective, different, wavelength. Scattered radiation, resulting from interaction of the emitted first and second EM radiation with an object located in the position-sensing volume, is detected. Characteristics of the detected scattered radiation have a correlation with a position of the object in the position-sensing volume. A three dimensional position of the object is determined, from the correlation. The position-sensing volume may be proximate to, and extend above, an external surface of a display screen.05-30-2013
20130135197TRANSMISSIVE DISPLAY APPARATUS AND OPERATION INPUT METHOD - A transmissive display apparatus includes an operation section that detects an operational input issued through an operation surface, an image output section for the left eye and an image output section for the right eye that output predetermined image light, an image pickup section that visually presents the predetermined image light in an image pickup area that transmits externally incident light, a sensor that outputs a signal according to a positional relationship between the operation surface and the image pickup area, and a determination section that detects overlap between an optical image of the operation surface that has passed through the image pickup area and the predetermined image in the image pickup area based on the signal according to the positional relationship and receives the operational input when the optical image of the operation surface overlaps with the predetermined image.05-30-2013
20130113696USER INTERFACE DISPLAYING FILTERED INFORMATION - A trigger event is set, based on information in an information flow. One or more actions are set to occur in response to occurrence of the trigger event. The information flow is received. The set one or more actions are performed upon occurrence of the trigger event, and a user interface is displayed based on the information flow. The actions include, but are not limited to, filtering display of information from the information flow in response to occurrence of the trigger event.05-09-2013
20130201102MOBILE COMMUNICATION DEVICE WITH THREE-DIMENSIONAL SENSING AND A METHOD THEREFORE - A mobile communication device and a method for three-dimensional sensing of objects in a spatial volume above a display of the mobile communication device. The mobile communication device comprises input means having at least two sensors configured for collecting data about objects in said spatial volume and a processing logic for processing spatial object data. The method comprises, and the mobile communication device is configured to perform, the following steps: receiving a detection signal from at least one of the sensors indicating that an object is present above the display; determining the distance to the detected object; looking up weight parameters associated with each sensor in a look-up table, said weight parameters being dependent on the determined distance; collecting data about the detected object from each sensor; and calculating the position of the detected object by using the collected data together with the looked-up weight parameters.08-08-2013
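The fusion step the abstract describes, per-sensor weights looked up by the determined distance and then applied to each sensor's position estimate, might look like the following sketch. The two-sensor setup, the distance bands, and all table values are illustrative assumptions rather than details from the application.

```python
# Hypothetical sketch of distance-weighted sensor fusion: each sensor
# reports a 2-D position estimate, a weight pair is looked up by distance
# band, and the final position is the weighted average of the estimates.

WEIGHT_TABLE = {
    # distance band: (weight for sensor A, weight for sensor B) -- assumed
    "near": (0.8, 0.2),  # distance < 5 cm: trust sensor A more
    "far": (0.3, 0.7),   # distance >= 5 cm: trust sensor B more
}

def fuse_position(distance_cm, pos_a, pos_b):
    """Combine two (x, y) estimates using distance-dependent weights."""
    band = "near" if distance_cm < 5 else "far"
    wa, wb = WEIGHT_TABLE[band]
    x = wa * pos_a[0] + wb * pos_b[0]
    y = wa * pos_a[1] + wb * pos_b[1]
    return (x, y)
```

For example, with an object 3 cm above the display, `fuse_position(3, (10, 0), (20, 0))` leans toward sensor A's estimate and yields (12.0, 0.0) under the assumed weights.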
20130093669APPARATUS AND METHOD FOR CONTROLLING SCREEN IN WIRELESS TERMINAL - An apparatus and method for controlling a screen in a wireless terminal that includes a bendable display unit, allowing the screen to be controlled easily. The apparatus includes a bendable display unit, a bending detector for detecting bending of the bendable display unit, and a controller for controlling the bendable display unit to up-scale and display data in a non-bent screen region of the bendable display unit when the bendable display unit is bent.04-18-2013
20130093668METHODS AND APPARATUS FOR TRANSMITTING/RECEIVING CALLIGRAPHED WRITING MESSAGE - Methods and apparatus are provided for transmitting and receiving a calligraphed writing message. Writing data is received as input at a transmitting apparatus. The writing data is sampled to generate character frame data having a plurality of point data. Calligraphy control point data is generated using the character frame data. The calligraphed writing message including the calligraphy control point data is generated and transmitted to a receiving apparatus. A calligraphy outline is generated at the receiving apparatus for generation of a calligraphed writing image using the calligraphy control point data from the calligraphed writing message. Graphic processing is performed on the calligraphy outline. The calligraphed writing image is generated and displayed.04-18-2013
20130093667METHODS AND DEVICES FOR MANAGING VIEWS DISPLAYED ON AN ELECTRONIC DEVICE - A mobile device is disclosed. The mobile device may include a display device configured to generate a first graphical configuration at a first orientation of the mobile device and a second graphical configuration at a second orientation of the mobile device, at least one input device, and an input linking module configured to selectively link the at least one input device to either the first or second graphical configuration based on an orientation of the mobile device.04-18-2013
20130093664DRAWING DEVICE, DRAWING CONTROL METHOD, AND DRAWING CONTROL PROGRAM FOR DRAWING GRAPHICS IN ACCORDANCE WITH INPUT THROUGH INPUT DEVICE THAT ALLOWS FOR INPUT AT MULTIPLE POINTS - A game device includes: an input position acquiring unit for acquiring the position of an input entry to a touch panel capable of concurrently detecting input entries at a plurality of points; a drawing start determination unit for determining, when the respective positions of concurrent input entries to the touch panel at two points are acquired, to start drawing a graphic if a distance between the two points is within a first range; a drawing unit for drawing, while the input entries at the two points continue to be input after drawing a graphic is started, a graphic calculated based on the respective movement trajectories of the two points and displaying the graphic on a display device; and a drawing end determination unit for determining to end drawing the graphic when the input of the input entries at the two points ends.04-18-2013
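The drawing-start determination above, begin drawing only when the two concurrent touch points are separated by a distance within a first range, reduces to a simple distance check. The range bounds below are illustrative assumptions; the application does not give concrete values.

```python
# Hypothetical sketch of the drawing-start test: two concurrent touch
# points start a drawing only if their separation lies within an assumed
# "first range" of distances.
import math

MIN_DIST, MAX_DIST = 20.0, 200.0  # pixels; assumed range bounds

def should_start_drawing(p1, p2):
    """Return True if touches at p1 and p2 should start a drawing."""
    d = math.dist(p1, p2)  # Euclidean distance between the two points
    return MIN_DIST <= d <= MAX_DIST
```

Touches that are nearly coincident (likely one finger detected twice) or far apart (likely unrelated gestures) are thereby rejected before the drawing unit begins tracking the two movement trajectories.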
20130093663LIGHT DEFLECTOR AND LIQUID CRYSTAL DISPLAY DEVICE USING THE SAME - A light deflector capable of deflecting light in a predetermined deflection direction and modulating the angle of deflection of light includes a plurality of liquid crystal deflection elements arranged in the predetermined deflection direction. In at least one pair of adjacent liquid crystal deflection elements, the dimension of one of the liquid crystal deflection elements in the predetermined deflection direction is different from the dimension of the other liquid crystal deflection element in the predetermined deflection direction.04-18-2013
20130093662SYSTEM AND METHOD OF MODE-SWITCHING FOR A COMPUTING DEVICE - A first device such as a portable computing device can be configured to act as a text-entry device (in a text-entry mode) and a cursor control device (in a cursor control mode) for a second device. The first device can include a touch-sensitive display capable of receiving text inputs and cursor inputs for controlling the display of a second device which is communicatively coupled to the first device. The first device can be configured such that selection of certain items displayed by the second device can cause the first device to switch from a text-entry mode to a cursor control mode. The first device can be configured such that rotation of the device between a landscape orientation and a portrait orientation causes the device to switch between modes. The first device can be configured such that sideways movement of the device causes the device to switch between modes.04-18-2013
20130093660METHOD AND SYSTEM TO CONTROL A PROCESS WITH BEND MOVEMENTS - Various embodiments include devices, methods, circuits, data structures, and software that allow for control of a process through detecting of movements in bends in a flexible display device. In an example, a method for controlling a process through bend movements can include detecting movement of a bend in a deformable display panel and modifying a process running on a computing device in response to detecting the movement of the bend.04-18-2013
20130093659AUTOMATIC ADJUSTMENT LOGICAL POSITIONS OF MULTIPLE SCREEN - A method and a computer system are provided for automatically setting the logical positions of multiple screen displays. A computer system may comprise a plurality of display devices, at least one image capturing device, and a controller. The controller may be coupled to the display devices and image capturing devices. The adjustment module may be adapted to adjust the plurality of the display settings.04-18-2013
20130093665METHOD AND APPARATUS FOR PROCESSING VIRTUAL WORLD - A virtual world processing apparatus and method. An angle value is obtained by measuring an angle of a body part of a user of a real world using sensor capability, which is information on capability of a bending sensor, and is transmitted to a virtual world, thereby achieving interaction between the real world and the virtual world. In addition, based on the sensor capability and the angle value denoting the angle of the body part, control information is generated to control a part of an avatar of the virtual world, corresponding to the body part, and then transmitted to the virtual world. Accordingly, interaction between the real world and the virtual world is achieved.04-18-2013
20130113702MOBILE TERMINAL AND OPERATION CONTROL METHOD - There is provided a mobile terminal making it possible to normally use applications developed for other mobile terminals that perform different screen display mode switching control.05-09-2013
20130113701IMAGE GENERATION DEVICE - An image generation device is disclosed.05-09-2013
20130093666PROJECTOR AND IMAGE DRAWING METHOD - A projector includes a recognition section adapted to recognize an image to be drawn and a drawing area in which the image is drawn based on a trajectory of a symbol written to a projection surface, an acquisition section adapted to acquire image data representing the image recognized by the recognition section, and a drawing control section adapted to perform control so that the image represented by the image data acquired by the acquisition section is drawn in the drawing area of the projection surface recognized by the recognition section.04-18-2013
20130100011HUMAN-MACHINE INTERFACE DEVICE - A human-machine interface device suitable for being electrically connected to an electronic device. The human-machine interface device includes a flexible carrier having at least one flexible portion, a bending sensor, and a control module. The bending sensor is disposed on the flexible portion of the flexible carrier. The control module is disposed on the carrier, connected to the bending sensor, and electrically connected to the electronic device. A first operation signal from the bending sensor is transmitted to the electronic device through the control module so that the electronic device performs according to the first operation signal.04-25-2013
20130100014DEVICE FOR HANDLING BANKNOTES - The invention relates to a device for handling banknotes.04-25-2013
20130100013PORTABLE TERMINAL AND METHOD OF SHARING A COMPONENT THEREOF - A portable terminal capable of sharing at least a portion of a component, and a method therefor, are provided. The portable terminal includes an interface unit configured to connect with another portable terminal and a controller configured to activate a sharing mode in which at least a portion of a component is shared with the other portable terminal when connected with the other portable terminal through the interface unit.04-25-2013
20130113700OPERATING AND MONITORING SCREEN DISPLAYING APPARATUS AND METHOD - An operating and monitoring screen displaying apparatus according to the present disclosure selects a screen among a plurality of split display screens displaying information about a part of a process in a plant and displays the selected screen as an operating and monitoring screen. The apparatus includes a receipt unit which receives a selection of a screen to be displayed among a plurality of split display screens; a split screen display unit which displays the screen selected through the receipt unit; and a map display unit which displays a map in which process elements constituting the process are associated with display areas, and indicates, on the map, a display area associated with a process element included in the screen being displayed by the split screen display unit.05-09-2013
20130113699APPARATUS AND METHOD FOR CONTROLLING CONTROLLABLE DEVICE IN PORTABLE TERMINAL - An apparatus and method control a controllable device in a portable terminal. The apparatus includes a device register unit, an input unit, a memory unit, a gesture detector, a controller, and a contents transmitter. The device register unit registers controllable devices to partial regions of an output screen. The input unit generates input data for controllable device registration or user's input data for controllable device selection. The memory unit stores information of the controllable devices registered by the device register unit. The gesture detector senses a user's input capable of determining direction. The controller detects a controllable device corresponding to the user's input. The contents transmitter transmits contents to the controllable device detected by the controller.05-09-2013
20130113698STORAGE MEDIUM, INPUT TERMINAL DEVICE, CONTROL SYSTEM, AND CONTROL METHOD - An input terminal device of one example includes a CPU, and the CPU transmits operation data to a game apparatus executing game processing in response to an operation by a user. Furthermore, when a TV remote control button provided to the input terminal device is operated, the CPU sets a TV control mode capable of operating a television, to thereby display a TV remote controller image on an LCD of the input terminal device. By using this image, the user inputs operation data of the television.05-09-2013
20130113697DISPLAY CONTROL DEVICE - A display control device is disclosed for a computer system to control one or more display devices. The display control device includes a processing unit, a memory unit, a connection module, and a video control unit. The memory unit records the display control data generated by the processing unit. The connection module is for connecting the display devices, and for transmitting the video data generated by the computer system when reading the display control data to the display devices. The video control unit is coupled to the computer system, the connection module, and the processing unit. The video control unit is controlled by the processing unit for selecting the display device connected with the connection module, and outputs the video data generated by the computer system to the selected display device through the connection module. Therefore, the computer system may communicate with display devices having different standards.05-09-2013
20130100010Control Method and System of Brain Computer Interface with Stepping Delay Flickering Sequence - A control method of a brain computer interface (BCI) with a stepping delay flickering sequence is provided. First, a plurality of different flickering sequence signals are generated by encoding a static flashing segment and a plurality of stepping delay flashing segments divided in different time sequences. Then, a plurality of target images corresponding to the flickering sequence signals are displayed. Thereafter, a response signal generated by an organism evoked by the target images is acquired. Then, signal processing is performed on the response signal by using a mathematic method to distinguish which one of the target images is gazed at by the organism. Thereafter, a controlling command corresponding to one of the target images is generated. A control system of a BCI with a stepping delay flickering sequence is also provided herein.04-25-2013
20110267260INTERACTIVE DISPLAY APPARATUS AND OPERATING METHOD THEREOF - Provided are an interactive display apparatus and method for operating an interactive display apparatus, the apparatus including: a capturing unit operable to capture an image including a displayed image which is displayed on a display screen and a light of a pointing device irradiated onto the display screen; a memory which stores the captured image; a detection unit which determines coordinate information for the light of the pointing device irradiated onto the display screen based on the captured image; and a controller which determines whether the light of the pointing device is moved outside of the displayed image based on the coordinate information, and controls the apparatus to perform a predetermined operation from among a plurality of predetermined operations if the light of the pointing device is moved outside of the displayed image.11-03-2011
20130120238LIGHT CONTROL METHOD AND LIGHTING DEVICE USING THE SAME - There is herein described a light output control method for controlling a lighting device by a motion of an object in an environment, the lighting device comprising a video sensor and a light-emitting unit, the light output control method comprising steps of emitting an infrared light onto at least a part of the object and at least a part of the environment, collecting the infrared light reflected by at least the part of the object and at least the part of the environment as a two-dimensional depth data sequence of the video sensor, computing the motion of the object by utilizing the two-dimensional depth data sequence, and controlling the light-emitting unit to change an attribute of the output light if the motion of the object complies with a predetermined condition.05-16-2013
20130120237Systems and Methods for Synthesizing High Fidelity Stroke Data for Lower Dimension Input Strokes - Systems and methods for synthesizing paintbrush strokes may use high fidelity pose data of reference strokes to supplement lower dimension stroke data. For example, 6DOF data representing reference strokes created by skilled artists may be captured and stored in a library of reference strokes. A query stroke made by a less skilled artist/user and/or made using input devices that do not provide 6DOF data may be processed by a stroke synthesis engine to produce an output stroke that follows the trajectory of the query stroke and includes pose data from one or more reference strokes. The stroke synthesis engine may construct feature vectors for samples of reference strokes and query strokes, select the best neighbor feature vector from the library for each query stroke sample, assign the pose data of the best neighbor to the query sample, and smooth the sequence of assigned poses to produce the synthesized output stroke.05-16-2013
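The synthesis pipeline described above, build a feature vector per query-stroke sample, pick the best-neighbor reference feature from the library, assign its pose to the sample, then smooth the assigned pose sequence, can be sketched as below. For brevity the sketch uses scalar features and scalar poses with a moving-average smoother; these are illustrative simplifications of the 6DOF pose data and feature vectors the application describes.

```python
# Hypothetical sketch of best-neighbor pose assignment plus smoothing,
# using scalar stand-ins for the real feature vectors and 6DOF poses.

def nearest_pose(query_feature, library):
    """library: list of (feature, pose) pairs; return pose of best neighbor."""
    best = min(library, key=lambda fp: abs(fp[0] - query_feature))
    return best[1]

def synthesize(query_features, library, window=3):
    """Assign each query sample its best-neighbor pose, then smooth."""
    poses = [nearest_pose(f, library) for f in query_features]
    # Moving-average smoothing over the assigned pose sequence.
    smoothed = []
    for i in range(len(poses)):
        lo = max(0, i - window // 2)
        hi = min(len(poses), i + window // 2 + 1)
        smoothed.append(sum(poses[lo:hi]) / (hi - lo))
    return smoothed
```

The smoothing step matters because adjacent query samples can match different reference strokes, and raw best-neighbor poses would then jump discontinuously along the output stroke.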
20130127707Method, an Apparatus and a Computer Program for Controlling an Output from a Display of an Apparatus - A method including: displaying information corresponding to a first output state; temporarily displaying information corresponding to a second output state while a user actuation is occurring; and displaying information corresponding to the first output state when the user actuation is no longer occurring.05-23-2013
20130127708CELL-PHONE BASED WIRELESS AND MOBILE BRAIN-MACHINE INTERFACE - Techniques and systems are disclosed for implementing a brain-computer interface. In one aspect, a system for implementing a brain-computer interface includes a stimulator to provide at least one stimulus to a user to elicit at least one electroencephalogram (EEG) signal from the user. An EEG acquisition unit is in communication with the user to receive and record the at least one EEG signal elicited from the user. Additionally, a data processing unit is in wireless communication with the EEG acquisition unit to receive and process the recorded at least one EEG signal to perform at least one of: sending a feedback signal to the user, or executing an operation on the data processing unit.05-23-2013
20130127703Methods and Apparatus for Modifying Typographic Attributes - Methods and apparatus for various embodiments of a text adjustment tool provide direct modification of typographic attributes of displayed text by translating indications of movement across a display, received via user input, into typographic attribute modifications. The text adjustment tool gives a user an intuitive way to modify typographic attributes of displayed text without needing to access any menu windows or even to know which typographic attribute is being modified. The intuitive nature of the text adjustment tool is reflected by the fact that at no point during a modification of a typographic attribute of targeted text do a user's eyes ever need to leave the text being modified.05-23-2013
20130127705APPARATUS FOR TOUCHING PROJECTION OF 3D IMAGES ON INFRARED SCREEN USING SINGLE-INFRARED CAMERA - The present invention relates to an apparatus for touching a projection of a 3D image on an infrared screen using a single infrared camera, and more specifically to an apparatus for touching a projection of a 3D image, which projects an image into free space, recognizes the position touched by a user on the projected image, and can thus process a command from the user on the basis of the recognized touch position. The present invention can provide tangible and interactive user interfaces to users. In particular, it is possible to implement more varied user interfaces (UIs) than an apparatus for touching a projection of a 2D image of the related art, by using the Z-axis coordinate on the infrared screen as depth information.05-23-2013
20130127704SPATIAL TOUCH APPARATUS USING SINGLE INFRARED CAMERA - The present disclosure relates to a spatial touch apparatus using a single infrared camera. More particularly, it relates to a spatial touch apparatus to implement a virtual touch screen in a free space by using an infrared light emitting diode (LED) array and a single infrared camera and to calculate X-axis and Z-axis coordinates of the infrared screen touched by a user indication object. Therefore, the present invention will provide tangible and interactive user interfaces to users and can implement more various user interfaces (UI) than a conventional 2D touch apparatus.05-23-2013
20130135193ELECTROMECHANICAL SYSTEMS DISPLAY APPARATUS INCORPORATING CHARGE DISSIPATION SURFACES - This disclosure provides systems, methods and apparatus for dissipating charge buildup within a display element with a conductive layer. The conductive layer is maintained in electrical contact with a fluid within the display element. The fluid, in turn, remains in contact with light modulators within the display elements. Any charge buildup that may be caused by the filling of the fluid during fabrication of the display device, or during operation of the light modulators can be dissipated by the conductive layer. Thus, by dissipating the charge buildup, the conductive layer reduces or eliminates electrostatic forces due to the charge buildup that may affect the operability of the light modulators. The display can include conductive spacers in an active display region of the display and a spacer-free region that allows the substrates to deform while retaining an electrical connection between the conductive layer and the spacers in the active display region.05-30-2013
20130135188GESTURE-RESPONSIVE USER INTERFACE FOR AN ELECTRONIC DEVICE - This disclosure provides systems, methods and apparatus, including computer programs encoded on computer storage media, for providing a gesture-responsive user interface for an electronic device. In one aspect, an apparatus or electronic device has an interactive display that provides an input/output (I/O) interface to a user of the apparatus. The apparatus includes a processor, a light emitting source and at least two light sensors. A secondary optical lens structures emitted light from the light emitting source into at least one lobe. Each light sensor outputs, to the processor, a signal representative of a characteristic of received light, where the received light results from scattering of the structured emitted light by an object. The processor effectuates the I/O interface by recognizing, from the output of the light sensors, an instance of a user gesture, and to control the interactive display and/or the apparatus in response to the user gesture.05-30-2013
20130135192GESTURE RECOGNITION APPARATUS, METHOD THEREOF AND PROGRAM THEREFOR - A gesture recognition apparatus detects a locus of a fingertip position of a user from an acquired moving image; sets an effective range within which a flap action is to be detected from the moving image; determines whether or not the locus of the fingertip position represents a flap action when the locus is included in the effective range; and recognizes a gesture of the user from the flap action when it does.05-30-2013
20130135200Electronic Device and Method for Controlling Same - Provided is an electronic device whereby input characters can be used in a desired application by a simple operation, and a method for controlling the electronic device. While a screen is displayed, if a leftward shaking operation is detected by a motion sensor, a control unit displays, instead of the screen of a notepad application, the screen of a browser application different from the notepad application. If a detecting unit then detects the operation of a user's finger being lifted from the surface of a display unit, the control unit enters the input text "BOU-SUI MOBILE", displayed on the display unit, into a search box of the browser application and launches the browser application.05-30-2013
20130135199SYSTEM AND METHOD FOR USER INTERACTION WITH PROJECTED CONTENT - A system and method for user interaction with projected content are provided. A computer generated image is projected onto a surface, the computer generated image comprising at least one symbol. The projected computer generated image is imaged to obtain a sensor image. The location of the symbol within the sensor image is detected and based on the location of the symbol in the sensor image a device may be controlled.05-30-2013
20130135196METHOD FOR OPERATING USER FUNCTIONS BASED ON EYE TRACKING AND MOBILE DEVICE ADAPTED THERETO - An eye tracking based user function controlling method and a mobile device adapted thereto are provided. A camera unit of a mobile device is activated while a specific user function is executed. A gaze angle of a user's eye is acquired from an image obtained via the camera unit. An eye tracking function whose execution state is controlled according to the gaze angle is then executed.05-30-2013
20130141327GESTURE INPUT METHOD AND SYSTEM - A gesture input method is provided. The method is used in a gesture input system to control a content of a display. The method includes: capturing, by a first image capturing device, a hand of a user and generating a first grayscale image; capturing, by a second image capturing device, the hand of the user and generating a second grayscale image; detecting, by an object detection unit, the first and second grayscale images to obtain a first imaging position and a second imaging position corresponding to the first and second grayscale images, respectively; calculating, by a triangulation unit, a three-dimensional space coordinate of the hand according to the first imaging position and the second imaging position; recording, by a memory unit, a motion track of the hand formed by the three-dimensional space coordinate; and recognizing, by a gesture determining unit, the motion track and generating a gesture command.06-06-2013
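The triangulation step above, computing a 3D hand coordinate from two imaging positions, can be illustrated with the textbook rectified-stereo formulation, where depth follows from disparity between the two cameras. This is a generic sketch, not the formulation claimed in the application; the pixel focal length and baseline parameters are assumptions.

```python
def triangulate(x_left, x_right, focal_px, baseline_m):
    """Two-camera triangulation for a rectified (parallel-axis) stereo pair.

    x_left / x_right: horizontal imaging positions (pixels) of the hand
    in the first and second camera images. focal_px and baseline_m are
    the assumed-known focal length in pixels and camera separation in
    metres. Returns (X, Z): lateral offset and depth in metres.
    """
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("point must lie in front of both cameras")
    z = focal_px * baseline_m / disparity  # depth from disparity
    x = x_left * z / focal_px              # back-project to lateral offset
    return x, z
```

A motion track would then be the sequence of such (X, Z) coordinates recorded over successive frames.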
20130141324USER INTERFACE CONTROL BASED ON HEAD ORIENTATION - Embodiments distinguish among user interface elements based on head orientation. Coordinates representing a set of at least three reference points in an image of a subject gazing on the user interface elements are received by a computing device. The set includes a first reference point and a second reference point located on opposite sides of a third reference point. A first distance between the first reference point and the third reference point is determined. A second distance between the second reference point and the third reference point is determined. The computing device compares the first distance to the second distance to calculate a head orientation value. The computing device selects at least one of the user interface elements based on the head orientation value. In some embodiments, the head orientation value enables the user to navigate a user interface menu or control a character in a game.06-06-2013
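The comparison described above, two outer reference points flanking a middle one, with the two distances compared to score head orientation, is simple enough to sketch directly. The eyes/nose interpretation, the normalization, and the sign convention are illustrative assumptions; the application's actual scoring may differ.

```python
def head_orientation(left_eye, right_eye, nose):
    """Signed head-orientation value from three (x, y) reference points.

    left_eye and right_eye are the outer reference points, nose the
    middle one. Returns a value in [-1, 1]: near zero when facing
    forward, negative when the nose sits closer to the left eye,
    positive when closer to the right (assumed convention).
    """
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    d_left = dist(left_eye, nose)    # first distance
    d_right = dist(right_eye, nose)  # second distance
    total = d_left + d_right
    return (d_left - d_right) / total if total else 0.0
```

A UI could then map ranges of this value to menu elements, e.g. selecting the left-hand element when the value falls below some negative threshold.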
20130141325PORTABLE DEVICE PAIRING WITH A TRACKING SYSTEM - In embodiments of portable device pairing with a tracking system, a pairing system includes a portable device that generates device acceleration gesture data responsive to a series of motion gestures of the portable device. The pairing system also includes a tracking system that is configured for pairing with the portable device. The tracking system recognizes the series of motion gestures of the portable device and generates tracked object position gesture data. A pairing service can then determine that the series of motion gestures of the portable device corresponds to the series of motion gestures recognized by the tracking system, and communicate a pairing match notification to both the tracking system and the portable device to establish the pairing.06-06-2013
20130141329Exchanging Information Between Devices In A Medical Environment - A medical device including an operation determiner for determining operations to be performed by the medical device in response to a gesture of another device with respect to the medical device, and an operation data accessor for accessing operation data for the operations performed by the medical device.06-06-2013
20130141330STEREOSCOPIC DISPLAY DEVICE - An object of the present invention is to provide a novel stereoscopic display device that allows the viewer to properly view a stereoscopic image. The device includes: a display configured to display an image for stereoscopic viewing; an imaging unit configured to image a face of a viewer; a position information acquisition unit configured to acquire position information regarding the face imaged by the imaging unit; an operation unit configured to be operated by the viewer when the viewer is in an optimal position from where the image for stereoscopic viewing displayed on the display can be properly viewed as a stereoscopic image; an optimal position information storage unit configured to store position information provided when the operation unit is operated as position information on the optimal position; and a positional relationship notification unit configured to notify the viewer of the positional relationship between the current position of the viewer and the optimal position.06-06-2013
20080204403Transaction System, Method of Verifying a User's Authorisation to Carry Out a Transaction and Cash Dispenser - A transaction system comprises an input device for inputting of data by a user, and at least one screen which is visible during use of the input device. The system is arranged for displaying an area located at least partially behind a user facing the input device on the screen at least during use of the input device. The transaction system comprises at least one camera directed at said area as well as a device for displaying images recorded by at least one of the cameras at least partially on the screen.08-28-2008
20080204402USER INTERFACE DEVICE - A UI device of a digital camera has a display screen, a left-side touch strip provided outside of and on a left side of the display screen and a right-side touch strip provided outside of and on a right side of the display screen. The touch strips 08-28-2008
20110215999HAND-HELD ELECTRONIC DEVICE - A hand-held electronic device with a keyboard, thumbwheel, display and associated software is optimized for use of the device with the thumbs. The associated software has a plurality of features to optimize efficient use of the limited keyboard space and encourage the use of the device by thumb-based data entry through the thumbwheel and/or through a combination of minimal number of keystrokes. Software features include international character scrolling, and auto-capitalization. The keys on the device keyboard are optimally shaped and configured for thumb-based input. In addition, the thumbwheel is inclined between the front and a side edge of the device so as to be reachable by either the thumb or index finger of the user's hand at the side edge of the device.09-08-2011
20110215997METHOD AND APPARATUS FOR PROVIDING FUNCTION OF PORTABLE TERMINAL USING COLOR SENSOR - A method for providing a function of a portable terminal is provided. The method includes activating a color sensor upon execution of an application, displaying a color recognized by the color sensor on screen data corresponding to the executed application, and controlling a function based on a color recognized by the executed application.09-08-2011
20130181894DISPLAY DEVICE AND METHOD FOR LARGE SCREEN - A display device includes a display screen; a distance sensing unit, set at an appropriate position on the display screen, to sense the relative distance from a user to the display device; and a processing unit comprising a projection position determining module, a stretching area determining module, and a display control module. The projection position determining module calculates the projected position of the user on the display screen, that is, the position on the display screen onto which the user is projected, according to the position of the distance sensing unit on the display screen. The stretching area determining module determines an area to be stretched on the display screen according to the projected position and a predetermined distance. The display control module stretches content displayed in that area according to a predetermined percentage. The display method is also provided.07-18-2013
20130147706MOBILE TERMINAL AND CONTROL METHOD THEREOF - A control method of a mobile terminal is disclosed. The control method of a mobile terminal includes: acquiring a pressure signal through a pressure sensing module for sensing a change in pressure applied to at least one part of the body in at least two degrees; and generating an event for changing a display state of a display unit through a control signal to be matched to the pressure signal. Emotional quality can be improved by changing a display state of a display unit in response to a change in pressure applied to the body.06-13-2013
20080198129Indication position calculation system, indicator for indication position calculation system, game system, and indication position calculation method - An imaging section successively outputs light-reception information relating to each pixel from a starting pixel SP to an end pixel EP, the starting pixel SP being provided on one end of an image PC acquired by the imaging section and the end pixel EP being provided on the other end of the image PC. The imaging section is provided in an indicator so that the starting pixel SP is disposed on a lower side and the end pixel EP is disposed on an upper side when the indicator is held in a reference position.08-21-2008
20080198128KVM SWITCH CAPABLE OF PROVIDING EDID OF DISPLAY FOR COMPUTER COUPLED THERETO AND METHOD THEREOF - A KVM switch capable of providing the real EDID of a display, rather than default EDID, to a computer coupled thereto. The KVM switch includes a processor, at least one memory and at least one switch. The processor queries the EDID of the display when the KVM switch is booted and stores the EDID in the memory. The switch couples the computer to the display or to the memory, to provide the EDID from the display or from the memory to the computer. When the EDID is not available from the memory, the switch couples the computer to the display. When the EDID is not available from the display, the switch couples the computer to the memory. The KVM switch further includes a multiplexer. The multiplexer couples the processor to one memory at a time for transferring the EDID of the display to that memory.08-21-2008
20110221671METHOD AND APPARATUS FOR AUDIO BIOMETRIC DATA CAPTURE - A method and apparatus for audio biometric data capture are provided. The apparatus includes an interactive head-mounted eyepiece worn by a user that includes an optical assembly through which the user views a surrounding environment and displayed content. The optical assembly comprises a corrective element that corrects the user's view of the surrounding environment and an integrated processor for handling content for the user. An integrated optical sensor captures visual biometric data when the eyepiece is positioned so that a nearby individual is proximate to the eyepiece. Audio biometric data is captured using multiple microphones mounted in an endfire array in the eyepiece and is transmitted to a remote processing facility for interpretation. The remote processing facility interprets the captured audio biometric data and generates display content based on the interpretation. This display content is delivered to the eyepiece and displayed to the user.09-15-2011
20110221670METHOD AND APPARATUS FOR VISUAL BIOMETRIC DATA CAPTURE - A method and apparatus for visual biometric data capture are provided. The apparatus includes an interactive head-mounted eyepiece worn by a user that includes an optical assembly through which the user views a surrounding environment and displayed content. The optical assembly comprises a corrective element that corrects the user's view of the surrounding environment and an integrated processor for handling content for the user. An integrated optical sensor captures visual biometric data when the eyepiece is positioned so that a nearby individual is proximate to the eyepiece. Visual biometric data is captured using the eyepiece and is transmitted to a remote processing facility for interpretation. The remote processing facility interprets the captured visual biometric data and generates display content based on the interpretation. This display content is delivered to the eyepiece and displayed to the user.09-15-2011
20110221668PARTIAL VIRTUAL KEYBOARD OBSTRUCTION REMOVAL IN AN AUGMENTED REALITY EYEPIECE - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content. The displayed content is an interactive control element. An integrated camera facility of the eyepiece images a user's body part as it interacts with the interactive control element, wherein the processor removes a portion of the interactive control element by subtracting the portion of the interactive control element that is determined to be co-located with the imaged user body part based on the user's view.09-15-2011
20110221665REMOTE CONTROLLER AND CONTROL METHOD THEREOF, DISPLAY DEVICE AND CONTROL METHOD THEREOF, DISPLAY SYSTEM AND CONTROL METHOD THEREOF - A display system, a display device, and a remote controller for controlling the display device are provided. The remote controller includes: a touch screen which receives an input from a user and displays a first manipulation UI group including a shortcut key corresponding to a plurality of buttons to control the display device; a signal output unit which outputs a control signal to the display device based on an input to the touch screen; and a controller which, in response to a user's selection of the shortcut key, displays on the touch screen a second manipulation UI group and parts of the first manipulation UI group, the second manipulation UI group displaying the plurality of buttons.09-15-2011
20110221664VIEW NAVIGATION ON MOBILE DEVICE - Users may view web pages, play games, send emails, take photos, and perform other tasks using mobile devices. Unfortunately, the limited screen size and resolution of mobile devices may restrict users from adequately viewing virtual objects, such as maps, images, email, user interfaces, etc. Accordingly, one or more systems and/or techniques for displaying portions of virtual objects on a mobile device are disclosed herein. A mobile device may be configured with one or more sensors (e.g., a digital camera, an accelerometer, or a magnetometer) configured to detect motion of the mobile device (e.g., a pan, tilt, or forward/backward motion). A portion of a virtual object may be determined based upon the detected motion and displayed on the mobile device. For example, a view of a top portion of an email may be displayed on a cell phone based upon the user panning the cell phone in an upward direction.09-15-2011
20130093670IMAGE GENERATION DEVICE, METHOD, AND INTEGRATED CIRCUIT - When a television screen is split into sub-screens and the sub-screens are allocated to a plurality of operators, a television appropriately controls display positions and sizes of the sub-screens according to a position relationship between the operators, distances between the operators and the sub-screens, and rearrangement of the positions of the operators. Specifically, the television includes an external information obtaining unit that obtains position information items of the gesture operations and a generation unit that generates an image in a layout set based on a relative position relationship between the position information items.04-18-2013
20130135195SYSTEMS FOR CONTROLLING AN ELECTRIC DEVICE USING A CONTROL APPARATUS - An electrical switch apparatus including a movement sensitive form is disclosed. The apparatus includes a housing, a motion sensor and a processing unit, where motion on, near or about the motion sensor is translated into output commands adapted for list scrolling, where the list can be arranged in a hierarchy such as menus or for changing a value of an attribute of an electrical device under the control of the switch.05-30-2013
20130147703DISPLAY DEVICE WITH OPTION INTERACTION - A display device with option interaction includes a display device provided with an interaction device connected to a signal source. When an image is displayed on the display device for option interaction, the interaction device receives an option signal from the display device and transmits it back, providing the display device with a mechanism for interacting with a consultative program. This makes it easy to conduct interaction for questionnaires, sampling, and voting, allowing the user of the display device to participate in opinion expression on public affairs and improving the efficiency, fairness, and impartiality of sampling public opinion.06-13-2013
20130147701METHODS AND DEVICES FOR IDENTIFYING A GESTURE - Methods and mobile electronic devices for identifying a gesture on a mobile electronic device are provided. In one embodiment, the method includes: obtaining camera data of a subject from a camera on the mobile electronic device; obtaining device movement data from a sensor on the mobile electronic device, the device movement data representing physical movement of the mobile electronic device; and based on the camera data and the device movement data, interpreting movement of the subject as a predetermined input command associated with a predetermined gesture when movement of the subject captured by the camera corresponds to the predetermined gesture.06-13-2013
20130147702Method, Apparatus, Computer Program and User Interface - A method, apparatus, computer program and user interface wherein the method comprises: detecting a user input at a first apparatus; determining that the user input was also detectable by a second apparatus; and causing a function to be performed where at least part of the function is performed by the first apparatus and at least part of the function is performed by the second apparatus.06-13-2013
20130147704ELECTRONIC DEVICE PROVIDING SHAKE GESTURE IDENTIFICATION AND METHOD - A method to identify a shake gesture using an electronic device. The electronic device reads reference acceleration values from an accelerometer of the electronic device when a user shakes the electronic device to establish the shake gesture. The electronic device generates the shake gesture by a sorting algorithm according to the reference acceleration values and associates the shake gesture with an application. The electronic device starts the application corresponding to the shake gesture.06-13-2013
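The abstract above mentions deriving a shake gesture from sorted reference acceleration values; the exact algorithm is not disclosed. One plausible reading, sketched here purely as an illustration, is to sort the recorded reference samples and take a low percentile as a per-user threshold, then detect a shake when enough live samples exceed it. The percentile rule and hit count are assumptions.

```python
def shake_threshold(reference_values, keep=0.9):
    """Derive a shake threshold from recorded reference accelerations.

    Sorts the reference samples and returns a value that the top `keep`
    fraction of them exceed (hypothetical interpretation of the sorting
    step; the patented algorithm is not disclosed in the abstract).
    """
    ordered = sorted(reference_values)
    idx = min(round(len(ordered) * (1 - keep)), len(ordered) - 1)
    return ordered[idx]

def is_shake(samples, threshold, min_hits=3):
    """Report a shake when enough samples exceed the learned threshold."""
    return sum(1 for s in samples if abs(s) >= threshold) >= min_hits
```

The device would then look up the application associated with the recognized shake gesture and launch it.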
20130147705DISPLAY APPARATUS AND CONTROL METHOD THEREOF - A display apparatus and control method are provided. The display apparatus includes a storage unit which is provided to store biometric information of an individual user and metadata corresponding to the biometric information; a biometric information input unit through which biometric information of a plurality of users is input; a display unit; and a controller which determines metadata corresponding to a first user by comparing the input biometric information of the first user with the stored biometric information, determines metadata corresponding to a second user by comparing the input biometric information of the second user with the stored biometric information, and generates information of a user group comprising the first and second users based on common metadata between the metadata of the first user and the metadata of the second user.06-13-2013
20100283723STORAGE MEDIUM STORING INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD - A game apparatus includes a CPU, the CPU judges a motion of a player on the basis of a cycle of a load value input from a load controller, and selectively displays an animation of a player object according to the motion. Furthermore, the CPU controls a moving amount or a moving velocity of the player object on the basis of the load value input from the load controller, and controls a moving direction of the player object according to a barycentric position of the player detected by the load controller.11-11-2010
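A load controller of this kind typically reports one load value per corner sensor, and the barycentric (center-of-pressure) position mentioned above can be computed from those four values. The four-corner layout and the normalization below are common-practice assumptions, not details taken from the application.

```python
def center_of_pressure(tl, tr, bl, br):
    """Center-of-pressure estimate from four corner load values.

    tl, tr, bl, br: load values from the top-left, top-right,
    bottom-left and bottom-right sensors (assumed layout). Returns
    (x, y) in [-1, 1], where (0, 0) means the player's weight is
    centered on the board.
    """
    total = tl + tr + bl + br
    if total == 0:
        return 0.0, 0.0
    x = ((tr + br) - (tl + bl)) / total  # right minus left
    y = ((tl + tr) - (bl + br)) / total  # top minus bottom
    return x, y
```

The game could then steer the player object left or right according to the sign and magnitude of `x`, while the cyclic variation of `total` over time drives the animation and moving velocity.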
20100283722ELECTRONIC APPARATUS INCLUDING A COORDINATE INPUT SURFACE AND METHOD FOR CONTROLLING SUCH AN ELECTRONIC APPARATUS - An electronic apparatus includes a coordinate input surface on which at least a finger of a user can be placed, a first position estimating unit and a second position obtaining unit. The first position estimating unit is for estimating the position, here referred to as first position, of at least one object placed on the coordinate input surface. The second position obtaining unit is for obtaining an estimation of the position, here referred to as second position, at which a user is looking on the coordinate input surface. The apparatus is configured to be controlled at least based on the combination of the estimated first position and the estimated second position. The invention also relates to a system including such an apparatus, a method for controlling such an apparatus, and a computer program therefor.11-11-2010
20120256822LEARNER RESPONSE SYSTEM - There is provided a method, in a collaborative input system comprising a computer and a plurality of user terminals, in which system the user terminals are adapted to communicate with the computer system, the method comprising: defining one or more user groups; and allocating one or more of the plurality of user terminals to the one or more user groups.10-11-2012
20100309117INCLINATION CALCULATION APPARATUS AND INCLINATION CALCULATION PROGRAM, AND GAME APPARATUS AND GAME PROGRAM - An inclination calculation apparatus sequentially calculates an inclination of an input device operable in terms of a posture thereof. The input device includes acceleration detect