Entries
Document | Title | Date |
20080198128 | KVM SWITCH CAPABLE OF PROVIDING EDID OF DISPLAY FOR COMPUTER COUPLED THERETO AND METHOD THEREOF - A KVM switch capable of providing real EDID of a display, rather than default EDID, to a computer coupled thereto. The KVM switch includes a processor, at least one memory and at least one switch. The processor queries EDID of the display when the KVM switch is booted and stores the EDID in the memory. The switch couples the computer to the display or to the memory to provide the EDID from the display or from the memory to the computer. When the EDID is not available from the memory, the switch couples the computer to the display. When the EDID is not available from the display, the switch couples the computer to the memory. The KVM switch further includes a multiplexer. The multiplexer couples the processor to one memory at a time for transferring the EDID of the display to that memory. | 08-21-2008 |
20080198129 | Indication position calculation system, indicator for indication position calculation system, game system, and indication position calculation method - An imaging section successively outputs light-reception information relating to each pixel from a starting pixel SP to an end pixel EP, the starting pixel SP being provided on one end of an image PC acquired by the imaging section and the end pixel EP being provided on the other end of the image PC. The imaging section is provided in an indicator so that the starting pixel SP is disposed on a lower side and the end pixel EP is disposed on an upper side when the indicator is held in a reference position. | 08-21-2008 |
20080204401 | Control apparatus - The present invention provides a control apparatus, which includes a casing with two sides each comprising a holding portion extending outwardly and towards a bottom thereof for being held by a user while viewing a front of the casing, the front being inclined progressively from a lateral edge of the front away from the user, through the bottom, to an opposite lateral edge of the front near the user; a direction input portion disposed on the front of the casing near one of the holding portions; an operation input portion disposed on the front of the casing near the other holding portion; and a sensing input portion disposed on the front of the casing between the direction input portion and the operation input portion and proximate to the user. Thus, a user can easily press or push any input portion for an input while holding the two holding portions with both hands. | 08-28-2008 |
20080204402 | USER INTERFACE DEVICE - A UI device of a digital camera has a display screen, a left-side touch strip provided outside of and on a left side of the display screen and a right-side touch strip provided outside of and on a right side of the display screen. The touch strips | 08-28-2008 |
20080204403 | Transaction System, Method of Verifying a User's Authorisation to Carry Out a Transaction and Cash Dispenser - A transaction system comprises an input device for inputting of data by a user, and at least one screen which is visible during use of the input device. The system is arranged for displaying an area located at least partially behind a user facing the input device on the screen at least during use of the input device. The transaction system comprises at least one camera directed at said area as well as a device for displaying images recorded by at least one of the cameras at least partially on the screen. | 08-28-2008 |
20080204404 | Method of Controlling a Control Point Position on a Command Area and Method For Control of a Device - The invention describes a method of controlling a position (x′, y′) of a control point (c) on a command area (A | 08-28-2008 |
20080204405 | Method and System of Displaying an Exposure Condition - There is provided a device which allows a user to easily and visually judge which chip in an FEM wafer has a normal exposure condition, and which chip has an abnormal exposure condition. | 08-28-2008 |
20080204406 | COMPUTER-READABLE STORAGE MEDIUM HAVING STORED THEREIN INFORMATION PROCESSING PROGRAM AND INFORMATION PROCESSING APPARATUS - A controller includes imaging means for capturing predetermined imaging targets and acceleration detecting means for detecting an acceleration applied to an input device. Based on a tilt of the images of the imaging targets included in a captured image captured by the imaging means, a game apparatus calculates, as a first tilt, a tilt of the controller which is related to a rotation around an axis of a capturing direction of the imaging means. Further, based on the acceleration detected by the acceleration detecting means, the game apparatus calculates, as a second tilt, a tilt which is related to a rotation around an axis of a direction different from the capturing direction. The game apparatus executes a predetermined process using the first tilt and the second tilt as an orientation of the controller. | 08-28-2008 |
20080204407 | COMPUTER-READABLE STORAGE MEDIUM HAVING STORED THEREIN INFORMATION PROCESSING PROGRAM AND INFORMATION PROCESSING APPARATUS - A controller includes imaging means for capturing predetermined imaging targets and acceleration detecting means for detecting an acceleration applied to an input device. A game apparatus repeatedly calculates a target position of the images of the imaging targets included in a captured image captured by the imaging means, and then senses a predetermined motion (a motion in an upward direction) of the input device based on the target position calculated within a predetermined time period. Further, the game apparatus senses the predetermined motion of the input device based on the acceleration detected by the acceleration detecting means. Furthermore, the game apparatus executes a predetermined process (a process of causing a player character to perform a jump) when the predetermined motion has been sensed in at least either one of a first motion determining step and a second motion determining step. | 08-28-2008 |
20080211766 | MULTITOUCH DATA FUSION - A method for performing multi-touch (MT) data fusion is disclosed in which multiple touch inputs occurring at about the same time are received to generate first touch data. Secondary sense data can then be combined with the first touch data to perform operations on an electronic device. The first touch data and the secondary sense data can be time-aligned and interpreted in a time-coherent manner. The first touch data can be refined in accordance with the secondary sense data, or alternatively, the secondary sense data can be interpreted in accordance with the first touch data. Additionally, the first touch data and the secondary sense data can be combined to create a new command. | 09-04-2008 |
20080211767 | SYSTEM FOR PRINTING CODED DATA PROVIDING INTERACTION WITH COMPUTER SOFTWARE - A system for enabling user interaction with computer software. The system includes a printer for receiving print data, printing a form, using the print data, with information related to an interactive element coincident with coded data indicative of the interactive element, receiving indicating data from a sensing device which is generated by the sensing device sensing the coincident coded data so as to be indicative of the interactive element, and transferring the indicating data to a computer system to allow the interaction to be interpreted. The coded data is indicative of an identity. The computer system determines, using the indicating data, the identity, determines, using the identity, a page description, and identifies, using the page description, the interactive element. | 09-04-2008 |
20080218472 | INTERFACE TO CONVERT MENTAL STATES AND FACIAL EXPRESSIONS TO APPLICATION INPUT - A method of interacting with an application includes receiving, in a processor, data generated based on signals from one or more bio-signal detectors on a user, the data representing a mental state or facial expression of the user, generating an input event based on the data representing the mental state or facial expression of the user, and passing the input event to an application. | 09-11-2008 |
20080218473 | Enhanced Artificial Intelligence Language - A method of determining an appropriate response to an input includes linking a plurality of attributes to a plurality of response templates using a plurality of Boolean expressions. Each attribute is associated with a set of patterns. Each pattern within the set of patterns is equivalent. The method also includes determining an appropriate response template from the plurality of response templates based on the input. | 09-11-2008 |
20080224994 | Slide Operation Apparatus - A slide operation apparatus capable of preventing a movable unit from being unintentionally moved in a box body, while being easy to assemble and simple in construction. The movable unit includes a gondola to which an operating element is fixed. The gondola is adapted to be movable relative to upper and lower guide bars. A sliding contact assembly includes a plate spring to which an insulation sheet is assembled. The sliding contact assembly in a curved state is mounted to a fixture portion of the operating element by having pawls of the plate spring engaged with notches formed in the fixture portion. During the entire movement process of the movable unit, the curved convex portion of the sliding contact assembly is in sliding contact with the lower guide bar. | 09-18-2008 |
20080231594 | Haptics Transmission Systems - A method of compensating for network latency in haptics transmission in which the position of a haptic effector is controlled by signals received from a network. The method comprises storing a series of locations of the haptic effector, determining from the series, using Fourier Transformation or other means, frequencies having a growing amplitude, and creating a filter function to eliminate the growing frequencies from output signals directing the force and direction of the haptic effector. | 09-25-2008 |
20080231595 | Remote control apparatus and method of interacting with a multimedia timeline user interface - Remote control devices and methods of interacting with a multimedia timeline user interface are disclosed. A remote control apparatus may include a first selector to make a selection of a graphical user interface associated with a multimedia timeline. The remote control apparatus may include a transmitter to transmit the selection to a multimedia device configured to provide the graphical user interface to a display device. The remote control apparatus may include at least one date range selector to modify a date range displayed on the graphical user interface. | 09-25-2008 |
20080246723 | Integrated button activation sensing and proximity sensing - Apparatuses and methods for coupling a group of sensor elements together in one mode to collectively measure a capacitance on the group of sensor elements, in addition to individually measuring a capacitance on each of the sensor elements in another mode. The touch-sensor buttons may be used individually for button-activation sensing, and the touch-sensor buttons may be used collectively for proximity detection. The touch-sensor buttons and a ground conductor that surrounds the touch-sensor buttons may also be collectively used for proximity detection. The apparatus may include a processing device, and a plurality of sensor elements that are individually coupled in a first mode for button-activation sensing and collectively coupled in a second mode for proximity sensing. | 10-09-2008 |
20080252593 | Information Processing Apparatus, Method and Program - The present invention relates to an information processing apparatus, method and program for controlling highlighting on a screen on the basis of an operation on a touch panel overlaid on a display. A touch-panel operation determining unit | 10-16-2008 |
20080252594 | Vibration Actuator with a Unidirectional Drive - A haptic feedback generation system includes a linear resonant actuator and a drive circuit. The drive circuit is adapted to output a unidirectional signal that is applied to the linear resonant actuator. In response, the linear resonant actuator generates haptic vibrations. | 10-16-2008 |
20080252595 | Method and Device for Virtual Navigation and Voice Processing - An apparatus for virtual navigation and voice processing is provided. A system that incorporates teachings of the present disclosure may include, for example, a computer readable storage medium having computer instructions for processing voice signals captured from a microphone array, detecting a location of an object in a touchless sensory field of the microphone array, and receiving information from a user interface in accordance with the location and voice signals. | 10-16-2008 |
20080252596 | Display Using a Three-Dimensional Vision System - An interactive video display system allows a physical object to interact with a virtual object. A light source delivers a pattern of invisible light to a three-dimensional space occupied by the physical object. A camera detects invisible light scattered by the physical object. A computer system analyzes information generated by the camera, maps the position of the physical object in the three-dimensional space, and generates a responsive image that includes the virtual object. A display presents the responsive image. | 10-16-2008 |
20080259022 | Method, system, and graphical user interface for text entry with partial word display - A computer-implemented method for text entry includes receiving entered text from a user, selecting a set of candidate sequences for completing or continuing the sequence, and presenting the candidate sequences to the user, wherein the candidate sequences include partial words. The candidate sequences are identified based on usage frequency weights stored in a tree data structure. A graphical user interface for text entry includes displaying a current input sequence of characters and the identified partial words. | 10-23-2008 |
20080259023 | Method and System of Making a Computer as a Console for Managing Another Computer - A video signal processor encodes a video signal received from the target computer through the computer interface. A peripheral interface controller is coupled to the video signal processor, and transmits the encoded video signal to the console computer according to a peripheral interface standard through the console interface. A console module is coupled to the console computer, and decodes the encoded video signal for display on the console computer and transmits a manipulation signal to the target computer through the console interface, the peripheral interface controller and the computer interface. A method of making the console computer serve as a console for managing the target computer is also disclosed. | 10-23-2008 |
20080259024 | METHOD FOR PROVIDING GRAPHICAL USER INTERFACE (GUI) AND ELECTRONIC DEVICE THEREOF - A method for providing a Graphical User Interface (GUI) and an electronic device using the same are provided. The method for providing a GUI includes receiving rotation information from an external input device by sensing movement of the external input device, and changing a display state of information output on a display using the rotation information of the external input device. Therefore, a user can output setup information conveniently, and user convenience is improved. | 10-23-2008 |
20080259025 | DEVICE FOR USER INTERFACE USING ROTATABLE INPUT DEVICE AND METHOD OF OPERATING USER INTERFACE - Provided are a device for a user interface for efficient navigation and a method of operating the user interface. The device for a user interface includes a rotatable input device which is manipulated by a user, a contact detection device which detects whether the user contacts the rotatable input device and generates a first input signal if it is detected that the user contacts the rotatable input device, a rotation detection device which detects a rotation of the input device and generates a second input signal if the rotation of the input device is detected, and a control device which performs a first operation based on at least one of the first and second input signals. | 10-23-2008 |
20080266246 | TRAVERSING GRAPHICAL LAYERS USING A SCROLLING MECHANISM IN A PHYSICAL DESIGN ENVIRONMENT - Z-axis display navigation in the design automation process of physical design, development and manufacturing of integrated circuits includes pre-selecting, using a computer mouse connected to a computer workstation processor, viewable graphical layout layers desired by a design layout debugging operator to view during an integrated circuit (IC) design debugging process. After selecting the desired viewable graphic layout layers, the layout operator uses the mouse to traverse the pre-selected viewable graphical layout layers displayed, by changing the input scrolling pattern of the mouse to scroll forward, backward, diagonally and from side to side in a plurality of directions, where a cursor on the layout screen will correspondingly move in the plurality of directions in the pre-selected viewable graphical layout layers on the layout screen corresponding to the movement of the mouse by the operator. | 10-30-2008 |
20080266247 | WIRELESS CONTROL OF MULTIPLE COMPUTERS - A system comprises a plurality of computers, a first wireless input device adapted to control any of the computers via wireless communication, and selection logic coupled to the first wireless input device. The selection logic enables a user to select one of the computers to be controlled by the first input device. | 10-30-2008 |
20080266248 | COORDINATE INFORMATION PROVIDING METHOD AND VIDEO APPARATUS THEREOF - A method for providing coordinate information to an external device and a video apparatus thereof. The coordinate information providing method includes receiving coordinate information input by a user from an input device; and transmitting a coordinate information delivery message containing the coordinate information input through the input device, to an external device connected according to a High Definition Multimedia Interface (HDMI) Consumer Electronics Control (CEC) specification. Accordingly, the video apparatus can control the external device by transferring the coordinate information to the external device. | 10-30-2008 |
20080266249 | Medical overlay mirror - Medical overlay mirror methods and related systems. | 10-30-2008 |
20080266250 | METHOD AND APPARATUS FOR DYNAMICALLY ADJUSTING GAME OR OTHER SIMULATION DIFFICULTY - A method for use with a simulation includes running the simulation, receiving information from a control interface used by a user to interact with the simulation, analyzing the received information, forming at least an indication of the user's level of skill based on the analysis of the received information, and adjusting a difficulty level of the simulation based on the indication of the user's level of skill. A storage medium storing a computer program executable by a processor based system and an apparatus for use with a simulation are also disclosed. | 10-30-2008 |
20080273008 | Interactive image game device for exercise and massage - An interactive image game device for exercise and massage couples exercise, massage and game software together to form an interactive exercise and massage device which includes a plurality of sensors controlled by a user's feet or hands and a display unit which displays game pictures to provide interaction so that users can perform exercise, massage and game playing concurrently to enjoy greater pleasure when in use. | 11-06-2008 |
20080273009 | Information Reproducing Apparatus and Method, Dj Device, and Computer Program - An information reproducing apparatus ( | 11-06-2008 |
20080278437 | COPYING DOCUMENTS FROM ELECTRONIC DISPLAYS - A system including a Multi Function Printer/Product/Peripheral (MFP) and a portable computing device adapted to allow automatic copying of documents by the MFP from the portable computing device. The portable computing device includes a display, a plurality of sensors for detecting light, a light detection module and a page changing module. The sensors are positioned to detect light from the MFP and trigger a page change automatically. The MFP in accordance with the present invention includes a scanner with a platen, a page change detection module, an image processing module, a device identification module, and a document matching & retrieval module. The page change detection module is adapted to receive the images presented by the portable computing device on the platen of the scanner for copying. The page change detection module detects changes and causes the MFP to scan and output a copy. The MFP may also use image processing techniques to locate an original version of the document on a server and print that higher quality document, instead of the scan from the portable computing device. | 11-13-2008 |
20080278438 | USER INTERFACE FOR SELECTING A PHOTO TAG - There is disclosed a user interface for selecting a photo tag. In an embodiment, the user interface embodies a method of selecting a photo tag for a tagged photo, comprising: providing a tag entry field for entering a photo tag; in dependence upon a string entered by a user, displaying in a matching tag list any tags from one or more selected tag sources matching the entered string. The method may further comprise displaying a tag type for each tag appearing in the matching tag list. The method may further comprise allowing user selection of a tag in the matching tag list to complete the tag entry field. | 11-13-2008 |
20080278439 | Dual function operation input keys for portable device and operating method therefor - A dual function operation input key module which applies to portable devices, and an operating method therefor, provide a usage-status sensing switch which is selectively activated either actively or passively, and perform a corresponding operation according to the first or the second usage status sensed via the switch. | 11-13-2008 |
20080278440 | Limb image interactive apparatus for information products - A limb image interactive apparatus for information products aims to directly maneuver varying image interfaces on a display interface of the information products through movements of user's hands and feet such as dancing, exercises or data entry without relying on conventional input interfaces such as joysticks, stepping pads and screen keyboards. Image interactions on the screen can be directly and interactively maneuvered easily through the movements of user's limbs such as hands and feet to provide a more humanized operation and control interface. | 11-13-2008 |
20080278441 | USER CONTROL IN A PLAYBACK MODE - A system comprises a display and logic coupled to the display. The logic causes a user control to automatically be displayed on the display upon a user activating an input device in both a full screen playback mode and a preview playback mode. | 11-13-2008 |
20080278442 | Control apparatus and method for input screens - A control apparatus for input screens includes a display unit, a switch portion and a control unit including a microcomputer. If one of a menu switch of the switch portion and a plurality of dummy switches included in a screen displayed by the display unit is operated, the microcomputer causes the display unit to display a new screen including a plurality of dummy switches. The microcomputer estimates a time period required for the operator to watch a screen to operate the dummy switch, depending on the displayed screen (the number of dummy switches). If the sum of estimated time periods exceeds a reference time period, the microcomputer nullifies operation of the dummy switch to prevent the screen from being switched. After the lapse of a predetermined time period, the microcomputer cancels the nullification of the operation of the dummy switch. | 11-13-2008 |
20080284723 | Physical User Interface - A physical user interface is provided as an adjunct to a graphical user interface to a device having an operating system. The physical interface has a work surface or workspace that is scanned by one or more sensors capable of determining the position of objects. The work surface or workspace is sub-divided into two or more regions. Each region is representative of a user-generated command. In some examples, the one or more sensors are adapted to determine the position and orientation of one or more counters. The sensors can distinguish which region a counter is located in and what orientation it is in. The sensors provide an output signal, based on the determination, to the device. | 11-20-2008 |
20080284724 | Remote control systems that can distinguish stray light sources - Remote control systems that can distinguish predetermined light sources from stray light sources, e.g., environmental light sources and/or reflections are provided. The predetermined light sources can be disposed in asymmetric substantially linear or two-dimensional patterns. The predetermined light sources also can be configured to exhibit signature characteristics. The predetermined light sources also can output light at different signature wavelengths. The predetermined light sources also can emit light polarized in one or more predetermined polarization axes. Remote control systems of the present invention also can include methods for adjusting an allocation of predetermined light sources and/or the technique used to distinguish the predetermined light sources from the stray light sources. | 11-20-2008 |
20080284725 | FOOT-OPERATED KEY PAD - A foot operated data entry pad has a plurality of foot-operated buttons. The foot buttons are used to enter data values—e.g., numbers or symbols separately or in combination. Each button is capable of entering different data values, preferably depending on the length of time that it is pressed or on the number of times that it is pressed in succession. A small controller may be included to allow the user to control the computer's pointer, allowing the user to switch between data entry fields, as with a mouse. A heel rest may serve as both a heel rest and a button/switch for sending an electric/electronic signal. An automated voice system, or other audible and/or visual indicator system, may also be included to help the user keep track of the data value as it changes and is entered. Various embodiments are capable of entering a variety of alphanumeric data rather than a simple binary-type data set, such as yes/no or on/off, or instructions, such as a joystick used with a flight simulator program. Multiple data entry pads may optionally be used in conjunction. | 11-20-2008 |
20080284726 | System and Method for Sensory Based Media Control - An apparatus for sensory based media control is provided. A system that incorporates teachings of the present disclosure may include, for example, a media device having a controller element to receive from a media controller a first instruction to select an object in accordance with a physical handling of the media controller, and a second instruction to control the identified object or perform a search on the object in accordance with touchless finger movements. Additional embodiments are disclosed. | 11-20-2008 |
20080284727 | Input apparatus and information processing apparatus - An input apparatus connected to an information processing apparatus is provided, which includes an input unit to enter a key, a transmission unit to make itself recognized as a keyboard by an operating system (OS) on the information processing apparatus and transmit to the information processing apparatus a key code signal corresponding to the input from the input unit, a receiving unit to receive from the information processing apparatus a signal representing an electrically driven lighting state of the input apparatus, and a control unit to check the signal received from the information processing apparatus and representing the electrically driven lighting state and control transmission of the key code. | 11-20-2008 |
20080284728 | Method And System For Providing Input Mechanisms On A Handheld Electronic Device - The present invention is related to a system and method for providing input mechanisms in a handheld electronic device. The handheld electronic device includes a casing having a first surface and a second surface, where the second surface faces away from a user of the device when the first surface faces toward the user. An input control is arranged on the second surface. An input mechanism is configured to detect a user input event via the input control. A display is arranged on the first surface and configured to present a visual indication identifying the input control responsive to the user input event. | 11-20-2008 |
20080284729 | Three dimensional volumetric display input and output configurations - The present invention is a system that allows a number of 3D volumetric display or output configurations, such as dome, cubical and cylindrical volumetric displays, to interact with a number of different input configurations, such as a three-dimensional position sensing system having a volume sensing field, a planar position sensing system having a digitizing tablet, and a non-planar position sensing system having a sensing grid formed on a dome. The user interacts via the input configurations, such as by moving a digitizing stylus on the sensing grid formed on the dome enclosure surface. This interaction affects the content of the volumetric display by mapping positions and corresponding vectors of the stylus to a moving cursor within the 3D display space of the volumetric display that is offset from a tip of the stylus along the vector. | 11-20-2008 |
20080291156 | Sanitary User Interface - A user interface is configured to detect an attempt to touch a virtual button. The interface includes a first concave mirror facing a second concave mirror. The second concave mirror includes an aperture. A physical control button is arranged proximate to the first mirror and aligned such that an image of the control button in a form of a virtual button appears at the aperture. An attempt to touch the virtual button is detected, and feedback is generated at the virtual button in response to detecting the attempt to touch the virtual button by a user. | 11-27-2008 |
20080291157 | Multi-direction input device - A multi-direction input device for computers includes a directional wheel, a sliding member and a plurality of electrodes. The directional wheel has one rotational degree of freedom and two moving degrees of freedom, allowing it to be moved downwards, rolled forward, rolled backward and moved sideways, thereby connecting corresponding electrodes to generate corresponding signals. Under normal conditions the sliding member can cooperate with a movable contact to support the directional wheel. When the directional wheel is moved downwards, the sliding member drives a corresponding electrode to generate a click signal. | 11-27-2008 |
20080291158 | CHARACTER INPUT DEVICE USING BIO RADAR UNIT AND TILT SENSOR - Disclosed herein is a character input device for a mobile device or a wearable terminal. A bio radar unit senses the positions of the fingers of a user. A tilt sensor senses the tilt of the hands of the user. A microprocessor calculates the final input information of the user by processing signals received from the bio radar unit and the tilt sensor. A wireless communication module transmits the final input information to the mobile device or the wearable terminal of the user. A speaker device outputs a feedback sound corresponding to the final input information of the user. The character input device is wearable on a wrist of the user. The bio radar unit transmits a signal, measures the distance between the character input device and a finger by measuring the strength of a reflected wave reflected from the finger with which the signal collides, and measures the angle of the finger related to activation. | 11-27-2008 |
20080291159 | Computer Input Device and Method for Operating the Same - A computer input device with context-awareness functionality generates a lighting pattern in response to a context of data input, audio, a newly received message, a scheduled event or system information in a computer system. | 11-27-2008 |
20080291160 | System and method for recognizing multi-axis gestures based on handheld controller accelerometer outputs - An example gesture recognition system and method recognizes a gesture made using a handheld control device comprising an accelerometer arrangement. The example system and method involve a database of example gesture inputs derived from accelerometer arrangement outputs generated by making respective gestures with the handheld control device. Corresponding components of a current gesture input and the example gesture inputs in the database are compared using root mean square calculations and the current input gesture is recognized/not recognized based on results of the comparing. | 11-27-2008 |
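The gesture recognition abstract above compares a current gesture input against a database of example inputs using root-mean-square calculations. A minimal sketch of that comparison, assuming fixed-length accelerometer traces and an acceptance threshold (the function names, trace format and threshold value are illustrative, not from the patent):

```python
import math

def rms_distance(a, b):
    """Root-mean-square distance between two equal-length accelerometer traces."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

def recognize(current, examples, threshold=1.0):
    """Compare the current gesture input against each stored example and
    return the name of the closest match, or None if no example is within
    the acceptance threshold."""
    best_name, best_dist = None, float("inf")
    for name, example in examples.items():
        d = rms_distance(current, example)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None
```

A gesture is thus "recognized/not recognized based on results of the comparing": the nearest example wins only if its RMS distance is small enough.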
20080291161 | FORCE REFLECTING HAPTIC INTERFACE - A multi-function force reflecting haptic interface including various sub-assemblies is disclosed. The sub-assemblies include multiple function user interfaces, a user interface docking station for setting the interface to a home position, temperature monitoring and control systems, and various kinematic cable drive systems. | 11-27-2008 |
20080291162 | TRACK WHEEL WITH REDUCED SPACE REQUIREMENTS - An input generating device for use in a hand-held electronic device having a housing includes a core that is planar and semicircular in shape, a peripheral edge extending around said core, and a flexible track slidably engaged with the peripheral edge. A curved portion of the peripheral edge extends outwardly from the housing, allowing access thereto by the user. A first input is generated by sliding movement of the flexible track relative to the core. A first input detection means, such as a turns-encoder switch, detects the sliding movement of the track. The core is depressibly mounted within the housing, generating a second input when the core is depressed. A second input detection means, such as a tactile contact switch, detects depression of the core. | 11-27-2008 |
20080291163 | 3D Pointing Devices with Orientation Compensation and Improved Usability - Systems and methods according to the present invention describe 3D pointing devices which enhance usability by transforming sensed motion data from a first frame of reference (e.g., the body of the 3D pointing device) into a second frame of reference (e.g., a user's frame of reference). One exemplary embodiment of the present invention removes effects associated with a tilt orientation in which the 3D pointing device is held by a user. | 11-27-2008 |
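The 3D pointing abstract above transforms sensed motion from the device's body frame into the user's frame to remove tilt effects. A minimal sketch of that frame-of-reference transform in two dimensions: rotate the device-frame motion delta by the sensed roll angle (the function name, sign convention and availability of a roll estimate are assumptions for illustration):

```python
import math

def compensate_tilt(dx, dy, roll_rad):
    """Rotate a device-frame motion delta (dx, dy) by the sensed roll angle
    so the result is expressed in the user's upright frame of reference."""
    c, s = math.cos(roll_rad), math.sin(roll_rad)
    return (dx * c - dy * s, dx * s + dy * c)
```

With a compensated delta, cursor motion on screen tracks the user's intended direction regardless of how the device is rolled in the hand.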
20080297470 | ELECTRONIC DOCUMENT READERS AND READING DEVICES - This invention generally relates to electronic document readers and reading devices, that is, to a device that presents a document to a user on a display to enable the user to read the document. An electronic document reading device configured for one-hand operation, the device including: a housing; an electroactive display mounted in said housing; control electronics coupled to said display; at least one user control coupled to said control electronics for operating said device; and a rechargeable power source configured to power said control electronics and said display; and wherein said housing has a width to fit at least on the palm of an adult human hand, said width being less than approximately 120 mm, and wherein said housing has a length of at least twice said width, and wherein said control electronics and said rechargeable power source are disposed within said housing so as to provide a centre of mass of said device which, when a lower part of said device is held in a said palm, is located at a distance of no greater than 50% of said length from a lower end of said device. | 12-04-2008 |
20080297471 | GESTURE RECOGNITION METHOD AND TOUCH SYSTEM INCORPORATING THE SAME - A gesture recognition method includes detecting multiple pointers in close proximity to a touch surface to determine if the multiple pointers are being used to perform a known gesture. When the multiple pointers are being used to perform a known gesture, executing a command associated with the gesture. A touch system incorporating the gesture recognition method is also provided. | 12-04-2008 |
20080303782 | METHOD AND APPARATUS FOR HAPTIC ENABLED FLEXIBLE TOUCH SENSITIVE SURFACE - A method and apparatus for an electronic interactive device having a haptic enabled flexible touch sensitive surface are disclosed. In one embodiment, the electronic interactive device includes a flexible touch sensitive surface, a flexible screen (or display), and an actuator. The flexible touch sensitive surface is deposited over the flexible screen and is capable of receiving an input, such as, for example, from a user. The flexible screen displays an image via a displaying window. The actuator is coupled to the flexible screen and provides haptic feedback in response to the input. | 12-11-2008 |
20080303783 | Touchless detection display - A touchless detection display includes a transparent display, a display face, a light detector and light control material. The transparent display displays information to a viewer. The viewer views the information through the display face. The light detector detects incoming light that travels into the touchless detection display through the display face. The light control material receives incoming light that travels into the touchless detection display through the display face before the incoming light reaches the light detector. The light control material prevents portions of the incoming light that are not traveling substantially perpendicular to the display face from reaching the light detector. Locations of objects close to but not touching the display face are detected by the touchless detection display based on the incoming light detected by the light detector. | 12-11-2008 |
20080303784 | Information processing apparatus and computer-readable storage medium - An information processing apparatus includes a control portion and an IF portion. Haptic sense presentation devices are connected to the IF portion. The control portion calculates an area of an image object based on features of the image object and determines the calculated area of the image object as a virtual mass of the image object. The control portion calculates an acceleration of the image object based on the current and previous features of the image object. The control portion calculates a force to be presented to the haptic sense presentation device connected to IF portion based on the virtual mass and the acceleration of the image object and outputs a signal indicative of the calculated force to the haptic sense presentation device. | 12-11-2008 |
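The information processing abstract above derives a haptic force by treating an image object's area as a virtual mass and estimating its acceleration from current and previous features. A minimal sketch of that computation per axis (function name, tuple format and time step are illustrative assumptions):

```python
def virtual_force(area, prev_pos, curr_pos, prev_vel, dt):
    """Treat the image object's area as its virtual mass, estimate its
    acceleration from successive positions, and return (velocity, force)
    per axis, with force = virtual mass * acceleration."""
    vel = tuple((c - p) / dt for c, p in zip(curr_pos, prev_pos))
    acc = tuple((v - pv) / dt for v, pv in zip(vel, prev_vel))
    force = tuple(area * a for a in acc)
    return vel, force
```

The returned force would then be encoded in the signal sent to the haptic sense presentation device; the current velocity is carried forward as `prev_vel` for the next frame.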
20080303785 | DISPLAY APPARATUS AND METHOD FOR NOTIFYING USER OF STATE OF EXTERNAL DEVICE - A display apparatus for displaying the state of an external device and a method thereof are provided. According to the present invention, messages indicating the connection state, the power state and the sleep mode of the external device are displayed on a screen of the display apparatus to which a USB is applied. Therefore, the state of an external device which inputs a video signal to the display apparatus, may be shown on the screen, and thus a user can easily know the state of the external device. | 12-11-2008 |
20080303786 | DISPLAY DEVICE - An object of the present invention is to achieve an advanced input operation without complicating image processing. A display device of the present invention includes a display unit, an optical input unit, and an image processor. The display unit displays an image on a display screen. The optical input unit captures an image of an object approaching the display screen. The image processor detects that the object comes into contact with the display screen on the basis of a captured image captured by the optical input unit, and then performs image processing to obtain the position coordinates of the object. In the display device, the image processor divides the captured image into a plurality of regions, and performs the image processing on each of the divided regions. | 12-11-2008 |
20080303787 | Touch Screen Apparatus And Methods - A touch screen computing apparatus, methods, and software product are provided. In one embodiment, the computing apparatus provides a plurality of regions on a touch screen that are mapped to functions. In some embodiments, the regions include a keyboard region, a game control region, a mouse region and a stylus region. In some embodiments, the regions are configurable by the processor and mapped with different functionality. This Abstract is provided for the sole purpose of complying with the Abstract requirement rules that allow a reader to quickly ascertain the subject matter of the disclosure contained herein. This Abstract is submitted with the explicit understanding that it will not be used to interpret or to limit the scope or the meaning of the claims. | 12-11-2008 |
20080303788 | Input system and input apparatus - An input system executes an input to an information processing apparatus depending on the hand motion. Plural myoelectric sensors are provided on an area between a wrist of the person and bases of a second finger to a fifth finger, and detect myoelectric signals depending on the hand motion. A setting portion outputs at least one command to make the person execute at least one particular motion in a state where the myoelectric sensors are worn on the hand, and associates the detected myoelectric signals after the output of the command with the particular motion corresponding to the command. An input portion identifies the hand motion from myoelectric signals detected based on the hand motion after the termination of the association and the association result of the setting portion, and executes the input to the information processing apparatus depending on the identified hand motion. | 12-11-2008 |
20080303789 | Method and Apparatus for Compensating for Position Slip in Interface Devices - Method and apparatus for compensating for position slip in interface devices that may occur between a manipulandum and a sensor of the device due to a mechanical transmission. A device position delta is determined from a sensed position of a manipulandum of an interface device. It is determined if position slip has occurred caused by a change in position of the manipulandum that was not sensed by a sensor of the interface device, typically caused by a mechanical transmission between sensor and manipulandum. If position slip has occurred, an error in the sensed position caused by the position slip is corrected by adjusting the sensed position to take into account the position slip. The adjusted position delta is used as the position of the manipulandum and the display of objects controlled by the interface device are accordingly compensated. | 12-11-2008 |
20080309614 | USER INTERFACE WITH SOFTWARE LENSING FOR VERY LONG LISTS OF CONTENT - A user interface with software lensing may be described. An apparatus may include a user interface module to display an index list, a software lens list, and an aperture box. The index list may represent a list of available options. The software lens list may display a sub-set of the list of available options that coincides with a position of the aperture box on the index list. The apparatus may also include a media lensing module to increase a size of an option in the software lens list when a pointer approaches or coincides with the option. Other embodiments are described and claimed. | 12-18-2008 |
20080309615 | Storage medium storing information processing program and information processing device - An object has a plurality of joints | 12-18-2008 |
20080309616 | Alertness testing method and apparatus - A method and apparatus for detecting the alertness of an equipment operator by displaying a moving icon, and asking the operator to track the movements of the icon, either by following it with the eyes in a head mounted display, or by following it with a finger on a touch screen. The operator's performance can be measured by tracking the gaze of the operator's eyes, or by tracking the operator's finger movements. The performance of the operator can be compared to that particular person's history of test results, or to a data base of test results of other operators. The characteristics of the icon can be varied, and distractions can be provided on the display or screen. Control of the display or screen, tracking of the operator's eyes or finger, and analysis of the test results, can all be performed by a computer. | 12-18-2008 |
20080316169 | METHOD CIRCUIT AND SYSTEM FOR INTERFACING WITH AN ELECTRONIC DEVICE - According to some embodiments of the present invention, there is provided an interface apparatus for a multi-application electronic device, including a human interface surface having integrated presentation and sensing elements, such that the device has substantially full functionality for substantially all applications without the use of other human interfaces. | 12-25-2008 |
20080316170 | Computer Mouse - Provided is a computer mouse including a grip portion having a flat bottom surface and a longitudinal central grip axis; a grip central point located at the center of the longitudinal central grip axis; the grip portion standing on a surface on which the computer mouse moves; a vertical axis perpendicular to the surface and including the central point; a sensor portion which includes a sensor having a sensor central point located at the center of the sensor; the sensor being located distant in a forward direction from the grip portion; a vertical plane containing the vertical axis and the sensor central point; wherein the longitudinal central axis is angled from the vertical axis and tilted to the left or right with reference to the forward direction; a bottom grip point defined by the intersection between the vertical axis and the bottom surface; and wherein the bottom grip point is located substantially at the center of the bottom surface. | 12-25-2008 |
20090002314 | Tactile sense presentation device and tactile sense presentation method - A tactile sense presentation device is disclosed that drives a tactile sense unit to present a tactile sense to an operator. The device includes a location detection unit that detects the location of the tactile sense unit and a drive unit that drives the tactile sense unit with a thrust in accordance with the location detected by the location detection unit. The device controls the direction and the size of the thrust applied to the tactile sense unit in accordance with the location of the tactile sense unit detected by the location detection unit. | 01-01-2009 |
20090002315 | System And Method For Manipulation Of Sound Data Using Haptic Feedback - In an embodiment, a device which comprises means for generating an audio signal based on sound data, the audio signal configured to produce sound from an audio producing device; means for generating a haptic command based on the sound data, the haptic command configured to cause a haptic feedback device to output a haptic sensation, the haptic sensation being associated with at least one characteristic of the sound data; and means for receiving a navigation command from a user experiencing the haptic sensation via the haptic feedback device, the navigation command associated with the sound data and based, at least in part, on the haptic sensation. | 01-01-2009 |
20090002316 | MOBILE COMMUNICATION DEVICE WITH GAME APPLICATION FOR USE IN CONJUNCTION WITH A REMOTE MOBILE COMMUNICATION DEVICE AND METHODS FOR USE THEREWITH - A mobile communication device includes a processing module that executes a gaming application based on gaming data and that generates display data in response thereto, wherein the gaming data includes first data and second data. A sensor generates the first data in response to the actions of a user. At least one transceiver receives the second data from a remote communication device and that sends the display data to a display device in a gaming mode of operation. | 01-01-2009 |
20090009466 | Force-sensing orthotic electric device controller - The invention is directed to devices and methods for assisting individuals with poor extremity function (i.e., upper extremity function) to control devices. In one embodiment, a powered device is provided that comprises a controller and an orthosis communicably connected to the controller. An orthosis includes a harness worn by a user on a body part and a force sensing transducer positioned between the harness and the body part, wherein force applied to the transducer by the body part is communicated to the controller for controlling movement of the powered device. A method for controlling a device having a controller includes attaching a harness to a user's body part and applying a force by the body part onto a force sensing transducer positioned between the harness and the body part, wherein the force is communicated to the controller for controlling the device. | 01-08-2009 |
20090015547 | Electronic Device with Physical Alert - An electronic device ( | 01-15-2009 |
20090015548 | Image projection system - Provided is an image projection system including a screen, an input terminal, an image processing unit, an image projector, and invisible light ray-shielding member, characterized in that: the screen has a pattern-printed sheet having reflection patterns for transmitting positional information by reflecting invisible light rays or absorption patterns for transmitting positional information by absorbing invisible light rays; the input terminal has an invisible light ray-applying portion, detects a reflected light ray of an invisible light ray, which is applied from the invisible light ray-applying portion and reflected from a specific site of the pattern-printed sheet, reads positional information of any one of the reflection patterns or any one of the absorption patterns, and outputs the positional information to the image processing unit; the image processing unit converts the positional information input from the input terminal into image information A, and transfers the image information A to the image projector; the image projector converts the image information A transferred from the image processing unit into visible light rays, and projects the visible light rays on the screen; and the invisible light ray-shielding means is placed in front of or inside the image projector, and removes the invisible light ray from the visible light rays to be projected. The present invention can provide the image projection system in which, even when a screen is large, the positional information of the screen can be simply input in a non-contact fashion with high accuracy, and image information converted from the input positional information can be further converted into visible light rays to be projected. | 01-15-2009 |
20090015549 | Accepting User Input - The invention is directed to an improvement in mechanisms and techniques for accepting user input. An apparatus for accepting a user input is described comprising a display, and a plate and a control knob positioned over the display. A portion of the display is visible through the plate, and a portion of the control knob is optically transparent such that information displayed by the display is visible through the control knob. In some examples, the control knob functions as both a rotary input and as a push button input. The control knob may function as a push button input through a transfer of force through the plate to a pressure sensing switch associated with the plate. The control knob may function as a rotary control through drive gears, belts, or interaction with a light emitter and detector. | 01-15-2009 |
20090021473 | Haptic Communication Devices - Embodiments of the invention relate to methods and systems for providing customized “haptic messaging” to users of handheld communication devices in a variety of applications. In one embodiment, a method of using haptic effects to relate location information includes: receiving an input signal associated with a position of a handheld communication device ( | 01-22-2009 |
20090021474 | SYSTEM AND METHOD FOR DISPLAYING STATUS INFORMATION OF A MULTIMEDIA BROADCAST RECEIVER ON AN AMBIENT DEVICE - There is provided a system for providing status information associated with viewing behavior of media broadcasting. The system comprises a client device that includes a receiver to receive presence data from a remote device and a processor to generate an ambient command based on the presence data. The presence data is associated with broadcast programs of the remote device, and the ambient command represents viewing information of the broadcast program at the remote device. The system also comprises an ambient component that provides an ambient representation of the viewing information based on the ambient command. The ambient component may be an integral part of the client device or a separate component that communicates with the client device via wired or wireless connection. | 01-22-2009 |
20090021475 | METHOD FOR DISPLAYING AND/OR PROCESSING IMAGE DATA OF MEDICAL ORIGIN USING GESTURE RECOGNITION - A method for processing and/or displaying medical image data sets in or on a display device having a screen with a surface, including: detecting gestures performed on or in front of the screen surface; correlating the gestures to predetermined instructional inputs; and manipulating, generating, or retrieving, via computer support, the medical image data sets in response to the instructional inputs. | 01-22-2009 |
20090021476 | INTEGRATED MEDICAL DISPLAY SYSTEM - The invention relates to a medical display system for performing a medical function, including: an image display unit configured for displaying a medical image data set, and an additional device, wherein the additional device is integrated with said image display unit and is configured to assist in performing the medical function of the medical display system. | 01-22-2009 |
20090021477 | METHOD FOR MANAGING INFORMATION - The invention relates to a method of transferring data from a drawing device, which, while utilizing a position-coding pattern printed on a physical page, digitally records handwritten information, to an application in a computer system. The drawing device transfers recorded data to a memory in the computer system. A registering unit in the system determines the page from which the recorded data originates and activates, on the basis thereof, one or more applications which are registered as “subscribers” to data from this page. When an application is activated and thus informed of the existence of new data relevant to the application, the application fetches this data. The fetching of data can be made on the basis of the contents of a page description which defines the layout of the physical page. | 01-22-2009 |
20090027330 | DEVICE FOR USING VIRTUAL MOUSE AND GAMING MACHINE - A virtual mouse device is preferably installed in a gaming machine and serves as an input device. An image sensor unit is laminated on a specific area on a screen of a display unit, and detects fingers or a palm of a player that move on or over the specific area. The virtual mouse controller unit monitors the fingers or palm, and causes a virtual mouse to follow the fingers or palm within the specific area. If the fingers or palm moves out of the specific area, the virtual mouse controller unit then returns the virtual mouse to a default location. An input unit monitors the motion of the virtual mouse, and causes the display unit to move a mouse pointer on the screen depending on the amount and direction of travel of the virtual mouse. | 01-29-2009 |
20090027331 | METHOD AND APPARATUS FOR DISPLAYING STATE OF APPARATUS - A method of displaying a state of an apparatus having a user interface which includes generating state display information indicating the state of the apparatus and displaying the generated state display information in the user interface. The state display information indicates the state of the apparatus through a metaphorical indicator. | 01-29-2009 |
20090027332 | MOTOR VEHICLE COCKPIT - A motor vehicle cockpit display system includes display units for displaying information arranged at different positions in a passenger compartment of the motor vehicle. A control arrangement controls the content of the information which is respectively displayed on the display units. A recording device senses in a contactless fashion an assignment of a user's limb to a first display unit and a gesture-dependent change in the assignment to a further display unit. Information relating to the change in the assignment is fed to the control arrangement, and the further display unit can be actuated to display the information of the first display unit by the control arrangement in accordance with the change in the assignment. | 01-29-2009 |
20090033617 | Haptic User Interface - A method is presented comprising: generating at least one haptic user interface component using an array of haptic elements; detecting user input applied to at least one haptic element associated with one of said at least one haptic user interface component; and executing software code associated with activation of said one of said at least one user interface component. A corresponding apparatus, computer program product and user interface are also presented. | 02-05-2009 |
20090033618 | Unit, an Assembly and a Method for Controlling in a Dynamic Egocentric Interactive Space - A portable unit for providing instructions for navigation in menus or controlling equipment, the unit having a user interface and a camera pointing in the general direction of the user. The unit tracks relative movements between the unit and the user and converts the relative movement into the instructions. The unit may be used as a remote control for audio or video equipment or computers or the like. | 02-05-2009 |
20090033619 | METHOD AND APPARATUS FOR CONTROLLING UNIVERSAL PLUG AND PLAY DEVICE TO REPRODUCE CONTENT IN A PLURALITY OF REPRODUCTION REGIONS ON SCREEN THEREOF - Provided is a method of reproducing content using a universal plug and play (UPnP) device which has a plurality of reproduction regions. According to the method, information regarding reproduction regions is obtained using newly defined actions before a control point calls an action “Play( ).” Then, a reproduction region for a piece of content is designated according to a user's input. Accordingly, the user can simultaneously enjoy a plurality of pieces of content in desired reproduction regions on a screen of a UPnP device. | 02-05-2009 |
20090040175 | INPUT INTERFACE DEVICE WITH TRANSFORMABLE FORM FACTOR - Various implementations of an interface device, along with associated methods and systems, are described in which the interface device has a housing with a transformable form factor, and a transformation assembly that can change the form factor of the housing. At least one of the form factors of the housing has a shape that corresponds to data associated with the interface device. | 02-12-2009 |
20090046054 | Resistive Actuator With Dynamic Variations Of Frictional Forces - A system for generating haptic effects on a rotary knob includes an electrical coil and a core. A first level of voltage is applied to the coil to enable a first surface interface having a first coefficient of friction and to generate a first haptic effect by varying the voltage. A second level of voltage is applied to the coil to enable a second surface interface having a second coefficient of friction that is greater than the first coefficient of friction and to generate a second haptic effect by varying the voltage. | 02-19-2009 |
20090046055 | Interactive digital image display - An interactive digital image display has a screen, an image processor, a coordinate detecting unit coupled to the image processor, an image driving unit coupled to the image processor, a memory coupled to the image processor, and an interface for acquiring images. When a user moves the display, the coordinate detecting unit generates a signal to the image processor, and the image displayed on the display is adjusted according to the displacement of the display. | 02-19-2009 |
20090046056 | Human motion tracking device - A “human motion tracking” device (HMT) that translates natural body movements into computer-usable data. The data is transmitted to a simulation application as if it came from any conventional human interface device (HID). This allows an individual to interact with the application without the need for a conventional computer input device (e.g., a keyboard, mouse, etc.). The HMT captures a user's heading as well as the individual's current stance (i.e., standing, sitting, kneeling, etc.). The HMT accomplishes this by using two sensors, an accelerometer and a magnetometer, to produce digital input. The digital data is then passed from the HMT to the application. | 02-19-2009 |
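The motion tracking abstract above derives heading from a magnetometer and stance from an accelerometer. A minimal sketch of both computations, assuming level horizontal magnetometer components and a torso-mounted accelerometer; the function names, axis conventions and the stance thresholds are illustrative, not from the patent:

```python
import math

def heading_degrees(mag_x, mag_y):
    """Compass heading (0..360 degrees) from horizontal magnetometer
    components, assuming the sensor is held level."""
    return math.degrees(math.atan2(mag_y, mag_x)) % 360

def stance_from_pitch(ax, az):
    """Classify stance from the torso pitch angle implied by the
    accelerometer's gravity components (thresholds are illustrative)."""
    pitch = math.degrees(math.atan2(ax, az))
    if abs(pitch) < 20:
        return "standing"
    elif abs(pitch) < 60:
        return "sitting"
    return "kneeling"
```

A real device would also tilt-compensate the magnetometer reading using the accelerometer before computing the heading.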
20090046057 | IMAGE FORMING APPARATUS, DISPLAY PROCESSING APPARATUS, DISPLAY PROCESSING METHOD, AND COMPUTER PROGRAM PRODUCT - An MFP includes a selection receiving unit that receives selection by a user of a desired one of higher setting items displayed on an operation panel, and a display processing unit that displays, when the selection of the higher setting item is received, intermediate setting items corresponding to the selected higher setting item and overview information indicating overviews of lower setting items corresponding to the intermediate setting items, being associated with each other, on the operation panel. | 02-19-2009 |
20090046058 | SELF-CONTAINED, POCKET-SIZED PRESENTATION APPARATUS - A self-contained, pocket-sized presentation apparatus includes a USB drive having a housing, a memory, a processor, and a protective cover. A user input device, wireless transmitter, and power source are integrally disposed within the cover, the transmitter being operatively engaged with the input device and configured to selectively transmit wireless signals in response to selective user actuation of the input device. A wireless receiver is disposed within the housing of the USB drive, to receive and couple wireless signals from the transmitter to the USB drive. The USB drive is configured to receive the wireless signals from the wireless receiver, to selectively generate Page Up and Page Down instructions responsive thereto, and to send the Page Up and Page Down instructions via the USB plug. The memory is configured to contain computer readable program code therein, in the form of a presentation, and in the form of a portable presentation application. | 02-19-2009 |
20090046059 | FINGER POINTING APPARATUS - A finger pointing apparatus is disclosed. In one aspect the finger pointing apparatus includes at least one pressure sensor fixed on a hand for triggering a corresponding electromagnetic wave transmitter to transmit electromagnetic wave when pressure is produced by a finger contacting an external object. In another aspect, the finger pointing apparatus includes at least one electromagnetic wave transmitter connected with a corresponding pressure sensor and fixed on the hand for transmitting electromagnetic wave to all electromagnetic wave receivers when pressure is detected by the pressure sensor. In one aspect, the finger pointing apparatus includes at least two electromagnetic wave receivers arranged at fixed positions with respect to each other for receiving electromagnetic wave from the at least one electromagnetic wave transmitter and transmitting received electromagnetic wave to a microprocessor. In another aspect, the finger pointing apparatus includes a microprocessor for receiving electromagnetic wave from the electromagnetic wave receivers, calculating coordinate values of a position pressed by the finger from electromagnetic wave from different electromagnetic wave receivers and outputting the coordinate values. | 02-19-2009 |
20090051647 | PORTABLE ELECTRONIC DEVICE WITH MOTION SENSING MODULE - A portable electronic device includes a display module for displaying a menu, a motion sensing module, and a controlling module. The menu has a plurality of menu options. The motion sensing module detects a motion of the portable electronic device imparted by a user and generates a trigger signal associated with the motion of the portable electronic device. The controlling module controls choosing and/or executing a menu option in response to the trigger signal. | 02-26-2009 |
20090051648 | GESTURE-BASED MOBILE INTERACTION - In gesture-based mobile interaction, motion of a device is sensed using image data, and a gesture corresponding to the sensed motion of the device is recognized. Functionality of the device corresponding to the recognized gesture is determined and invoked. | 02-26-2009 |
20090051649 | Wearable display interface client - A wearable display interface that during operation is in short-range, wireless bidirectional communication with a server, related methods, and systems including the interface and server. The wearable interface includes a power supply driving a processor that is operatively coupled to a visual display and a wireless interface, where the wearable display interface further includes an attachment interface for linking the display interface to a user's body, such as wrist or neck, or apparel. The server is preferably a mobile telephone where the display interface is a client. In addition, the interface can further function as a speaker and/or microphone to receive and/or accept verbalized data, such as input instructions or communication commands. Intentional distribution of processing across the server and client ensures that the interface maintains a small and efficient form factor. | 02-26-2009 |
20090058799 | INTERACTIVE POINTING DEVICE - An interactive pointing device having a pointing function in space and game control is provided in the present invention. The interactive pointing device comprises an accelerometer module and a gyroscope device. The accelerometer module senses the movement of the operator and generates at least one axis of accelerating signal corresponding to the sensed movement. The gyroscope device, disposed on a turning mechanism, senses the rotation status of the interactive pointing device about at least one axis and generates a corresponding rotating signal. The turning mechanism can be operated to adjust the axis of the gyroscope device so that the gyroscope device is capable of sensing rotation status about different axes. The at least one accelerating signal and the rotating signal are then processed for controlling cursor movement of the electrical device and interacting with multimedia gaming programs. | 03-05-2009 |
20090058800 | INFORMATION PROCESSING DEVICE, PROGRAM, AND METHOD - According to one embodiment, an information processing device includes a position detecting section configured to detect a position of a hand from an input image of the hand, a memory section configured to store data of the position of the hand detected by the position detecting section, a rotation judging section configured to judge, assuming that records of the data of the position of the hand stored in the memory section show a rotary movement, that a latest position of the hand falls in an angle range predicted for the rotary movement, and an executing section configured to, when the rotation judging section judges that the latest position of the hand falls in the angle range, obtain a rotational angle at the latest position of the hand and also execute a process that corresponds to a predetermined rotary movement of the hand. | 03-05-2009 |
20090066637 | HANDHELD ELECTRONIC DEVICE WITH MOTION-CONTROLLED DISPLAY - A handheld electronic device includes a display, a memory configured to store a map, and a motion sensor configured to monitor the movement of the handheld electronic device. A controller is coupled to the display, the memory, and the motion sensor. The controller is configured to generate an image on the display representative of a portion of the map, the image having a field of view (FOV). The controller is also configured to adjust the FOV of the image based upon the movement of the handheld electronic device as detected by the motion sensor. | 03-12-2009 |
20090066638 | Association of virtual controls with physical controls - A media application for providing outputs (e.g., audio outputs) in response to inputs received from an input device is provided. The media application may connect input mechanisms of an input device with parameters of channel strips (e.g., which may define output sounds) using an intermediate screen object. The media application may first assign an input mechanism to a screen object, and separately map a screen object to a channel strip parameter. The media application may map a screen object to several channel strips simultaneously such that, based on the value of the screen object, the volume of each of the several channel strips changes. The media application may provide a graphical representation of available channel strips using layers. As the media application accesses a channel strip, the appearance of the portion of the layer associated with the channel strip may change. The media application may also allow the patches, which may include several channel strips, to survive after a new patch is selected instead. | 03-12-2009 |
20090066639 | VISUAL RESPONSES TO A PHYSICAL INPUT IN A MEDIA APPLICATION - A media application for providing outputs (e.g., audio outputs) in response to inputs received from an input device is provided. The media application may connect input mechanisms of an input device with parameters of channel strips (e.g., which may define output sounds) using an intermediate screen object. The media application may first assign an input mechanism to a screen object, and separately map a screen object to a channel strip parameter. The media application may map a screen object to several channel strips simultaneously such that, based on the value of the screen object, the volume of each of the several channel strips changes. The media application may provide a graphical representation of available channel strips using layers. As the media application accesses a channel strip, the appearance of the portion of the layer associated with the channel strip may change. The media application may also allow the patches, which may include several channel strips, to survive after a new patch is selected instead. | 03-12-2009 |
20090066640 | DEVICE AND METHOD FOR PROVIDING A USER INTERFACE - In an information processing apparatus, an operation receiving section receives an operation performed on a display. In response to the operation to call up a setting screen, a menu information generating section generates menu information in which a setting button requesting a setting conflicting with the current state is displayed in a highlighted mode. In response to the operation of pressing the highlighted setting button, an influence information generating section generates influence information on the current state conflicting with the setting to be made by pressing the setting button. In addition, a display control section controls the display of a setting screen, including the menu information, and also controls the display of a setting screen, including the influence information when the influence information is generated. | 03-12-2009 |
20090066641 | Methods and Systems for Interpretation and Processing of Data Streams - Methods and systems for interpreting and processing data streams from a plurality of sensors on a motion-capture device are described. In various embodiments, an engine module of the system receives a raw input data stream comprising motion and non-motion data. Metadata is associated with data segments within the input data stream to produce a stream of data profiles. In various embodiments, an interpreter converts received data profiles into non-contextual tokens and/or commands recognizable by an application adapted for external control. In various embodiments, a parser converts received non-contextual tokens into contextual tokens and/or commands recognizable by an application adapted for external control. In various embodiments, the system produces commands based upon the non-contextual and/or contextual tokens and provides the commands to the application. The application can be a video game, software operating on a computer, or a remote-controlled apparatus. In various aspects, the methods and systems transform motions and operation of a motion-capture device into useful commands which control an application adapted for external control. | 03-12-2009 |
20090073112 | METHOD AND SYSTEM FOR DYNAMICALLY CONFIGURABLE TACTILE FEEDBACK FOR NAVIGATIONAL SUPPORT - A method for providing dynamically configurable tactile indicator signals on a control surface for navigational support, includes: detecting at least one of an operator's hands positioned on a control surface; routing control signals to a series of tactile sensors in proximity to the detected positions of at least one of the operator's hands; wherein the control signals actuate tactile feedback devices; wherein the control signals are based on navigational information; providing tactile feedback to the operator via the actuated tactile feedback devices; wherein the tactile feedback is dynamically configured in response to the number of operator hands detected on the control surface; and wherein the tactile feedback is dynamically configured in response to the position of at least one of the operators hands on the control surface. | 03-19-2009 |
20090073113 | Presenter model - A presenter model includes a presenter having a wireless signal transmitting unit and an accommodating box having a wireless signal receiving unit. The accommodating box having the wireless signal receiving unit receives the wireless signals transmitted by the wireless signal transmitting unit of the presenter. Via a USB port connected with an electronic apparatus, the command received is transmitted to the electronic apparatus, thereby achieving the function of transmitting the signals of the presenter. The accommodating box can be provided thereon with at least one insertion slot for a USB interface and at least one insertion slot for an IEEE1394 interface, thereby serving as digital expansion slots. The accommodating box allows the presenter to be disposed therein and combined therewith, thereby increasing the convenience in carrying and storing the presenter. | 03-19-2009 |
20090073114 | Control of a scrollable context menu - Disclosed are a method, a system and a navigation device for generating and controlling an interaction object, which is preferably in the form of a context menu, on a display unit. In at least one embodiment, the method includes presentation of the interaction object by way of at least one presentation signal from the navigation device and selection of at least one functional element from the presented interaction object by way of at least one selection signal from the navigation device, wherein the selection can be made independently of a movement by the navigation device and wherein the at least one functional element to be selected and/or the selected at least one functional element is presented at a constant location on the display unit by moving within the interaction object or by moving the interaction object. | 03-19-2009 |
20090079690 | METHOD AND APPARATUS FOR ENHANCING ENTERTAINMENT SOFTWARE THROUGH HAPTIC INSERTION - A method for enhancing entertainment through haptic insertion includes monitoring signal(s) during the execution of entertainment software, recognizing that the monitored signal(s) satisfy predetermined criteria, and generating a haptic control signal in response to enhance an entertainment experience. Monitored signals may include, for example, audio signals, video signals, data signals, control signals, and the like. Entertainment software may include, for example, a video game, an audio-visual work, an audio work, and the like. A device for enhancing entertainment software through haptic insertion includes at least one processor and an output unit coupled to the processor(s) and including a haptic control output. The processor(s) are configured to monitor at least one signal during the execution of entertainment software, to recognize that the monitored signal(s) satisfy a predetermined criterion, to generate a haptic control signal in response to such recognition, and to output the generated haptic control signal through the haptic control output of the output unit. | 03-26-2009 |
20090085863 | MOTION BASED DISPLAY MANAGEMENT - A display manager is configured to handle the drawing of windows on one or more displays for an application differently based on detected motion information that is associated with a device. The display manager may not display windows for some applications while motion is detected, while the display manager may display windows for other applications even when motion is detected. Motion enabled applications may interact with the display manager and motion information to determine how to display windows while motion is detected. | 04-02-2009 |
20090085864 | METHOD AND SYSTEM FOR GESTURE CLASSIFICATION - The present invention is a method of identifying a user's gestures for use in an interactive game application. Videocamera images of the user are obtained, and feature point locations of a user's body are identified in the images. A similarity measure is used to compare the feature point locations in the images with a library of gestures. The gesture in the library corresponding to the largest calculated similarity measure which is greater than a threshold value of the gesture is identified as the user's gesture. The identified gesture may be integrated into the user's movements within a virtual gaming environment, and visual feedback is provided to the user. | 04-02-2009 |
20090085865 | DEVICE FOR UNDERWATER USE AND METHOD OF CONTROLLING SAME - A method of controlling a device for underwater use includes detecting a user-interaction with the device based on signals received from an accelerometer, and performing an operation as a result of the user-interaction. | 04-02-2009 |
20090085866 | Image display apparatus - An image display apparatus includes an input unit which has a display on which an image is displayed, a flexible sheet-type substrate, and a bending detection section which is arranged on a surface of the substrate to detect a bending deformation of the substrate, and a display control section which changes an image to be displayed on the display based on the bending deformation of the substrate detected by the bending detection section. Accordingly, it is possible to provide an image display apparatus which is capable of easily changing an image to be displayed on the display even when a user is not good at operating equipment. | 04-02-2009 |
20090091529 | Rendering Display Content On A Floor Surface Of A Surface Computer - Methods, apparatus, and products are disclosed for rendering display content on a floor surface of a surface computer, the floor surface comprised in the surface computer, the surface computer capable of receiving multi-touch input through the floor surface and rendering display output on the floor surface, that include: detecting, by the surface computer, contact between a user and the floor surface; identifying, by the surface computer, user characteristics for the user in dependence upon the detected contact; selecting, by the surface computer, display content in dependence upon the user characteristics; and rendering, by the surface computer, the selected display content on the floor surface. | 04-09-2009 |
20090091530 | SYSTEM FOR INPUT TO INFORMATION PROCESSING DEVICE - New input systems, that is, a paper icon, a paper controller, a paper keyboard, and a mouse pad capable of inputting letters, characters or the like to a computer and performing operations with easy manipulation and replacing hardware devices such as a keyboard, a mouse, and a tablet are provided. By providing an icon formed on a medium for reading a dot pattern formed on a surface of the medium using a scanner connected to an information processing device, for converting the dot pattern into each of or one of a code value and a coordinate value defined by the dot pattern, and for outputting a voice, an image, a moving image, a letter or character or a program corresponding to each of or one of the code value and the coordinate value stored in the information processing device or for outputting information on an access to a website corresponding to each of or one of the code value and the coordinate value stored in the information processing device, it is possible to realize information on the voice, image, moving image or letter or character prepared in advance, start of the program, access to the website or the like. | 04-09-2009 |
20090096746 | Method and Apparatus for Wearable Remote Interface Device - A method and apparatus of using a wearable remote interface device capable of detecting inputs from movements are disclosed. The wearable remote interface device, which could be attached to a finger or a hand or any part of a body, includes a sensor, a filter, an input identifier, and a transmitter. The sensor, in one embodiment, is capable of sensing the movement of the finger or any part of the body to which the wearable remote interface device is attached. Upon detecting the various movements associated with the finger, the filter subsequently removes any extraneous gestures from the detected movements. The input identifier, which could be a part of the filter, identifies one or more user inputs from the filtered movements. The transmitter transmits the input(s) to a processing device via a wireless communications network. | 04-16-2009 |
20090096747 | Method of rolling picture using input device - A method of rolling a picture by using an input device is disclosed. The input device has a housing and a component rotatable relative to the housing, and through rotating the component, an instruction signal for rolling the picture is produced. The method includes steps of setting picture rolling, wherein the input device is set to have at least a first mode or a second mode for rolling the picture, in which each mode is set to have a picture rolling displacement corresponding to the picture rolling driven by each single instruction signal, and different modes have different displacements; and deciding the mode, wherein a standard value is provided and compared with the number of instruction signals generated per unit time, and through the standard value, the instruction signal is decided to enter the first mode or the second mode. | 04-16-2009 |
20090102785 | Six-Direction Button - The six-direction button includes a base, a rotation detector having a rotation pin disposed on the base, a button cover coupled with the rotation pin, and four pressure detectors disposed on the base under the button cover. The button cover and the rotation pin may be rotated clockwise or counterclockwise. The pressure detectors are arranged according to an up side, a down side, a left side, and a right side of the button. The button cover may touch one of the four pressure detectors when the button cover is pressed. | 04-23-2009 |
20090102786 | METHOD FOR TESTING AND PAIRING WIRELESS PERIPHERAL DEVICE - The present invention relates to a method for testing and pairing a wireless peripheral device. Different communication channels and different identification codes are used for testing and pairing the wireless peripheral device. As a result, an erroneous pairing relation between the wireless input device and the wireless transceiver of the wireless peripheral device is avoided. | 04-23-2009 |
20090109173 | Multi-function computer pointing device - A first aspect of the present invention includes an N-persistent-mode pointing device and a 2N-mixed-mode pointing device. A second aspect of the present invention is an unconventional method for generating target-motion signals with common motion controls. This method utilizes a qualification-rule-based dynamic mapping of the present invention. A third aspect of the present invention combines the multi-mode designs and the dynamic mapping. | 04-30-2009 |
20090109174 | Method and Apparatus for User Interface in Electronic Devices With Visual Display Units - A system for a 3-D user interface comprises: one or more 3-D projectors configured to display an image of all or one or more parts of a first electronic device in a 3-D coordinate system; one or more sensors configured to sense user interaction with the image and to provide user interaction information; and a processor configured (i) to receive the user interaction information from the one or more sensors; (ii) to correlate the user interaction with the image; and (iii) to operate a second electronic device in a manner responsive to a correlation of the user interaction with the image, wherein the first electronic device has a visual display unit and the image comprises an image of all or one or more parts of the visual display unit of the first electronic device. A method for providing a 3-D user interface comprises: generating an image of all or one or more parts of a first electronic device in a 3-D coordinate system; sensing user interaction with the image; correlating the user interaction with the image; and operating a second electronic device in a manner responsive to a correlation of the user interaction with the image, wherein the first electronic device has a visual display unit and the image comprises an image of all or one or more parts of the visual display unit of the first electronic device. Computer readable program codes related to the system and the method of the present invention are also described herein. | 04-30-2009 |
20090109175 | METHOD AND APPARATUS FOR USER INTERFACE OF INPUT DEVICES - A system for a 3 dimensional (3-D) user interface comprises: one or more 3-D projectors configured to display an image at a first location in a 3-D coordinate system; one or more sensors configured to sense user interaction with the image and to provide user interaction information; and a processor configured (i) to receive the user interaction information from the one or more sensors; (ii) to correlate the user interaction with the image; and (iii) to provide one or more indications responsive to a correlation of the user interaction with the image, wherein the one or more indications comprise displaying the image at a second location in the 3-D coordinate system. A method for providing a 3-D user interface comprises: generating an image at a first location in a 3-D coordinate system; sensing user interaction with the image; correlating the user interaction with the image; and providing one or more indications responsive to a correlation of the user interaction with the image, wherein the one or more indications comprise displaying the image at a second location in the 3-D coordinate system. Computer readable program codes related to the system and the method of the present invention are also described herein. | 04-30-2009 |
20090115721 | Gesture Recognition Light and Video Image Projector - A system and method is provided for a gesture recognition interface system. The system comprises a projector configured to project colorless light and visible images onto a background surface. The projection of the colorless light can be interleaved with the projection of the visible images. The system also comprises at least one camera configured to receive a plurality of images based on a reflected light contrast difference between the background surface and a sensorless input object during projection of the colorless light. The system further comprises a controller configured to determine a given input gesture based on changes in relative locations of the sensorless input object in the plurality of images, and being further configured to initiate a device input associated with the given input gesture. | 05-07-2009 |
20090122006 | ENHANCED PROTOCOL AND ARCHITECTURE FOR LOW BANDWIDTH FORCE FEEDBACK GAME CONTROLLER - Haptic features are stored in a haptic device by preloading or otherwise downloading them, e.g., wirelessly, into the haptic device at the time of manufacture, immediately prior to game play, during game play, and/or at any other time. Haptic features may be activated, deactivated, modified or replaced at any time. All or a subset of the haptic features may be selected as an active play list, which may be modified as necessary. A host may manage some or all device memory and the haptic features stored therein. Haptic features stored in haptic devices and control information provided by the host are used by the haptic device to execute haptic effects. The haptic device may sustain haptic effects between control messages from the host. New communication messages may be added to an underlying communication protocol to support haptic effects. New messages may use header portions of communication packets as payload portions. | 05-14-2009 |
20090122007 | INPUT DEVICE, CONTROL METHOD OF INPUT DEVICE, AND PROGRAM - Disclosed herein is an input device including a target creating section, a performing section, and a height information generating section, wherein the target creating section generates the information on the target to which information the height information is added, and the performing section performs predetermined processing on a basis of the height information added to the information on the target. | 05-14-2009 |
20090128480 | Device for connecting an electronic unit to screens of different types without distinction, and a corresponding screen - The invention relates to a device for connecting an electronic unit ( | 05-21-2009 |
20090128481 | INTEGRATED SYSTEM WITH COMPUTING AND IMAGING CAPABILITIES - An integrated system comprising both imaging and computing capabilities comprises a light valve and a CPU, as well as other functional members for performing computing and imaging. | 05-21-2009 |
20090128482 | Approach for offset motion-based control of a computer - A system for controlling a computing device. The system includes a plurality of sensed locations corresponding to a sensed object, a sensing apparatus to sense a position of the sensed locations relative to the sensing apparatus, and a motion control engine executable on a computing device. In response to the motion control engine receiving position data indicative of the position of the sensed locations from the sensing apparatus, the motion control engine generates an adjusted position based on the position data, wherein the adjusted position is offset from the position of the sensed locations, and wherein the adjusted position is fixed relative to the position of the sensed locations. | 05-21-2009 |
20090128483 | ADVANCED NAVIGATION TECHNIQUES FOR PORTABLE DEVICES - The present invention provides a unique system and method that facilitates navigating smoothly and gracefully through any type of content viewable on portable devices such as cell-phones, PDAs, and/or any other hybrids thereof. In addition, such navigation can be performed while preserving perspective and context with respect to a larger amount of content. Pointing devices can also be used to navigate through content—the amount or detail of the content being dependent on the speed of the pointing device. Additionally, a semi-transparent overview of content can be overlaid on a zoomed-in portion of content to provide perspective to the zoomed-in portion. Content shown in the semi-transparent overview can depend on the location of the pointing device with respect to the content. | 05-21-2009 |
20090128484 | INFORMATION PROCESSING APPARATUS, SCROLL CONTROL APPARATUS, SCROLL CONTROL METHOD, AND COMPUTER PROGRAM PRODUCT - According to one embodiment, an information processing apparatus including a display unit and an inclination detection unit detecting an inclination of its main body includes a direction instruction unit, a reference inclination storage unit, an inclination difference output unit, and a scroll unit. The direction instruction unit instructs a scroll direction in which a display range of the display unit is to be moved. The reference inclination storage unit stores, as a reference inclination, an inclination in the scroll direction among inclinations detected by the inclination detection unit. The inclination difference output unit outputs a difference between the reference inclination and an inclination in the scroll direction. The scroll unit moves the display range on the display unit according to the difference outputted from the inclination difference output unit. | 05-21-2009 |
20090135132 | Electronic display system to show large size visual content on a small size display device - A portable display system for an electronic device is disclosed, which consists of a portable display and a pair of eyeglasses. The size of the display is small for easy carrying; for example, the display can fit into a hand. The resolution of the display is changeable; for example, it can be lower than one hundred by one hundred for normal cell phone usage, or as high as thousands by thousands to show HDTV, movies, and web browsing. When the display is in its low-resolution mode, naked eyes can see the display content without problem. When the display is in its high-resolution mode, the pair of glasses, which enlarges the display content a few times, is needed. There may be, depending on the application, multiple resolution modes between the lowest and the highest. | 05-28-2009 |
20090135133 | 3D Motion Control System and Method - There is disclosed a control system which may include a seat having a pelvis angle sensor to provide positional information indicative of an operator's pelvis angle in a frontal plane. The positional information provided by the pelvis angle sensor may be used to control, at least in part, a device. | 05-28-2009 |
20090135134 | EDUCATION METHOD AND SYSTEM INCLUDING AT LEAST ONE USER INTERFACE - An education method and system including a user interface. The user interface may illustrate a portion of the human anatomy that may be viewed in three dimensions. The user interface and method may also provide for selecting training modules, inputting student names, access codes, and tracking pre- and/or post-training student knowledge via quizzes and tests, receive curriculum concurrence from school or government administrators, receive permission from parents or guardians to the actual training and/or provide general visiting users with a holistic overview of the user interface and method of providing the education. | 05-28-2009 |
20090135135 | RECORDING AND REPRODUCING APPARATUS - A recording and reproducing apparatus includes: a recording means for storing a plurality of images in groups; a display means for displaying images stored in the recording means; a detecting means for detecting a part of a human body or an object in a predetermined form; and a display switching means for switching images to be displayed on the display means in accordance with a form of a part of a human body or a form of an object detected by the detecting means. | 05-28-2009 |
20090140977 | Common User Interface Structure - A common user interface structure is described. In embodiment(s), a common user interface structure includes proportional geometry variables that can be adjusted such that the common user interface structure is scaled for display on media devices that each have different sized display screens. The common user interface structure includes a dimension control variable from which the proportional geometry variables are derived to scale the common user interface structure for display. The common user interface structure can also include menu item regions that include selectable content links to initiate rendering media content, and the menu item regions are scaled for display in the common user interface structure when the proportional geometry variables are adjusted. | 06-04-2009 |
20090146947 | UNIVERSAL WEARABLE INPUT AND AUTHENTICATION DEVICE - The object of the wearable input device is to provide the user with one data input device and authentication system that is portable and can be worn like a fashion accessory, such as a watch or bracelet, so as to be unobtrusive to daily activity. The wearable input device can be used to replace home and car lock and security systems, television/VCR/DVD remote controls, personal computer authentication system, credit card authentication systems, automatic teller machine authentication systems, among others. | 06-11-2009 |
20090146948 | Apparatus for Providing Sensing Information - The invention relates to an apparatus for providing sensing information which includes a surface-sensing information combiner for collecting tactile sensing information on a surface of an object and accompanying sensing information attendant on the tactile sensing information of the surface of the object to produce surface-sensing information or edit the produced surface-sensing information. The apparatus also includes a surface-sensing information board for providing an environment for the surface-sensing information of the object to allow a user to perceive the surface-sensing information of the object. The apparatus further includes a surface-sensing information reproducer for reproducing the tactile sensing information and the accompanying sensing information of the object to be sensed by the user. | 06-11-2009 |
20090153466 | Method and System for Optimizing Scrolling and Selection Activity - Described are a method, a device, and a system for activating and deactivating a scrolling operation. The method includes receiving an input signal from an input interface on a mobile unit (“MU”), activating a scrolling operation of a display of the MU, sensing at least one of a motion and an orientation of the MU, and scrolling the display of the MU based on the one of the sensed motion and the sensed orientation of the MU. The device includes a display, an input interface for receiving an input signal, at least one sensor for sensing at least one of an orientation and a motion of the mobile computing device, and a processor receiving the input signal from the input interface, activating a scrolling operation of the display and scrolling the display of the device based on the one of the sensed motion and the sensed orientation of the device. | 06-18-2009 |
20090153467 | System and method for connection detection - A system and method for detecting a display connection. The system includes a transmitter operable to be coupled to a receiver. The transmitter includes a controller operable to generate a connect signal when the receiver is coupled to the controller and is further operable to generate a disconnect signal when the receiver is uncoupled from the controller. The transmitter further includes a detection component operable to detect the connect signal and is further operable to detect the disconnect signal. | 06-18-2009 |
20090153468 | Virtual Interface System - The invention relates to a virtual interface system, to a method of providing a virtual interface, and to a data storage medium having stored thereon computer code means for instructing a computer system to execute a method of providing a virtual interface. A virtual interface system comprises a camera; a processor coupled to the camera for receiving and processing video data representing a video feed captured by the camera; a display coupled to the processor and the camera for displaying first and second interface elements superimposed with the video feed from the camera in response to display data from the processor, the second interface element being displayed at a fixed location on the display; wherein the processor tracks a motion action of a user based on the video data received from the camera, controls a display location of the first interface element on the display based on the tracked motion action; and determines a user input based on a relative position of the first and second interface elements on the display. | 06-18-2009 |
20090153469 | Input Device and Handheld Electronic Device - An input device and a handheld electronic device comprising the input device are provided. The input device comprises an elastic sheet and a switch sheet, both with alignment through-holes for precise alignment. Thereby, a protrusion of the elastic sheet is disposed precisely above a switch of the switch sheet. Because of the inerrable assembly of the input device, the handheld electronic device will have an acute response to the depression made by the user. | 06-18-2009 |
20090153470 | ELECTRONIC DEVICE WITH MODULE INTEGRATING DISPLAY UNIT AND INPUT UNIT - An electronic device includes a housing, a main display unit, an auxiliary display module and a processor. The main display unit and the auxiliary display module are both disposed on the housing. The auxiliary display module includes an auxiliary display unit and an input unit. The auxiliary display unit is used for displaying a user interface, and the input unit is used for receiving a user input. The processor is coupled to the input unit, and performs a corresponding function according to the user input. | 06-18-2009 |
20090153471 | Apparatus and method for inputting characters - The present invention provides an apparatus and method for inputting characters, which allow various characters to be inputted to a multimedia device. The character input apparatus preferably includes a user manipulation unit for generating one or more stroke signals and/or one or more arc signals, and a character detection unit for detecting one or more characters using the stroke signals and/or the arc signals. The stroke signals and/or the arc signals are signals corresponding to strokes and/or arcs constituting respective characters, and the characters are data about the characters including the strokes and/or the arcs corresponding to the stroke signals and/or the arc signals. | 06-18-2009 |
20090153472 | CONTROLLING A VIEWING PARAMETER - The invention relates to a method ( | 06-18-2009 |
20090160760 | MULTI-FUNCTIONAL PRESENTER - A multi-functional presenter includes a case, a control unit in the case, an operating unit on the case and connected to the control unit, a pointer unit connected to the control unit, a bidirectional wireless communication unit connected to the control unit and linked with an external electronic device, a voice receiving unit connected to the control unit for receiving voice signals, a reminding unit connected to the control unit for receiving a warning signal from the external electronic device and giving a reminder to a user, and a power supply unit connected to the control unit. With the voice receiving unit and the bidirectional wireless communication unit, the multi-functional presenter performs not only the functions of paging up/down, pointing images, giving the user a reminder, etc., but also enables on-site voice recording and storing the recorded voice signals on the external electronic device. | 06-25-2009 |
20090160761 | METHOD AND HANDHELD ELECTRONIC DEVICE INCLUDING FIRST INPUT COMPONENT AND SECOND TOUCH SENSITIVE INPUT COMPONENT - A handheld electronic device includes a housing having a surface; a first input component having input members disposed external to the surface; a second touch sensitive input component disposed about the input members, the touch sensitive input component being separate and distinct from the input members and the first input component and being structured to provide one of: a contact point with respect to the surface responsive to actuation of a first number of the input members, and a number of responses responsive to actuation of a second number of the input members. A processor cooperates with the first input component and the touch sensitive input component to determine if a plurality of the input members are actuated contemporaneously and to output a representation of a single one of the input members based upon one of: the contact point, and the number of responses. | 06-25-2009 |
20090160762 | User input device with expanded functionality - The present invention can include systems and methods for expanding the functionality of user input devices. In particular, the present invention can expand the functionality of user input devices by changing the functions assigned to hardwired user input mechanisms responsive to user actuation of a function-change user input mechanism and/or responsive to automatic detection of an application change. Each hardwired user input mechanism can have an associated function indicator that visually indicates the function assigned to the hardwired user input mechanism. The present invention also can change the function indicated by the function indicators when there is a change in the functions assigned to the hardwired user input mechanisms. | 06-25-2009 |
20090160763 | Haptic Response Apparatus for an Electronic Device - A user input for an electronic device includes a haptic feedback layer ( | 06-25-2009 |
20090160764 | Remote Control System - A remote control system, comprising a hand-operated remote control device and a control menu present in a display unit. Picking up the remote control device by hand activates a motion or position or push-button controlled user interface in the display unit. The remote control device is provided with an identification feature for its pick-up by hand, which activates the control menu and/or the remote control device. The motional and/or positional handling of the remote control device enables controlling the display unit's menus included in the user interface. | 06-25-2009 |
20090160765 | INPUTTING UNIT, INPUTTING METHOD, AND INFORMATION PROCESSING EQUIPMENT - According to one embodiment, an inputting unit for inputting a first control instruction to change a display status of a screen, includes: an acceleration acquiring section configured to acquire an acceleration applied to the inputting unit; an acceleration determining section configured to output a first signal when the acceleration exceeds a certain numerical value; a user's instruction acquiring section configured to allow a user to input a second signal at the inputting unit at any point of time; and a focus change determining section configured to output a second control instruction to move a focus in the screen when the first signal and the second signal are input. | 06-25-2009 |
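The two-signal gating in 20090160765 (acceleration beyond a threshold plus an explicit user input, both required before the focus moves) can be sketched as follows; the names and threshold value are assumptions for illustration:

```python
def focus_should_move(acceleration, threshold, user_signal):
    """Move the on-screen focus only when both conditions hold:
    the sensed acceleration exceeds the threshold (first signal)
    and the user has given an explicit input (second signal)."""
    return abs(acceleration) > threshold and user_signal
```

Requiring both signals prevents incidental motion alone, or a button press alone, from changing the focus.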
20090167677 | Method and Apparatus for Providing Communications with Haptic Cues - A method and apparatus for generating haptic cues for pacing and monitoring are disclosed. After sensing an event via a component, a process for generating haptic cues generates an input in response to the event. The component, in one example, may be a sensor or a combination of a sensor and a haptic actuator. Upon receipt of the input, the process retrieves a haptic signal from a tactile library in response to the input. A haptic feedback in response to the haptic signal is subsequently generated. | 07-02-2009 |
20090167678 | SYSTEM AND METHOD FOR NAVIGATING A MOBILE DEVICE USER INTERFACE WITH A DIRECTIONAL SENSING DEVICE - An electronic mobile device includes a display, a tilt sensor and a processor. The display is for displaying a graphical element. The tilt sensor is configured to measure a tilt angle of the mobile device. The processor is configured to store the measured tilt angle as a reference tilt angle, subsequently determine a delta tilt angle as the difference between a currently measured tilt angle and the reference tilt angle, compare the delta tilt angle to different thresholds, and alter the position of the displayed element on the display at a rate that is based on the number of the thresholds the delta tilt angle has exceeded. | 07-02-2009 |
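The tilt-threshold scrolling in 20090167678 (delta tilt compared against multiple thresholds, with the scroll rate depending on how many were exceeded) can be sketched minimally; the particular thresholds and rate values are assumptions:

```python
def scroll_step(current_tilt, reference_tilt, thresholds, rates):
    """Return a scroll rate chosen by how many tilt thresholds the
    delta angle (current minus stored reference) has exceeded.

    thresholds: ascending angles in degrees.
    rates: len(thresholds) + 1 entries; rates[0] means no scrolling.
    """
    delta = abs(current_tilt - reference_tilt)
    exceeded = sum(1 for t in thresholds if delta > t)
    return rates[exceeded]
```

Storing the tilt at activation time as the reference lets the user hold the device at any starting angle and still get rate-controlled scrolling.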
20090174652 | Information processing system, entertainment system, and information processing system input accepting method - A technique related to an input interface is provided, in which entertainment is enhanced by an information processing system that includes means for producing a computer image that prompts a player to virtually touch a plurality of touch points; means for accepting input of a video image of the player captured by image pickup means; display control means for causing a display device to display and superimpose the video image and the computer image on each other; means for analyzing the video image during display of the computer image to detect virtual touches of any of the plurality of touch points; and means for executing predetermined processing when the detecting means detects virtual touches that are performed on a predetermined number of touch points in a predetermined order. | 07-09-2009 |
20090174653 | METHOD FOR PROVIDING AREA OF IMAGE DISPLAYED ON DISPLAY APPARATUS IN GUI FORM USING ELECTRONIC APPARATUS, AND ELECTRONIC APPARATUS APPLYING THE SAME - An electronic apparatus to provide an area of an image displayed on a display apparatus in a GUI form. The electronic apparatus transfers a user command related to an external apparatus to the external apparatus, and displays an area of an image displayed on the external apparatus on a display. Therefore, it is possible to display an area of an image displayed on a display apparatus in the GUI form using another display apparatus so that the user may select a desired GUI item more conveniently and more intuitively. | 07-09-2009 |
20090174654 | COMPUTERIZED INTERACTOR SYSTEMS AND METHODS FOR PROVIDING SAME - A computerized interactor system uses physical, three-dimensional objects as metaphors for input of user intent to a computer system. When one or more interactors are engaged with a detection field, the detection field reads an identifier associated with the object and communicates the identifier to a computer system. The computer system determines the meaning of the interactor based upon its identifier and upon a semantic context in which the computer system is operating. The interactors can be used to control other systems, such as audio systems, or they can be used as intuitive inputs into a computer system for such purposes as marking events in a temporal flow. The interactors, at a minimum, communicate their identity, but may also be more sophisticated in that they can communicate additional processed or unprocessed data, i.e. they can include their own data processors. The detection field can be one-dimensional or multi-dimensional, and typically has different semantic meanings associated with different parts of the detection field. | 07-09-2009 |
20090179853 | Method of employing a gaze direction tracking system for control of a computer - A method of employing a gaze direction tracking system for control of a computer comprises the steps of: providing a computer display incorporating a screen and at least one off-screen control target ( | 07-16-2009 |
20090179854 | DYNAMIC INPUT GRAPHIC DISPLAY - An input device for providing dynamic displays is disclosed. The input device can modify the appearance and/or location of graphics associated with an input area of a device. For example, the input device can have a button layout that shifts based on the orientation of the electronic device relative to the user, such that the button layout is consistently presented to the user in an upright orientation. The input device can rotate and/or rename a button input area region depending on the context of an application running on the electronic device. The input device can display dynamic graphic content in an input area which is distinct from a display screen of the electronic device. | 07-16-2009 |
20090179855 | Multifunctional Operating Device and Method - A multifunctional operating device is provided that includes a display unit, at least one operating element, to which a function can be assigned within a menu structure and can be selected when the respective operating element is actuated, and associated visual information on the function selected by the operating element. The visual information is displayed on the screen during at least one operating status of the operating element. The time during which the associated visual information is displayed depends on at least one display parameter which can be varied by way of an adjusting intervention. | 07-16-2009 |
20090184920 | Two element slider with guard sensor - A method for using a slider-based capacitive sensor to implement a user interface having discrete buttons. Button locations are designated on a slider-based capacitive sensor having at least two conductive traces such that a user input at any button location results in a capacitance change in the conductive traces. Locations of inputs are distinguishable by ratios between the capacitance changes of the conductive traces, which can be correlated to a particular button location. Ratio ranges corresponding to areas covered by each button are used to identify which button has received an input. | 07-23-2009 |
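The ratio-based button detection in 20090184920 (two conductive traces whose capacitance-change ratio identifies which designated button location was touched) can be sketched as follows; the ratio ranges and button names are illustrative assumptions:

```python
def detect_button(delta_a, delta_b, ratio_ranges):
    """Map the ratio of capacitance changes on two slider traces
    to a discrete button.

    ratio_ranges: list of (low, high, button_id); a touch whose
    delta_a / (delta_a + delta_b) ratio falls in [low, high)
    is attributed to that button.
    """
    total = delta_a + delta_b
    if total == 0:
        return None  # no capacitance change -> no touch
    ratio = delta_a / total
    for low, high, button in ratio_ranges:
        if low <= ratio < high:
            return button
    return None
```

Because the ratio, not the absolute change, selects the button, the scheme is tolerant of variation in touch strength.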
20090184921 | Input Through Sensing of User-Applied Forces - Methods and devices for providing a user input to a device through sensing of user-applied forces are described. A user applies forces to a rigid body as if to deform it and these applied forces are detected by force sensors in or on the rigid body. The resultant force on the rigid body is determined from the sensor data and this resultant force is used to identify a user input. In an embodiment, the user input may be a user input to a software program running on the device. In an embodiment the rigid body is the rigid case of a computing device which includes a display and which is running the software program. | 07-23-2009 |
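The force-sensing scheme in 20090184921 (sum the readings of the force sensors on the rigid body into a resultant, then identify a user input from that resultant) can be sketched minimally; the axis labels, input names, and deadband are assumptions, not from the patent:

```python
def resultant_force(readings):
    """Sum per-sensor force vectors (fx, fy, fz) into one resultant."""
    return tuple(sum(axis) for axis in zip(*readings))

def classify_input(resultant, deadband):
    """Map the dominant axis of the resultant force to a named input,
    ignoring magnitudes below a deadband (e.g. the device just being held)."""
    labels = ("bend_x", "bend_y", "press")
    idx = max(range(3), key=lambda i: abs(resultant[i]))
    if abs(resultant[idx]) < deadband:
        return "none"
    return labels[idx]
```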
20090184922 | Display indicator controlled by changing an angular orientation of a remote wireless-display controller - A remote controller for controlling a presentation image is disclosed. The remote controller includes a gyroscope to detect the movement and angular speed of the remote controller and generate corresponding signals for transmitting to a computer or projection system. The movement or angular speed signals are processed and applied to move a display cursor or highlight indicator in different areas of the display image according to the movements, angular speed, and positions of the remote controller, thus enhancing control of the presentation image without requiring the presenter to look away from the screen in search of many different push buttons to control the presentation images. | 07-23-2009 |
20090184923 | Haptic Stylus Utilizing An Electroactive Polymer - Haptic feedback interface devices using electroactive polymer (EAP) actuators to provide haptic sensations. A haptic feedback interface device is in communication with a host computer and includes a sensor device that detects the manipulation of the interface device by the user and an electroactive polymer actuator responsive to input signals and operative to output a force to the user caused by motion of the actuator. The output force provides a haptic sensation to the user. In an embodiment, a stylus including a body having a first end and a second end opposite from the first end, a moveable member coupled to the body and capable of being in contact with a user's hand; and an electro active polymer actuator coupled to the moveable member, wherein the electroactive polymer moves the moveable member from a first position to a second position with respect to the body upon being activated. | 07-23-2009 |
20090189852 | INDEX WHEEL HAVING NOTATIONS AND METHOD OF MANUFACTURE THEREOF - An index wheel having notations adopted on an electronic device to be freely rotated to generate signal command output includes at least an inner axis layer and an operation layer encasing the inner axis layer in a coaxial manner and movable together at the same time. The operation layer is made from a ceramic material and formed integrally by molding and sintering. The operation layer also has a graphic notation zone on the surface that contains desired graphics or texts. Thus the profile of the index wheel is more versatile and provides a greater appeal to users. | 07-30-2009 |
20090189853 | CHARACTER INPUT DEVICE - A character input device is disclosed. In one embodiment, the device includes i) a base including an input region, ii) two input units disposed in the input region, wherein each of the input units is disposed to perform direction inputs of more than two steps that selects any one of a plurality of direction instruction locations that are radially spaced apart from each other from each of reference locations within the input region in each of the direction instruction locations, iii) a direction input detecting unit detecting whether the direction inputs and a multiple input are performed and iv) a controller discriminating and inputting a first character that is redundantly allocated at the corresponding direction instruction location according to the direction instruction location in which the direction inputs are performed and whether the multiple input is performed. The character input device having the construction described above includes two sets of input units that can input more than one phoneme per operation, thereby doubling input capacity and allowing a character to be input quickly and accurately. | 07-30-2009 |
20090189854 | USER INTERFACE CONTROLLER FOR A COMPUTER - User interface controller for controlling a computer comprising: | 07-30-2009 |
20090195497 | GESTURE-BASED POWER MANAGEMENT OF A WEARABLE PORTABLE ELECTRONIC DEVICE WITH DISPLAY - Methods and systems for providing gesture-based power management for a wearable portable electronic device with display are described. An inertial sensor is calibrated to a reference orientation relative to gravity. Motion of the portable device is tracked with respect to the reference orientation, and the display is enabled when the device is within a viewable range, wherein the viewable range is a predefined rotational angle range in each of x, y, and z axis, to a user based upon a position of the device with respect to the reference orientation. Furthermore, the display is turned off if an object is detected within a predetermined distance of the display for a predetermined amount of time. | 08-06-2009 |
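The display gating in 20090195497 (enable the display only within a per-axis rotational viewable range relative to a calibrated reference, and turn it off when an object lingers near the display) can be sketched as follows; the names, ranges, and units are illustrative assumptions:

```python
def display_in_viewable_range(orientation, reference, viewable_range):
    """Enable the display only while the device's rotation about each
    of the x, y, and z axes stays within the viewable range of the
    calibrated reference orientation (angles in degrees)."""
    return all(
        abs(orientation[axis] - reference[axis]) <= viewable_range[axis]
        for axis in ("x", "y", "z")
    )

def proximity_timeout(distance, near_threshold, elapsed, time_limit):
    """Turn the display off once an object has stayed within the
    proximity threshold for at least the time limit."""
    return distance < near_threshold and elapsed >= time_limit
```

Both checks save power: the first keeps the display dark unless the wearer is plausibly looking at it, the second catches the case where the display is covered (e.g. under a sleeve).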
20090195498 | Signal Generator Providing ISI Scaling to Touchstone Files - A device and method for producing Inter Symbol Interference (ISI) scaling of S-Parameter Touchstone files for the generation of ISI scaling effects on serial data patterns by direct digital synthesis is described. The features of the present invention allow a user to set parameters such as data rate, voltage amplitude, and encoding scheme as required for the serial data patterns. An ISI scaling value is selected and applied to an S-Parameter Touchstone file representing transmission path effects. The serial data pattern parameters and the ISI scaling value used with the S-Parameter Touchstone file are compiled to generate a digital data waveform record file. The digital waveform record file is applied to a waveform generation circuit for converting the digital data into an analog serial data pattern with ISI scaling effects. | 08-06-2009 |
20090201246 | Motion Compensation for Screens - A method for compensating for motion on screens is provided. In one embodiment, the method includes varying the display of a screen on a device using motion data. In this embodiment, a display adjustment amount also may be determined using screen properties and motion limits. In another embodiment, the method includes varying the location of an input region on a touch screen using touch data. In yet another embodiment, the method includes scaling selectable images and apportioning the display using motion data. Various additional methods, machine-readable media, and systems for motion compensation of a screen are also provided. | 08-13-2009 |
20090201247 | Communications with a Haptic Interface Device from a Host Computer - The present invention comprises methods and apparatuses that can provide reliable communications between a computer and a haptic interface device. The methods and apparatuses can provide communication that is more secure against errors, failures, or tampering than previous approaches. Haptic devices allow a user to communicate with computer applications using the user's sense of touch, for example by applying and sensing forces with the haptic device. The host computer must be able to communicate with the haptic device in a robust and safe manner. The present invention includes a novel method of accomplishing such communication; a computer-readable medium that, when applied to a computer, causes the computer to communicate according to such a method; and a computer system having a host computer and a haptic device communicating according to such a method. | 08-13-2009 |
20090207129 | Providing Haptic Feedback To User-Operated Switch - Systems and methods are disclosed herein for generating haptic feedback, tactile feedback, or force feedback to an electromechanical switch that is toggled by a user. In one specific example among many possible embodiments, a switch feedback system is disclosed. The switch feedback system comprises a user-operated switch, which is operable to toggle between one of an open state and a closed state. The switch feedback system also includes electrical circuitry in electrical communication with the user-operated switch, wherein the electrical circuitry is configured to react to a change of state of the user-operated switch. The system also includes a haptic feedback device in electrical communication with the user-operated switch and in physical communication with the user-operated switch. The haptic feedback device is configured to detect the change of state of the user-operated switch and provide a haptic feedback to the user-operated switch in response to the detected change of state. | 08-20-2009 |
20090207130 | INPUT DEVICE AND INPUT METHOD - The present invention discloses an input device and an input method. The input device comprises: a device for receiving input signals; and a processor circuit for generating control information according to comparison between a first difference between two input signals in a first direction and a second difference between two input signals in a second direction. | 08-20-2009 |
20090207131 | ACOUSTIC POINTING DEVICE, POINTING METHOD OF SOUND SOURCE POSITION, AND COMPUTER SYSTEM - There is disclosed an acoustic pointing device that is capable of performing pointing manipulation without putting any auxiliary equipment on a desk. The acoustic pointing device includes a microphone array that retains plural microphone elements; an A/D converter that converts analog sound pressure data into digital sound pressure data; a buffer that stores the digital sound pressure data; a direction of arrival estimation unit that executes estimation of a sound source direction of a transient sound based on a correlation of the sound between the microphone elements obtained from the digital sound pressure data; a noise estimation unit that estimates a noise level in the digital sound pressure data; an SNR estimation unit that estimates a rate of a signal component based on the noise level and the digital sound pressure data; a power calculation unit that computes and outputs an output signal from the rate of a signal component; an integration unit that integrates the sound source direction and the output signal to specify a sound source position; and a control unit that converts, based on data in a DB of screen conversion, the specified sound source position into one point on a screen of a display device. | 08-20-2009 |
20090207132 | DISPLAY DEVICE, IMAGE FEEDING DEVICE, DISPLAY SYSTEM, PROGRAM, AND INFORMATION STORAGE MEDIUM - A display device includes: a mute deciding unit that decides whether mute designation for an image has been changed; a displaying side communication unit that receives image information from an image feeding device and that if the mute designation has been changed from invalidation to validation, transmits suspension-of-feed instructing information to the image feeding device; and a displaying side display unit that displays an image on the basis of the image information and that if the mute designation has been changed from invalidation to validation, ceases display of the image. | 08-20-2009 |
20090213066 | ONE BUTTON REMOTE CONTROL WITH HAPTIC FEEDBACK - An input system for a TV remote control or other system has a single touch surface with a deformable haptic assembly below the touch surface such that a user placing a finger on the touch surface can feel deformation of the haptic assembly. A pressure sensing assembly below the haptic assembly senses motion of a finger on the touch surface, with a processor receiving input from the pressure sensing assembly and providing output to the haptic assembly in response. Also, a display receives input sent by the processor in response to input from the pressure sensing assembly to cause the display to present a changing image of a keypad as a user moves a finger on the touch surface. | 08-27-2009 |
20090213067 | INTERACTING WITH A COMPUTER VIA INTERACTION WITH A PROJECTED IMAGE - Embodiments of the present invention address deficiencies of the art in respect to user interfaces and provide a novel and non-obvious system for interacting with a computer via a projected image. In one embodiment of the invention, the system includes a projector for generating a projected image onto a surface, wherein the projected image corresponds to a first image on a display of the computer. The system further includes a sensor for sensing a human interaction with the projected image and generating a first information representing the human interaction and a transmitter for transmitting the first information to the computer. The system further includes a program on the computer that receives the first information and translates it into a second information representing a human interaction with the first image. | 08-27-2009 |
20090213068 | Ergonomic Pointing Device - An input device for a computer is described that positions the user's hand in a more ergonomically desirable position, i.e., at an angle of about 45° to the work surface. In preferred embodiments, the input device accommodates either a user's left or right hand, and in either case, positions the hand in an ergonomically desirable position. In another embodiment, the length of the input device of the present invention is adjusted for the size of the user's hand. In further embodiments, the input device of the present invention provides a palm rest. Other desirable features included in preferred embodiments include lateral buttons that are positioned one above the other. | 08-27-2009 |
20090219246 | TERMINAL DEVICE, TERMINAL SYSTEM AND COMPUTER-READABLE RECORDING MEDIUM RECORDING PROGRAM - A terminal device displays an image such that an operation part is positioned on a right side of the terminal device when image information is first-kind information indicating that an operation part is on a right side or a left side and a right-hand operation mode in which the operation part is operated by a user's right hand is set. A terminal device displays an image such that an operation part is positioned on a left side of the terminal device when image information is the first-kind information and a left-hand operation mode in which the operation part is operated by a user's left hand is set. A terminal device displays an image such that an operation part is positioned on a user's side of the terminal device when image information is second-kind information indicating that the operation part is on a deep side of the terminal device. | 09-03-2009 |
20090231269 | INPUT DEVICE, SIMULATED EXPERIENCE METHOD AND ENTERTAINMENT SYSTEM - A retroreflective sheet | 09-17-2009 |
20090231270 | CONTROL DEVICE FOR INFORMATION DISPLAY, CORRESPONDING SYSTEM, METHOD AND PROGRAM PRODUCT - The invention concerns a control device for an information display including a video sensor for receiving image information and a processor for determining control information within said image information, wherein subsequent data transmission between said control device and said information display is dependent upon the origin of the received image information as determined by the control information. In addition, a system, a method and a program product are also targets of the present invention. | 09-17-2009 |
20090231271 | Haptically Enabled User Interface - A device has a user interface that generates a haptic effect in response to user inputs or gestures. In one embodiment, the device receives an indication that the user is scrolling through a list of elements and an indication that an element is selected. The device determines the scroll rate and generates a haptic effect that has a magnitude that is based on the scroll rate. | 09-17-2009 |
20090231272 | VIRTUAL HAND: A NEW 3-D HAPTIC INTERFACE AND SYSTEM FOR VIRTUAL ENVIRONMENTS - The present invention discloses a creation of a virtual hand, body part, or tool in a virtual environment, controlled by a new 3-D haptic interface for virtual environments. There is also the provision of a method and arrangement applicable to computer systems for creating a virtual environment, which facilitates a user to touch, feel, edit, and interact with data about the objects, surfaces, and textures in the environment. There is also a provision for multiple virtual hands operating in the same virtual world, so that users can work collaboratively in the virtual world; they can touch, feel and edit the data in the virtual world, including data about the other users' virtual hands. | 09-17-2009 |
20090231273 | MIRROR FEEDBACK UPON PHYSICAL OBJECT SELECTION - A highlighting method and an interaction system ( | 09-17-2009 |
20090231274 | Tilt Roller for Control Device - A control device includes a roller configured to rotate and tilt; a roller support coupled to the roller, wherein the roller is configured to rotate relative to the roller support; a first hinge disposed adjacent to a first end of the roller support; and a second hinge disposed adjacent to a second end of the roller support, wherein the first end and the second end are substantially opposite ends of the roller support, the second hinge is above the first hinge, and the first hinge and the second hinge are configured to provide tilting support for the roller and roller support. | 09-17-2009 |
20090237353 | Computer display capable of receiving wireless signals - A computer display capable of receiving wireless signals includes a computer display provided with an antenna assembly for receiving wireless digital and analog A/V signals; and a female connector provided in the display for connecting the signals received via the antenna assembly to an external demodulating device, which is connected at a male connector thereof to the female connector in the display. The signals received by the antenna assembly are demodulated and converted by the external demodulating device, and transmitted to a personal computer via a transmission device for operation. The external demodulating device and the antenna assembly have a uniform specification for use with a computer display having the same specification to demodulate wireless signals. | 09-24-2009 |
20090237354 | OPTICAL APPARATUS AND OPTICAL SYSTEM - An optical apparatus has a projection device that projects a projection image onto a projection area, and a capture device that is arranged so that the position of a principal point of the projection device optically corresponds with that of a principal point of the capture device, and captures the projection area. At least one of the projection device and the capture device has an optical system with a tilt angle, and the projection device and the capture device are arranged so that a line passing the center of an angle of field of the projection device and a line passing the center of an angle of field of the capture device correspond with each other. | 09-24-2009 |
20090237355 | HEAD TRACKING FOR VIRTUAL REALITY DISPLAYS - A tracking device for determining position of at least one user relative to a video display has a wearable structure configured to be mounted on a human, such as a headset, eyeglasses or arm bands. The structure has two clusters of light emitting components which are spaced apart from one another. The LEDs in each cluster can emit different wavelengths of light and be activated in sequences to identify not only the position of the user but also to distinguish one user from another user. | 09-24-2009 |
20090243997 | Systems and Methods For Resonance Detection - Systems and methods for resonance detection are disclosed. For example, one method for resonance detection includes the step of transmitting an actuator signal to an actuator coupled to a surface of a device. The actuator signal is configured to cause the actuator to output a force to the surface. The method further includes the steps of receiving a response of the surface to the force; determining a resonant frequency of the surface based at least in part on the response; and outputting a signal indicative of the resonant frequency. | 10-01-2009 |
20090243998 | APPARATUS, METHOD AND COMPUTER PROGRAM PRODUCT FOR PROVIDING AN INPUT GESTURE INDICATOR - An apparatus, method and computer program product are provided for providing an input gesture indicator. Upon detecting one or more tactile inputs, an electronic device may determine one or more characteristics associated with the tactile input(s) (e.g., number, force, hand pose, finger identity). In addition, the electronic device may receive contextual information associated with the current state of the electronic device (e.g., current application operating on the device). Using the characteristic(s) determined and the contextual information received, the electronic device may predict which operations the user is likely to request, or commands the user is likely to perform, by way of a finger gesture. Once a prediction has been made, the electronic device may display an indicator that illustrates the gesture associated with the predicted operation(s). The user may use the indicator as a reference to perform the finger gesture necessary to perform the corresponding command. | 10-01-2009 |
20090243999 | DATA PROCESSING DEVICE - A hard disk drive (HDD) stores a program for causing a CPU to execute driver seat processing as processing corresponding to presence coordinates (x(k), y(k)) of an object when the presence direction of the object determined on the basis of detection values of the proximity sensors is the direction to the driver seat and passenger seat processing as processing corresponding to the presence coordinates (x(k), y(k)) of the object when the presence direction of the object is the direction to the passenger seat. The HDD also stores driver seat operation item data, passenger seat operation item data, a driver seat operation table, a passenger seat operation table, and the like. | 10-01-2009 |
20090244000 | USER INTERFACE FOR INTEGRATING DIVERSE METHODS OF COMMUNICATION - An integrated communication interface is provided for composing and sending messages. The interface is multi-configurable to seamlessly switch between different communication methods, e.g., electronic mail, instant messaging, SMS, chat, voice, and the like, without loss of message content. The interface allows a user to begin composing a message to be sent using one communication method, such as electronic mail, and subsequently change the communication method and send the message via a second communication method, such as instant messaging. When the communication method is changed, the user interface may also change to include elements specific to a particular communication method. The integrated communication interface may display information about participants in the communication, such as the participants' presence, i.e., whether they are online and available for communication, and may automatically choose the best method of communication based on the preferences and online presence of the participants. | 10-01-2009 |
20090244001 | BACK PLATE, DISPLAY, DISPLAY SYSTEM, METHOD OF SUPPLYING AN ELECTRIC POWER, AND DISPLAY METHOD - According to one embodiment, a back plate includes: a base plate including a principal surface on which a plurality of power supplying portions, a plurality of image signal transmitting portions, and a position detecting portion are disposed, each of the position detecting portions being configured to detect a position of a display device having a position marker; and a controller including: a detector configured to detect position information and attitude information of the display device, a selector configured to select at least one of power supplying portions and at least one of image signal transmitting portions, a power supply controller configured to supply an electric power to the selected power supplying portion, an image signal generator configured to produce image signal, and an image signal supply controller configured to supply the produced image signal to the selected image signal transmitting portion. | 10-01-2009 |
20090244002 | Method, Device and Program for Controlling Display, and Printing Device - A display control method for displaying a screen which receives a user-selected option from a predetermined option group through predetermined option presentation, the method includes: detecting a face image area which at least includes a user face in an image area; and determining an initial position of the predetermined option presentation in accordance with the face image area. | 10-01-2009 |
20090251406 | System and Method for Selective Activation and Deactivation of an Information Handling System Input Output Device - An information handling system has a chassis and a lid, the lid rotating between a closed position and a tablet position. An indicator coupled to the lid, such as a magnet, aligns with a first detector coupled to a chassis of the information handling system, such as a Hall effect detector, in the tablet position so that the detector signals to a position detector module to disable a keyboard of the information handling system. The indicator aligns with a second detector coupled to the chassis in a closed configuration so that the second detector signals to the position detector module to enter a power down state. | 10-08-2009 |
20090251407 | DEVICE INTERACTION WITH COMBINATION OF RINGS - The claimed subject matter provides a system and/or a method that facilitates interacting with a device and/or data associated with the device. A computing device can display a portion of data. A ring component can interact with the portion of data to control the device by detecting at least one of a movement, a gesture, an inductance, or a resistance related to a user wearing the ring component on at least one digit on at least one hand. | 10-08-2009 |
20090251408 | IMAGE DISPLAY DEVICE AND METHOD OF DISPLAYING IMAGE - Under an LCD panel ( | 10-08-2009 |
20090251409 | MOBILE WIRELESS DISPLAY SOFTWARE PLATFORM FOR CONTROLLING OTHER SYSTEMS AND DEVICES - A wireless headset can incorporate a wireless communication controller that not only provides a video link to a host device, but also provides for control and management of a host device and other devices. In this context, a host device may be any appropriate device that sources audio, video, text, and other information, such as a cell phone, personal computer, laptop, media player, and/or the like. | 10-08-2009 |
20090256800 | VIRTUAL REALITY SIMULATOR HARNESS SYSTEMS - The inventions are directed to assemblies for interfacing three-dimensional movements of a person to a virtual environment or to a remote environment. The harness assemblies maintain the user in a desired location with respect to the virtual reality system, thereby allowing the virtual reality system to capture the movements of the user. The assemblies include a frame subsystem, a pivot subsystem, a cable management subsystem, a compliance subsystem, a vertical motion subsystem, a centering adjustment subsystem, a support arm subsystem, and a human restraint subsystem. | 10-15-2009 |
20090256801 | SYSTEM AND METHOD THAT GENERATES OUTPUTS - The present invention relates to a system and method that generates outputs based on the operating position of a sensor which is determined by the biomechanical positions or gestures of individual operators. The system includes a garment to which one or more sensors are removably attached, the sensors providing a signal based on the biomechanical position, movement, action or gestures of the person wearing the garment, and a transmitter that receives signals from the sensors and sends signals to a computer calibrated to recognise the signals as representing particular positions that are assigned selected outputs. Suitably the outputs are audio outputs of an instrument, such as a guitar, and the outputs simulate the sound of a guitar that would be played when the biomechanical motion, action, gesture or position of the operator resembles those that would occur when an actual instrument is played. | 10-15-2009 |
20090262068 | SPACE EFFICIENT SORTABLE TABLE - A sortable and space efficient graphical user interface and a system for the efficient display of sortable data are disclosed herein. The graphical user interface may include at least one column, at least one row and a data cell defined by the intersection of at least one column and at least one row. First and second data may be displayed in the data cell. A first header is associated with the first column and identifies the first data. A second header is associated with the first column and identifies the second data. In the system for displaying sortable data, a graphical user interface is displayed upon a graphical display. A table is displayed as at least a portion of the graphical user interface, the table having a column with a plurality of rows, each row displaying first and second data, and a first header associated with the first data and a second header associated with the second data. | 10-22-2009 |
20090262069 | GESTURE SIGNATURES - Apparatus, systems, and methods may operate to present viewable content to a viewer on a display screen, receive a transmitted signature from a user interface device (UID) associated with the display screen (wherein the signature results from at least one gesture initiated by the viewer and detected by the UID), and compare the transmitted signature to a stored signature associated with a known individual to determine whether an identity associated with the viewer matches an identity associated with the known individual. Additional apparatus, systems, and methods are disclosed. | 10-22-2009 |
20090262070 | Method and System for Reducing Effects of Undesired Signals in an Infrared Imaging System - Effects of undesired infrared light are reduced in an imaging system using an infrared light source. The desired infrared light source is activated and a first set of imaging data is captured during a first image capture interval. The desired infrared light source is then deactivated, and a second set of image data is captured during a second image capture interval. A composite set of image data is then generated by subtracting from first values in the first set of image data corresponding second values in the second set of image data. The composite set of image data thus includes a set of imaging data in which all infrared signals are collected, including both signals resulting from the IR source and other IR signals, from which is subtracted imaging data in which no signals result from the IR source, leaving image data including signals resulting only from the IR source. | 10-22-2009 |
20090267892 | SYSTEM AND METHOD FOR GENERATING ENERGY FROM ACTIVATION OF AN INPUT DEVICE IN AN ELECTRONIC DEVICE - In this disclosure, a description of a system for providing feedback signals relating to input signals provided to an electronic device is provided. The system comprises: an input device; a transducer associated with the input device; and a feedback module to generate a feedback signal indicating activation of the input device on the electronic device based on signals from the input device. The input device may be a touchpad; the transducer may be a piezoelectric element; and the feedback module may cause the transducer to vibrate upon receiving an activation signal relating to activation of the input device. The feedback module may provide a voltage generated by the transducer during the activation of the input device to an energy storage circuit. | 10-29-2009 |
20090267893 | TERMINAL DEVICE - An operation estimating portion | 10-29-2009 |
20090267894 | OPERATIONAL OBJECT CONTROLLING DEVICE, SYSTEM, METHOD AND PROGRAM - An operational object controlling device including a motion detection unit, a motion obtaining unit, a motion feature quantities extraction unit, a template storage unit, an operational object motion storage unit, a motion feature quantities transform unit and an operational object motion obtaining unit. The motion obtaining unit obtains the user's motion detected by the motion detection unit. The motion feature quantities extraction unit extracts the user's motion feature quantities from the obtained motion. The transform unit transforms the motion feature quantities by using a template obtained from the template storage unit. The motion feature quantities of the operational object are obtained from each of the temporal motion sequences of the operational object in the operational object motion storage unit. The operational object motion obtaining unit obtains one of the temporal motion sequences from the storage unit having the feature quantities close to the user's motion feature quantities. | 10-29-2009 |
20090273559 | GAME DEVICE THAT GENERATES A DISPLAY WITH A SIMULATED BODY IMAGE AND METHODS FOR USE THEREWITH - A game device includes a first receiver that receives body motion signals from a plurality of remote motion sensing devices coupled to a user's body. A user data generation module generates simulated body image data. A processor executes a game application that generates display signals for display on a display device, wherein the display signals are generated based on the simulated body image data. | 11-05-2009 |
20090273560 | Sensor-based distributed tangible user interface - A distributed tangible user interface comprises compact, self-powered, tangible user interface manipulative devices having sensing, display, and wireless communication capabilities, along with one or more associated digital content or other interactive software management applications. The manipulative devices display visual representations of digital content or program controls and can be physically manipulated as a group by a user for interaction with the digital information or software application. A controller on each manipulative device receives and processes data from a movement sensor, initiating behavior on the manipulative device and/or forwarding the results to a management application that uses the information to manage the digital content, software application, and/or the manipulative devices. The manipulative devices may also detect the proximity and identity of other manipulative devices, responding to and/or forwarding that information to the management application, and may have feedback devices for presenting responsive information to the user. | 11-05-2009 |
20090273561 | DEVICE CONTROL SYSTEM - A device control system allows a user-owned terminal to control a control providing device. A control requester in the terminal sends a terminal message including coordinate information of a display screen and an identifier of the terminal. In response, a control request response processor in the control providing device assigns user-controllable events to relative positions of coordinates recognized from the coordinate information, thus sending event detail information to the terminal. A display controller in the terminal displays on the display screen events in a display mode based on the event detail information. A control request processor in the terminal sends a control request message to the control providing device in response to a user operation of the events displayed on the display screen. A function executing unit in the control providing device receives the control request message and executes a function corresponding thereto. | 11-05-2009 |
20090278791 | MOTION TRACKING SYSTEM - A motion tracking system for tracking an object composed of object parts in a three-dimensional space. The system comprises a number of magnetic field transmitters; a number of field receivers for receiving the magnetic fields of the field transmitters; a number of inertial measurement units for recording a linear acceleration; a number of angular velocity transducers for recording angular velocities. The system further comprises a processor for controlling the transmitters and receiving signals coming from the field receivers and the inertial measurement unit; which processor contains a module for deriving orientation and/or position information of the constituent object parts of the object on the basis of the received signals. The processor is configured for intermittently controlling the transmitters to transmit at a predetermined frequency, wherein the position and/or orientation information is derived by periodically calibrating the motion information coming from the inertial measurement unit with the motion information coming from the magnetic field receivers. | 11-12-2009 |
20090278792 | Identifying User by Measuring Pressure of Button Presses on User Input Device - In one embodiment, a method comprises receiving, by a user identifier circuit, a button pressure signature specifying a sequence of button pressure values sampled while a corresponding identified button of a user input device is pressed by a user; the user identifier circuit identifying the user of the user input device based on the button pressure signature; and the user identifier circuit outputting a message identifying the identified button and the identified user. | 11-12-2009 |
20090278793 | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND MEDIUM RECORDING INFORMATION PROCESSING PROGRAM - To provide an information processing device, an information processing method, and a medium recording an information processing program that can be operated easily. | 11-12-2009 |
20090278794 | Interactive Input System With Controlled Lighting - An interactive input system comprises at least one imaging device capturing images of a region of interest, a plurality of radiation sources, each providing illumination to the region of interest and a controller coordinating the operation of the radiation sources and the at least one imaging device to allow separate image frames based on contributions from different radiation sources to be generated. | 11-12-2009 |
20090278795 | Interactive Input System And Illumination Assembly Therefor - An illumination assembly for an interactive input system comprises at least two proximate radiation sources directing radiation into a region of interest, each of the radiation sources having a different emission angle. | 11-12-2009 |
20090278796 | PROGRAM, INFORMATION STORAGE MEDIUM, DETERMINATION DEVICE, AND DETERMINATION METHOD - A determination device stores a plurality of pieces of reference data associated with a predetermined movement pattern of a controller, and determines whether or not output values that respectively have a given relationship with reference data have been output from an acceleration sensor in a predetermined order within an input reception period in which an input that moves the controller in the predetermined movement pattern is received. The determination device receives an input that moves the controller in the predetermined movement pattern and performs a game process when the determination device has determined that the output values that respectively have the given relationship with the reference data have been output from the acceleration sensor in the predetermined order within the input reception period. | 11-12-2009 |
20090284462 | COMPUTER WITH VIRTUAL KEYBOARD AND DISPLAY - A computer includes a host ( | 11-19-2009 |
20090284463 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, INFORMATION PROCESSING PROGRAM, AND MOBILE TERMINAL - An information processing apparatus includes a tap operation detecting unit configured to detect the number of tap operations for tapping a housing and a tapped position of the housing, a storage unit storing a plurality of application programs, an activated application table storing an application program to be activated in association with the tapped position of the housing and the number of tap operations, and a control unit configured to detect an application program corresponding to the tapped position and the number of tap operations with reference to the activated application table on the basis of the tapped position and the number of tap operations detected by the tap operation detecting unit, to read out the detected application program from the storage unit, and to activate the detected application program. | 11-19-2009 |
20090284464 | PROJECTION IMAGE DISPLAY APPARATUS - The projection image display apparatus includes an image light generator and a projection optics. The projection optics includes a reflection mirror. The projection image display apparatus includes an image capture device configured to capture an image of a user facing the projection surface, a first acquisition unit configured to acquire captured image data from the image capture device, a second acquisition unit configured to acquire sample data independently of the captured image data, and an image controller configured to control an image to be displayed on the projection surface on the basis of the captured image data and the sample data. | 11-19-2009 |
20090284465 | CAPACITIVE MOTION DETECTION DEVICE AND INPUT DEVICE USING THE SAME - The mode is switched to a motion detection mode by pressing a changeover switch. In this mode, motion detection is performed by moving a hand in an area to be operated. When the mode is switched from the motion detection mode to a normal mode, the hand is moved away from the area to be operated, or the changeover switch is again pressed. Moreover, when the hand is distant from a capacitive sensor, it is determined that a motion input operation is being performed. When the hand is close to the capacitive sensor, it is determined that no motion input operation is being performed, and thus the motion detection mode is changed. | 11-19-2009 |
20090289892 | SIMULATION OF WRITING ON GAME CONSOLES THROUGH THE USE OF MOTION-SENSING TECHNOLOGY - A method and system of utilizing a game console with motion sensing technology is provided. The present invention, in various implementations, provides for a method for generating one or more symbols in response to one or more gestures using an input device of a gaming system. The method comprises providing the input device, which is capable of generating one or more gesture signals in response to one or more gestures and is operable to select a mode of one or more operational states. The method also provides for generating one or more gesture signals corresponding to the one or more gestures, respectively; mapping the one or more generated gesture signals in relation to one or more symbols, respectively; and transmitting the one or more symbols corresponding to the respective one or more gesture signals to an output. | 11-26-2009 |
20090289893 | Finger appliance for data entry in electronic devices - The appliance is adapted to be mounted on a finger having a finger tip and a finger pad. It includes a finger engaging portion having first and second side members including oppositely oriented, spaced arcuate members and a force application member extending between the first and second side members of the finger engaging portion and abutting the finger tip. The force application member has a blunt surface adapted to contact a key of an electronic device. The blunt surface may have one or more ribs to provide a high friction surface. The appliance defines an opening aligned with the finger pad, so as not to obstruct the finger pad when mounted on the finger. | 11-26-2009 |
20090289894 | ELECTRONIC APPARATUS AND THREE-DIMENSIONAL INPUT DEVICE THEREOF - An electronic apparatus and a three-dimensional input device thereof are disclosed. The electronic apparatus includes a display and the three-dimensional input device. The three-dimensional input device includes an infrared ray emitting/receiving unit, a cursor control unit and a radio frequency receiving unit. The infrared ray emitting/receiving unit is electrically connected to the display and includes an infrared ray emitter and an infrared ray receiver. The cursor control unit includes a reflective material and a radio frequency emitter. The radio frequency emitter is electrically connected to the display. | 11-26-2009 |
20090295711 | MOTION CAPTURE SYSTEM AND METHOD FOR THREE-DIMENSIONAL RECONFIGURING OF CHARACTERISTIC POINT IN MOTION CAPTURE SYSTEM - In an optical motion capture system, it is possible to measure spatially dense data by increasing the number of measuring points. In the motion capture system using a mesh marker, intersections of lines for the mesh marker are called nodes and the lines connecting the nodes are called edges. The system includes a plurality of cameras for capturing a two-dimensional image of the mesh marker by imaging a subject having the mesh marker, a node/edge detecting section for detecting node/edge information on the mesh marker from the two-dimensional image captured by the respective cameras, and a three-dimensional reconstructing section for acquiring three-dimensional position information of the nodes by using the node/edge information detected from the plurality of two-dimensional images captured by different cameras. | 12-03-2009 |
20090295712 | PORTABLE PROJECTOR AND METHOD OF OPERATING A PORTABLE PROJECTOR - A portable projector comprising a projection unit for projecting an image onto a projection surface. The portable projector comprises a stabilization unit for stabilizing the projecting of the image. The portable projector further comprises a light sensing unit for scanning a region surrounding the portable projector. | 12-03-2009 |
20090295713 | POINTING DEVICE WITH IMPROVED CURSOR CONTROL IN-AIR AND ALLOWING MULTIPLE MODES OF OPERATIONS - Cursor resolution of a device is based upon a user's gripping (or squeezing) of the device in one embodiment, in accordance with a user's natural usage patterns. In one aspect, a device in accordance with an embodiment of the present invention offers multiple modes of operation depending on its orientation (e.g., which side of the device is facing upward). A device in accordance with an embodiment of the present invention can be used as a mouse, a presentation device, a keyboard for text entry, and so on. In one aspect of the present invention, circular gesture based controls are implemented, specifically for repetitive type functions. | 12-03-2009 |
20090295714 | POWER CONSERVING SYSTEM FOR HAND-HELD CONTROLLERS - A manual controller operates through a wireless communication link with a computing device to manipulate images or symbols on a display associated with the computing device. An electrical power conserving system allows such a wireless controller to conserve electrical power as the controller operates with electrical power supplied by replaceable batteries or rechargeable battery packs. In preferred embodiments, electronic manual-contact sensing circuitry enables more rapid turnoff of the controller during periods of game play inactivity. This eliminates a long timeout period and allows electrical current drain only when the controller is actually being held by a user. Preferred embodiments of the electronic manual-contact sensing circuitry detect electrical resistance of a user's hands and thereby enables delivery of different amounts of electrical power as required. | 12-03-2009 |
20090295715 | MOBILE COMMUNICATION TERMINAL HAVING PROXIMITY SENSOR AND DISPLAY CONTROLLING METHOD THEREIN - A mobile communication terminal having a function of detecting a proximity touch and a display controlling method therein are disclosed. The present invention includes a touchscreen configured to display prescribed data, the touchscreen detecting a real touch or a proximity touch to a surface contact point, a proximity sensor outputting a proximity signal corresponding to a proximity position of a proximate object, and a controller controlling an implementation of an operation associated with the prescribed data displayed on the touchscreen according to the proximity signal detected by the proximity sensor. | 12-03-2009 |
20090303174 | CONTROL OF DUAL FUNCTION INPUT AREA - An apparatus is provided that may be a portable device, such as a portable personal computer, cellular phone, or other computing device. The apparatus includes an input device. The input device may include a first input area and a second input area. The first and second input areas may occupy the same area of the computing device, such that the second input area may be underneath the first input area. A controller is provided that includes a first control area and a second control area. The first and second control areas may also occupy the same area, where the second control area may be underneath the first control area. In one embodiment, the first control area may be responsive to a mechanical input and the second control area may be responsive to a non-mechanical input. The second control area is configured to enable and disable the second input area of the input device. | 12-10-2009 |
20090303175 | Haptic user interface - This invention relates to a method, apparatuses and a computer-readable medium having a computer program stored thereon, the method, apparatuses and computer program using a haptic signal perceptible by a user contacting a user interface surface with an input means (device) to indicate a predetermined direction on the user interface surface. | 12-10-2009 |
20090303176 | METHODS AND SYSTEMS FOR CONTROLLING ELECTRONIC DEVICES ACCORDING TO SIGNALS FROM DIGITAL CAMERA AND SENSOR MODULES - An embodiment of a method for remotely controlling an electronic apparatus, performed by a processor of the electronic apparatus, comprises the following steps. Existence of an object in close proximity to the electronic apparatus is detected. A camera module of the electronic apparatus is turned on to capture a series of images. A control operation in response to the captured images is determined. The control operation is performed on an electronic device of the electronic apparatus. | 12-10-2009 |
20090303177 | ELECTRONIC DEVICE AND METHOD FOR SELECTING FUNCTIONS BASED ON ORIENTATIONS - The present invention provides an electronic device and a method for selecting functions based on orientations of the electronic device. The method includes: a) storing relationships between orientations and functions; b) fetching inductive signals based on the orientation of the electronic device; c) recognizing the current orientation according to the fetched signals; and d) if the orientation is altered, selecting a function corresponding to the altered orientation and displaying a corresponding interface. | 12-10-2009 |
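The orientation-to-function scheme of steps a)-d) above can be sketched as a small state holder. The orientation names and the mapping table below are illustrative assumptions, not taken from the patent:

```python
# Step a): stored relationships between orientations and functions
# (hypothetical names for illustration).
ORIENTATION_FUNCTIONS = {
    "portrait": "phone_ui",
    "landscape": "photo_viewer",
    "face_down": "mute",
}

class OrientationSelector:
    def __init__(self, functions):
        self.functions = functions
        self.current = None  # last recognized orientation

    def update(self, orientation):
        """Steps b)-d): recognize the current orientation from the fetched
        signal and, only if it changed, return the function whose interface
        should now be displayed; otherwise return None."""
        if orientation != self.current:  # orientation was altered
            self.current = orientation
            return self.functions.get(orientation)
        return None  # unchanged: keep the current interface
```

A driver would feed `update()` with the orientation recognized from the inductive signals and switch interfaces only on a non-`None` result.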
20090303178 | DISPLAY CONTROL APPARATUS, DISPLAY CONTROL METHOD, AND PROGRAM - There is provided a display control apparatus including an external memory accommodating unit for accommodating a removable external memory; a database recognizing unit for recognizing a database stored in the external memory, the database recording an image stored in the external memory and information related to the image in correspondence with each other; a display method setting unit for setting either a stored first display method or a second display method of displaying the image stored in the external memory without using the database, based on a recognition result of the database recognizing unit; and a display controlling unit for displaying the image stored in the external memory according to the set display method. | 12-10-2009 |
20090303179 | Kinetic Interface - A kinetic interface for orientation detection in a video training system is disclosed. The interface includes a balance platform instrumented with inertial motion sensors. The interface engages a participant's sense of balance in training exercises. | 12-10-2009 |
20090309825 | USER INTERFACE, METHOD, AND COMPUTER PROGRAM FOR CONTROLLING APPARATUS, AND APPARATUS - A user interface for a portable apparatus is disclosed. The user interface comprises a sensor arranged to determine a spatial change, the user interface being arranged to control at least one function, wherein the function is controlled by said determined spatial change; an actuator arrangement; and at least one mass, wherein the actuator arrangement is arranged to controllably accelerate at least one of the at least one mass so that the inertia of the actuated mass provides a force on the portable apparatus. Further, an apparatus, a method, and a computer program for controlling a function are disclosed. | 12-17-2009 |
20090309826 | Systems and devices - The present disclosure relates to systems and devices that may be configured to facilitate content projection. | 12-17-2009 |
20090309827 | METHOD AND APPARATUS FOR AUTHORING TACTILE INFORMATION, AND COMPUTER READABLE MEDIUM INCLUDING THE METHOD - The present invention relates to a method and apparatus for authoring tactile information that generates a tactile video representing tactile information in the form of an intensity value of a pixel. More particularly, the present invention relates to a method and apparatus for authoring tactile information that represents an intensity value of a pixel of a tactile video in a drawing manner on a tactile video input window while outputting and referring to audiovisual media. The present invention provides an apparatus for authoring a tactile video that represents information about the driving strength of an actuator array of a tactile display apparatus in the form of an intensity value of a pixel. The apparatus includes a tactile video generating unit that includes a configuration module and a tactile video authoring module. The configuration module performs configuration to author a tactile video. The tactile video authoring module includes a video clip playback window that outputs information about audiovisual media, such as a video clip or text, which serves as a base for authoring the tactile video frame by frame, and a tactile video input window to which an intensity value of each of the pixels of the tactile video is input in a drawing manner. The tactile video is generated frame by frame. | 12-17-2009 |
20090309828 | Methods and systems for transmitting instructions associated with user parameter responsive projection - The present disclosure relates to systems and methods that are related to transmitting and receiving instructions associated with user parameter responsive projection. | 12-17-2009 |
20090309829 | SIMPLE MULTIDIRECTIONAL KEY FOR CURSOR CONTROL - A multidirectional key is operative to press a specific one of multiple contacts, depending on the direction in which the key's button disc is being moved. The fingers are projections extending from a ring, the combination being made out of sheet metal and bent out of the plane of the disc to engage with the button when the latter is moved. As a result, the multidirectional key has a very simple configuration and is inexpensive to manufacture. | 12-17-2009 |
20090309830 | CONTROL APPARATUS, INPUT APPARATUS, CONTROL SYSTEM, HANDHELD INFORMATION PROCESSING APPARATUS, CONTROL METHOD, AND PROGRAM THEREFOR - A control apparatus, an input apparatus, a control system, a control method, and a program therefor that are capable of improving operability when a user operates a GUI displayed on a screen by a pointer using the input apparatus are provided. An MPU of a control apparatus sets weighting factors for each region sectioning a screen. The MPU multiplies the weighting factors by the corresponding displacement amounts to independently calculate displacement amounts of a pointer on the screen. Accordingly, the movement direction of the pointer can be biased in a predetermined direction. Thus, when a user operates an input apparatus to select an icon aligned in a 1-dimensional direction on the screen, for example, an operation of the pointer can be restricted to that 1-dimensional direction. Therefore, the user can easily select the icon, thus improving operability of the pointer. | 12-17-2009 |
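The region-weighted displacement described above (per-region weighting factors multiplied by the raw displacement amounts) can be sketched as follows; the region names and weight values are hypothetical:

```python
def weighted_displacement(dx, dy, region_weights, region):
    """Scale raw pointer displacement (dx, dy) by the per-region
    weighting factors, so each axis is computed independently."""
    wx, wy = region_weights[region]
    return dx * wx, dy * wy

# Example weights: inside a 1-D icon strip, suppress vertical motion
# entirely; elsewhere pass displacement through unchanged.
weights = {"icon_strip": (1.0, 0.0), "free": (1.0, 1.0)}
```

With these weights, a pointer inside the icon strip can only move horizontally, matching the abstract's example of restricting operation to a 1-dimensional direction.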
20090315825 | INFRARED VIRTUAL, INVISIBLE COMPUTER KEYBOARD AND MOUSE - A new design for an infrared virtual, invisible computer keyboard and mouse for a mouse- and keyboard-less computer is presented. The current invention uses the fact that the infrared spectrum of human fingers can be changed when irradiated with low-power diodes. Using this fact, the present invention presents a method in which a human finger's infrared spectrum, irradiated by an array of infrared diodes, is picked up by infrared sensors installed either in stand-alone mode, on top of the computer display, or directly in the computer display. The computer then uses the finger infrared spectra picked up by the infrared sensors to create a virtual, invisible mouse and keyboard, with which words can be drawn in the air and mouse commands are given by moving a finger in different directions in the air. | 12-24-2009 |
20090322671 | TOUCH SCREEN AUGMENTED REALITY SYSTEM AND METHOD - An improved augmented reality (AR) system integrates a human interface and computing system into a single, hand-held device. A touch-screen display and a rear-mounted camera allow a user to interact with the AR content in a more intuitive way. A database stores graphical images or textual information about objects to be augmented. A processor is operative to analyze the imagery from the camera to locate one or more fiducials associated with a real object, determine the pose of the camera based upon the position or orientation of the fiducials, search the database to find graphical images or textual information associated with the real object, and display the graphical images or textual information in overlying registration with the imagery from the camera. | 12-31-2009 |
20090322672 | INTERACTIVE DISPLAY - An interactive display ( | 12-31-2009 |
20100001947 | Images Display Method and Apparatus of Digital Photo Frame - The invention describes a digital photo frame which has a motion sensor module. The digital photo frame further comprises a microprocessor, a memory unit, and a display module. A user can change the displayed photo image by simply shaking the digital photo frame. Any acceleration movements, acceleration changes, gravity changes, tilt, shake, and position changes of the digital photo frame are detected by the motion sensor module, and the motion sensor module generates signals to the microprocessor for further calculation and interpretation. The microprocessor will then change the displayed photo image to the next or the previous image according to the calculation result of the signals. | 01-07-2010 |
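The shake-to-navigate behavior above can be sketched as a mapping from an accelerometer sample to a next/previous step. The threshold value and the one-axis simplification are assumptions for illustration:

```python
def interpret_shake(ax, threshold=1.5):
    """Map a horizontal acceleration sample (in g, hypothetical scale)
    to a navigation step: +1 = next image, -1 = previous, 0 = none."""
    if ax > threshold:
        return 1
    if ax < -threshold:
        return -1
    return 0

class PhotoFrame:
    def __init__(self, images):
        self.images = images
        self.index = 0

    def on_motion(self, ax):
        # Advance or rewind the displayed image based on the shake direction.
        step = interpret_shake(ax)
        self.index = (self.index + step) % len(self.images)
        return self.images[self.index]
```

A real device would first low-pass filter the sensor signal and debounce repeated samples from a single shake; this sketch omits that.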
20100001948 | FOOT-OPERATED COMPUTER INPUT DEVICE - A foot operated data entry/input pad has a plurality of foot-operated buttons. The foot buttons may be used to enter data values, such as numbers or symbols separately or in combination. Each button is preferably capable of entering different data values, preferably depending on the length of time that it is pressed or on the number of times that it is pressed in succession. A small controller may be included to allow the user to control the computer's pointer, allowing the user to switch between data entry fields. A heel rest may serve as both a heel rest and a button/switch for sending an electric/electronic signal. An automated voice system, or other audible and/or visual indicator system, may help the user keep track of the data value as it changes and is entered. In alternative versions for input of instructions, single values or binary information, or for selection of items in a pull-down screen window, a pad may have two buttons provided adjacent a cursor controller, wherein the cursor controller and right and left click buttons are on an arc or on an angle. | 01-07-2010 |
20100001949 | Spatially Aware Inference Logic - A method, system, and article to support a motion based input system. Movement data is acquired from a motion sensor. An orientation detector detects orientation towards gravity from a rest position, and a motion detector detects motion, including movement and rest. In addition, an inference state machine in communication with the orientation and motion detectors maintains a sequence of the detected motion conditions, and produces a profile description for the sequence of the detected motion conditions. An output event corresponding to the profile description is generated based upon the profile. | 01-07-2010 |
20100001950 | POSITION DETERMINATION UTILIZING A CORDLESS DEVICE - A system for generating position information includes a reflector, an image collection system, and a processor. The image collection system is configured to collect at least two sets of image data, where one set of image data includes a stronger indication of the reflector than the other set of image data. The two sets of image data can be collected in many different ways and may include using a retroreflector as the reflector. The two sets of image data are used to generate position information related to the reflector. In particular, position information related to the reflector is generated by taking the difference between the two sets of image data. Because one set of image data includes a stronger indication of the reflector than the other set of image data, the difference between the two sets of image data gives a definitive indication of the reflector's position. | 01-07-2010 |
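The difference-of-frames idea above can be sketched with plain nested lists standing in for grayscale image data; picking the largest positive difference as the reflector position is a simplifying assumption:

```python
def reflector_position(frame_on, frame_off):
    """Subtract two grayscale frames (equal-shape nested lists of
    brightness values); the pixel with the largest positive difference
    marks where the reflector appears more strongly in frame_on."""
    best, pos = None, None
    for r, (row_on, row_off) in enumerate(zip(frame_on, frame_off)):
        for c, (a, b) in enumerate(zip(row_on, row_off)):
            d = a - b  # difference between the two sets of image data
            if best is None or d > best:
                best, pos = d, (r, c)
    return pos
```

Taking the difference cancels ambient features common to both frames, which is why the reflector, bright in only one frame, dominates the result.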
20100007601 | GAZE INTERACTION FOR INFORMATION DISPLAY OF GAZED ITEMS - An interactive method and system include at least one detector ( | 01-14-2010 |
20100007602 | IMAGE DISPLAY DEVICE - An image display device easily displays a stereoscopically two-dimensional image, improving its direction effect and interactivity. A first display displays a first image on a first screen. An image transmission element in a light path for a display light component of the first image transmits the display light component of the first image, displaying a real image of the first image on an image forming surface positioned at a distance on a side opposite the first screen as a stray image. A second display displays a second image on a second screen as a directly visible image so the stray image is observable from an observation position. A position detecting element outputs a position signal corresponding to detected object position. A control element controls the first or second display according to the output position signal so the stray and/or directly visible image changes correspondingly with the position of the object. | 01-14-2010 |
20100013757 | IMAGE PROCESSING DEVICE AND IMAGE PROCESSING METHOD - The operator of an image processing apparatus can easily search for desired image parts from a series of images and perform editing operations. The image processing apparatus includes an image generating means for generating display video data of a plurality of images to be displayed respectively in a plurality of image display sections on a display screen from image data, a display type determining means for determining display types indicating display modes of displaying the images of the image data on a picture by picture basis or GOP by GOP basis according to variations expressing extents of change of the image data, a parameter altering means for altering the display parameters or the reproduction parameters corresponding to the display video data according to the type information expressing the display types on a picture by picture basis or GOP by GOP basis as determined by the display type determining means and an image processing means for displaying the images to be displayed in the form of moving image on the display screen with time lags in the display sequence, using the display parameters or the reproduction parameters altered by the parameter altering means. | 01-21-2010 |
20100013758 | HUMAN INTERFACE DEVICE (HID) - A user may wear the radio frequency human interface device on a body portion and move the body portion over any even and un-even surface that is not touch sensitive to provide inputs. The radio frequency human interface device may sense, encode, and provide the radio frequency signals to a computing system. The computing system may be provisioned with a radio frequency reader that may receive the radio frequency signal and decode the radio frequency signal before responding to the input. Also, a plurality of users may use radio frequency human interface devices to provide inputs to the computing system concurrently. | 01-21-2010 |
20100013759 | KVM SWITCH WITH SEPARATE ON-SCREEN DISPLAY AND CONTROL CHANNELS - The present invention relates to a KVM switch with separate on-screen display and control channels, particularly a KVM switch connected to a plurality of computers, monitors and operation devices. The KVM switch comprises a switch circuit connected with the computers, a video switch circuit connected with the switch circuit for sending a signal to the monitors, a separate control system connected to the video switch circuit for receiving a function command from the operation devices, and a video control and display unit connected to the separate control system for producing and sending a video signal and synchronous signal to the monitors through the video switch circuit. Because the video control and display unit can transmit the video signal containing OSD (on-screen display) functions and display contents to the video switch circuit through a separate channel, and the separate control system also sends a switching signal to the video switch circuit to shut off the video signal from the computers, an OSD image displayed on the monitors has fixed settings and is easy for a user to view and operate. | 01-21-2010 |
20100013760 | VOICE INPUT DEVICE - Provided is a voice input device in which a content spoken by a user is reflected on a confirmation screen when executing the content, thereby allowing the user to confirm that a command, corresponding to the content which is spoken by the user who intended for the execution, is executed after being recognized by the voice input device. The voice input device comprises: a command storage section ( | 01-21-2010 |
20100013761 | Systems And Methods For Shifting Haptic Feedback Function Between Passive And Active Modes - Systems and methods for shifting haptic feedback function between passive and active modes are disclosed. For example, one disclosed method includes receiving a first signal from a sensor, the first signal associated with a mode of interaction with a graphical user interface; receiving a second signal associated with an interaction with the graphical user interface; determining a haptic feedback effect based at least in part on the mode of interaction with the graphical user interface and the interaction with the graphical user interface; and generating a haptic signal configured to output the haptic feedback effect. | 01-21-2010 |
20100013762 | USER DEVICE FOR GESTURE BASED EXCHANGE OF INFORMATION, METHODS FOR GESTURE BASED EXCHANGE OF INFORMATION BETWEEN A PLURALITY OF USER DEVICES, AND RELATED DEVICES AND SYSTEMS - A user device is disclosed comprising; | 01-21-2010 |
20100026623 | VELOCITY STABILIZATION FOR ACCELEROMETER BASED INPUT DEVICES - A method and apparatus for reducing or eliminating tracking errors associated with accelerometer-based input devices. The method and apparatus calculates a velocity of the input device; determines if the calculated velocity indicates a motion tracking error; and ignores the calculated velocity if the motion tracking error is indicated. | 02-04-2010 |
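The error-rejection loop described above (calculate a velocity, test it against a plausibility bound, ignore it when the bound is exceeded) can be sketched as follows; the speed limit and time step are hypothetical values:

```python
class VelocityStabilizer:
    """Integrate acceleration into velocity; treat implausibly large
    results as motion tracking errors and ignore them."""
    def __init__(self, max_speed=2.0, dt=0.01):
        self.v = 0.0              # current stabilized velocity
        self.max_speed = max_speed  # assumed plausibility bound
        self.dt = dt              # assumed sample interval (s)

    def step(self, accel):
        candidate = self.v + accel * self.dt  # calculated velocity
        if abs(candidate) > self.max_speed:
            return self.v  # tracking error indicated: ignore it
        self.v = candidate
        return self.v
```

The key design choice, as in the abstract, is that an out-of-range sample is discarded rather than clamped, so a single spurious accelerometer spike does not corrupt the integrated velocity.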
20100026624 | INTERACTIVE DIRECTED LIGHT/SOUND SYSTEM - An interactive directed beam system is provided. In one implementation, the system includes a projector, a computer and a camera. The camera is configured to view and capture information in an interactive area. The captured information may take various forms, such as, an image and/or audio data. The captured information is based on actions taken by an object, such as, a person within the interactive area. Such actions include, for example, natural movements of the person and interactions between the person and an image projected by the projector. The captured information from the camera is then sent to the computer for processing. The computer performs one or more processes to extract certain information, such as, the relative location of the person within the interactive area for use in controlling the projector. Based on the results generated by the processes, the computer directs the projector to adjust the projected image accordingly. The projected image can move anywhere within the confines of the interactive area. | 02-04-2010 |
20100026625 | CHARACTER INPUT DEVICE - A character input device is disclosed. In one embodiment, the character input device includes an input unit, press detection units, movement detection units, and a control unit. The input unit is provided as a single body such that first directional input, which is performed by pressing one of first direction indication locations arranged radially from a reference location and spaced apart from one another, and second directional input, which is performed through movement from each of the first direction indication locations to one of second direction indication locations arranged radially around the first direction indication location, can be performed. The press detection units detect the first directional input. The movement detection units detect the second directional input. The control unit extracts a character code, assigned to each selected one of the direction indication locations, from a memory unit based on results of the detection. | 02-04-2010 |
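The two-stage directional selection above (a pressed first direction followed by a movement to a second direction yields one character code) can be sketched as a table lookup. The compass-style direction names and the character assignments are entirely hypothetical:

```python
# Hypothetical two-stage radial layout: the first direction selects a
# group, the second direction selects a character within that group.
LAYOUT = {
    ("N", "N"): "a", ("N", "E"): "b", ("N", "S"): "c",
    ("E", "N"): "d", ("E", "E"): "e",
}

def decode(first_dir, second_dir, layout=LAYOUT):
    """Extract the character code assigned to the pressed first direction
    indication location and the subsequent movement direction; return
    None when no character is assigned to that pair."""
    return layout.get((first_dir, second_dir))
```

With, say, 8 first directions and 8 second directions, such a scheme addresses up to 64 codes from a single compact input body.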
20100033422 | SYSTEMS AND METHODS FOR PROCESSING MOTION SENSOR GENERATED DATA - Systems and methods for processing data from a motion sensor to detect intentional movements of a device are provided. An electronic device having a motion sensor may process motion sensor data along one or more dimensions to generate an acceleration value representative of the movement of the electronic device. The electronic device may then determine whether the acceleration value changes from less than a low threshold, to more than a high threshold, and again to less than the low threshold within a particular amount of time, reflecting an intentional movement of the electronic device by the user. In response to determining that the acceleration value is associated with an intentional movement of the electronic device, the electronic device may perform a particular event or operation. For example, in response to detecting that an electronic device has been shaken, the electronic device may shuffle a media playlist. | 02-11-2010 |
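The low-high-low threshold crossing described above can be sketched as a small state machine over acceleration samples; the threshold values and the sample-count time window are illustrative assumptions:

```python
def is_intentional_shake(samples, low=0.5, high=2.0, max_samples=20):
    """Return True when the acceleration magnitude goes from below `low`
    to above `high` and back below `low` within `max_samples` readings,
    reflecting an intentional movement of the device."""
    state, count = "idle", 0
    for a in samples:
        count += 1
        if count > max_samples:
            return False  # too slow: not the intentional pattern
        if state == "idle" and a < low:
            state, count = "armed", 0  # start the timing window at rest
        elif state == "armed" and a > high:
            state = "spiking"          # crossed the high threshold
        elif state == "spiking" and a < low:
            return True                # returned below low in time
    return False
```

On detection, the device would then trigger the associated event, e.g. shuffling a media playlist as in the abstract's example.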
20100033423 | Portable Electronic Apparatus and Input Operation Determining Method - A portable electronic apparatus | 02-11-2010 |
20100033424 | ELECTRONIC APPARATUS AND CONTROL METHOD THEREFOR - An electronic apparatus and a control method are provided that are capable of reducing power consumption. The electronic apparatus having a normal mode in which first electric power is consumed and a power-saving mode in which second electric power lower than the first electric power is consumed includes a first sensor and a second sensor whose power consumption is lower than that of the first sensor. In the power-saving mode, supply of power to the first sensor is restricted, the second sensor is set to the power-saving mode, a trigger for restoring the power-saving mode to the normal mode is detected by using the second sensor set to the power-saving mode, and the power-saving mode is restored to the normal mode based on the detected trigger. | 02-11-2010 |
20100033425 | COMPUTER PERIPHERAL DEVICE INTERFACE FOR SIMULATING MOUSE, KEYBOARD, OR GAME CONTROLLER ACTION AND METHOD OF USE - A computer peripheral device interface includes a microprocessor and unique coding/USB functionality to enable a foot-, feet-, hands-, or body-operated computer peripheral device to function as each of a programmable mouse, joystick, and/or keyboard. The inventive peripheral device interface can include a communications channel to support various custom peripherals linked from or to the platform or other computer peripheral device, i.e. a platform, pad, cycle, or balance apparatus. | 02-11-2010 |
20100033426 | Haptic Enabled Gaming Peripheral for a Musical Game - A haptic enabled gaming peripheral that simulates a musical instrument includes a body, a first sensing element and a first actuator. A processor, located within the body of the gaming peripheral, communicates with a host computer running a software program corresponding to a musical game. The first sensing element, disposed within the body and coupled to the processor, senses an input from the user. The sensed input is communicated to the host processor. The first actuator, disposed within the body and coupled to the processor, outputs a haptic effect in response to receiving an activating signal based on an event that occurs in the software program. In some implementations, the first sensing element is disposed proximate to the first actuator so that the user perceives the haptic effect in response to providing the input. | 02-11-2010 |
20100033427 | Computer Image and Audio Processing of Intensity and Input Devices for Interfacing with a Computer Program - A method and system for determining an intensity value of an interaction with a computer program is described. The method and device include capturing an image of a capture zone, identifying an input object in the image, identifying an initial value of a parameter of the input object, capturing a second image of the capture zone, and identifying a second value of the parameter of the input object. The parameter identifies one or more of a shape, color, or brightness of the input object and is affected by human manipulation of the input object. The extent of change in the parameter is calculated, which is the difference between the second value and the first value. An activity input is provided to the computer program, the activity input including an intensity value representing the extent of change of the parameter. A method for detecting an intensity value from sound generating input objects, and a computer video game are also described. | 02-11-2010 |
20100039371 | ARRANGEMENT FOR SELECTIVELY VIEWING AND HIDING USER INTERFACE ITEMS ON A BODY OF A COMMUNICATION APPARATUS, AND COMMUNICATION APPARATUS COMPRISING SUCH ARRANGEMENT - An arrangement for selectively viewing and hiding user interface items on a body of a communication apparatus is disclosed. The arrangement comprises a light source arranged inside the body and arranged to project light towards a surface of the body such that when the light source is in an on-state, the user interface item is viewed on the surface of the body; and a polarizing layer arranged with the surface of the body at least where the user interface item is to be viewed, such that the surface there, when the light source is in an off-state and the user interface item is hidden, appears in a primary color of the body. A communication apparatus comprising such arrangement is also disclosed. | 02-18-2010 |
20100039372 | MOBILE DEVICE DISPLAY IMPLEMENTATION - A mobile communication device contains a plurality of displays. A control module and one of the displays is contained by a first housing. Another housing, bearing a second display, is slidably engageable with the first housing. An optical data transmission mechanism is coupled between the control module and the second display. Data generated by the control module can be converted to optical signals and transmitted through the optical transmission path for control of the second display. | 02-18-2010 |
20100039373 | Hybrid Control Of Haptic Feedback For Host Computer And Interface Device - A hybrid haptic feedback system in which a host computer and haptic feedback device share processing loads to various degrees in the output of haptic sensations, and features for efficient output of haptic sensations in such a system. A haptic feedback interface device in communication with a host computer includes a device microcontroller outputting force values to the actuator to control output forces. In various embodiments, the microcontroller can determine force values for one type of force effect while receiving force values computed by the host computer for a different type of force effect. For example, the microcontroller can determine closed loop effect values and receive computed open loop effect values from the host; or the microcontroller can determine high frequency open loop effect values and receive low frequency open loop effect values from the host. Various features allow the host to efficiently stream computed force values to the device. | 02-18-2010 |
20100039374 | ELECTRONIC DEVICE AND METHOD FOR VIEWING DISPLAYABLE MEDIAS - A method adapted for an electronic device for viewing media is provided. The method includes: detecting a motion of the electronic device; determining whether the motion of the electronic device is a first control motion or a second control motion; controlling a display unit to display a previous or a next media item if the motion of the electronic device is the first control motion; and controlling the display unit to display a media item from a previous or a next album if the motion of the electronic device is the second control motion. | 02-18-2010 |
20100039375 | Signal Processing Method of Multi-Finger Touch Supported Touch Apparatus having Hidden Physical Button - A signal processing method of a multi-finger touch supported touch apparatus having hidden physical button is applied to the multi-finger touch supported touch apparatus having at least one physical button element pair. The method includes the steps of scanning a touch sensor of the multi-finger touch supported touch apparatus continuously, judging whether the physical button element pair is pressed if a total number of the fingers detected is larger than zero and driving a corresponding touch application according to the total number of fingers of a gesture and a button. The structure of the multi-finger touch supported touch apparatus which is stacked includes a Mylar, a first adhesive layer, a printed circuit board having a first part of the physical button element pair and the touch sensor at least, a second adhesive layer, a first metallic layer for holding and a second metallic layer having a second part of the physical button element pair. | 02-18-2010 |
20100039376 | SYSTEM AND METHOD FOR REDUCING POWER CONSUMPTION OF A DISPLAY DEVICE - A system and method for reducing power consumption of a display device initializes a counter as zero, controls a video camera to capture an image of an object that is in front of the display device, and determines whether any user's eyes are viewing the display device by analyzing facial features of the captured image. The system and method further controls the display device to work in a normal display mode if any user's eyes are viewing the display device, and increments the counter each second if no user's eyes are viewing the display device. Additionally, the system and method controls the display device to work in a display protection mode if the counter is more than a first predefined threshold number and less than a second predefined threshold number, and in a power reducing mode if the counter is not less than the second predefined threshold number. | 02-18-2010 |
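The counter-to-mode logic above can be sketched as a pure function of the per-second counter; the two threshold values below are illustrative assumptions, not from the patent:

```python
def display_mode(counter, first_threshold=60, second_threshold=300):
    """Map the seconds-without-a-viewer counter to a display mode:
    more than the first threshold and less than the second gives the
    display protection mode; at or above the second gives the power
    reducing mode; otherwise the normal display mode."""
    if counter > first_threshold and counter < second_threshold:
        return "display_protection"
    if counter >= second_threshold:
        return "power_reducing"
    return "normal"  # counter reset to zero while eyes are on the display
```

The caller would reset the counter to zero whenever the face-analysis step finds eyes viewing the display, and increment it once per second otherwise.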
20100039377 | System and Method for Controlling a Virtual Reality Environment by an Actor in the Virtual Reality Environment - A motion capture environment includes at least one sensor-tracker for tracking a location of a tracked object within the motion capture environment and one or more computers collectively operable to generate a virtual reality environment including a virtual control panel having a virtual control that, when actuated, effects a predetermined result in the virtual reality environment; determine a virtual location of the tracked object within the virtual reality environment; and determine when the virtual location of the tracked object coincides with the location of the virtual control to actuate the virtual control. The motion capture environment further includes a display device for displaying the virtual reality environment to an actor within the motion capture environment. | 02-18-2010 |
20100039378 | Information Processing Apparatus, Method and Program - An information processing apparatus includes an imaging unit, an icon display control unit causing a display to display an operation icon, a pickup image display processing unit causing the display to sequentially display an input operation region image constituted by, among pixel regions constituting an image picked up by the imaging unit, a pixel region including at least a portion of a hand of a user, an icon management unit managing event issue definition information, which is a condition for determining that the operation icon has been operated by the user, for each operation icon, an operation determination unit determining whether the user has operated the operation icon based on the input operation region image displayed in the display and the event issue definition information, and a processing execution unit performing predetermined processing corresponding to the operation icon in accordance with a determination result by the operation determination unit. | 02-18-2010 |
20100039379 | Enhanced Multi-Touch Detection - Enhanced multi-touch detection, in which a graphical user interface for an application is projected onto a surface, and electromagnetic radiation is emitted. The electromagnetic radiation is collectively emitted by an array defining a layer aligned parallel with the surface and overlapping at least a region of the surface onto which the graphical user interface is projected. Electromagnetic radiation is detected that reflects off of an object interrupting the defined layer where the defined layer overlaps the region of the surface onto which the graphical user interface is projected, and indicating a position of the object is output. | 02-18-2010 |
20100039380 | Movable Audio/Video Communication Interface System - A system that includes a desktop assembly of a display and sensors mounted on a robotic arm. The arm moves the assembly so that it remains within position and orientation tolerances relative to the user's head as the user looks around. Near-field speaker arrays supply audio and a microphone array senses a user's voice. Filters are applied to head motion to reduce latency in the arm's tracking of the head. The system is full duplex with other systems allowing immersive collaboration. Lighting and sound generation take place close to the user's head. A haptic interface device allows the user to grab the display/sensor array and move it about. Motion acts as a planar selection device for 3D data. Planar force feedback allows a user to “feel” the data. Users see not only each other through display windows, but can also see the positions and orientations of each other's planar selections of shared 3D models or data. | 02-18-2010 |
20100045593 | DIRECTIONAL INPUT DEVICE - The object of the invention is to provide a directional input device that changes the distance between electrodes that are included in a capacitive element in correspondence with a sliding direction (that is, input direction) to which an input unit is made to slide, thereby changing the electrostatic capacity of the capacitive element. | 02-25-2010 |
20100045594 | SYSTEMS, METHODS, AND DEVICES FOR DYNAMIC MANAGEMENT OF DATA STREAMS UPDATING DISPLAYS - Presented herein are methods, systems, devices, and computer-readable media for dynamic management of data streams updating displays. Some of the embodiments herein generally relate to presenting video image data on an array of tiled display units, thereby allowing the display of much larger images than can be shown on a single display. Each display unit can include a video image display, a communication mechanism, such as a network interface card or wireless interface card, and a video image controller, such as a graphics card. Attached to the tiled display may be one or more user computers or other sources of video image data. A workstation may also be coupled to the tiled display and to the user computers. Each of the user computers can display data or images on the tiled display simultaneously. Since the tiled display is made up of multiple display units, the images from a single user computer may be on multiple, separate individual display units. The images from multiple user computers could also be shown on the same display unit and they may even overlap. | 02-25-2010 |
20100053069 | MOBILE COMPUTING SYSTEM FACILITATING ADAPTIVE DISPLAY OF CONTENT AMONG A PLURALITY OF DISPLAY COMPONENTS INCLUDING AT LEAST ONE VIRTUAL IMAGE DISPLAY COMPONENT - Systems, devices, and/or methods that facilitate adaptive display of content among a plurality of display components including at least one virtual image display component are presented. The disclosed subject matter facilitates forming determinations or inferences based on various metrics for adaptively routing content to select display devices. The display interface component, at least in part, routes content selectively between a primary display component and a virtual image display (VID) component. This can better optimize the use of VID components to avoid overstimulation of a user while allowing access to the benefits of the VID component under predetermined conditions. | 03-04-2010 |
20100053070 | MULTI-DIMENSIONAL OPTICAL CONTROL DEVICE AND A CONTROLLING METHOD THEREOF - A multi-dimensional optical control device and a method thereof are provided. A movable light source can be moved by an external action and produces a light beam. A lens coupled to the light source focuses the light beam. A sensor senses a spot formed on the sensor by the focused light beam, and a data processing circuit coupled to the sensor obtains variations of position, shape and light intensity with respect to a reference spot. According to such variations of position, shape and light intensity, the data processing circuit performs a motion control of multiple dimensions. | 03-04-2010 |
20100053071 | Display control of classified content based on flexible display containing electronic device conformation - A method includes, but is not limited to: obtaining first information associated with one or more conformations of one or more portions of one or more regions of a flexible display containing electronic device and controlling display of one or more portions of the flexible display containing electronic device regarding display of second information having one or more classifications in response to the first information associated with the one or more conformations of the one or more portions of the one or more regions of the flexible display containing electronic device. In addition to the foregoing, other related method/system aspects are described in the claims, drawings, and text forming a part of the present disclosure. | 03-04-2010 |
20100053072 | Application control based on flexible interface conformation sequence status - A method includes, but is not limited to: obtaining information associated with one or more changes in one or more sequences of two or more conformations of one or more portions of one or more regions of the flexible interface and coordinating the one or more changes in one or more sequences of two or more conformations of one or more portions of one or more regions of the flexible interface with one or more commands. In addition to the foregoing, other related method/system aspects are described in the claims, drawings, and text forming a part of the present disclosure. | 03-04-2010 |
20100053073 | Display control based on bendable display containing electronic device conformation sequence status - A method includes, but is not limited to: obtaining information associated with one or more sequences of two or more conformations of one or more portions of one or more regions of a bendable display containing electronic device and controlling display of one or more portions of the bendable display containing electronic device regarding display of second information in response to the information associated with the one or more sequences of two or more conformations of the one or more portions of the one or more regions of the bendable display containing electronic device. In addition to the foregoing, other related method/system aspects are described in the claims, drawings, and text forming a part of the present disclosure. | 03-04-2010 |
20100053074 | Display control based on bendable display containing electronic device conformation sequence status - A system includes, but is not limited to: one or more conformation sensor modules configured to direct obtaining information associated with one or more sequences of two or more conformations of one or more portions of one or more regions of a bendable display containing electronic device and one or more display control modules configured to direct controlling display of one or more portions of the bendable display containing electronic device regarding display of second information in response to the information associated with the one or more sequences of two or more conformations of the one or more portions of the one or more regions of the bendable display containing electronic device. In addition to the foregoing, other related method/system aspects are described in the claims, drawings, and text forming a part of the present disclosure. | 03-04-2010 |
20100053075 | Display control based on bendable interface containing electronic device conformation sequence status - A method includes, but is not limited to: obtaining information associated with one or more changes in one or more sequences of two or more conformations of one or more portions of one or more regions of a bendable interface containing electronic device and controlling display of one or more portions of the bendable interface containing electronic device regarding display of second information in response to the information associated with the one or more changes in one or more sequences of two or more conformations of the one or more portions of the one or more regions of the bendable interface containing electronic device. In addition to the foregoing, other related method/system aspects are described in the claims, drawings, and text forming a part of the present disclosure. | 03-04-2010 |
20100053076 | Display control based on bendable interface containing electronic device conformation sequence status - A system includes, but is not limited to: one or more display control modules configured to direct controlling display of one or more portions of the bendable interface containing electronic device regarding display of second information in response to the information associated with the one or more changes in one or more sequences of two or more conformations of the one or more portions of the one or more regions of the bendable interface containing electronic device. In addition to the foregoing, other related method/system aspects are described in the claims, drawings, and text forming a part of the present disclosure. | 03-04-2010 |
20100053077 | NOTEBOOK COMPUTER WITH FORCE FEEDBACK FOR GAMING - A notebook computer with force feedback for gaming is provided. A vibrating plate is disposed on a host, and a vibration generating device is disposed at the bottom of the vibrating plate to generate vibration when a computer game is executed. Thus, a user playing the computer game may feel the force feedback. Additionally, to prevent vibration from affecting the normal operation of the host, a damper is disposed between the host and the vibrating plate to prevent vibration from being transmitted to the host. | 03-04-2010 |
20100053078 | INPUT UNIT, MOVEMENT CONTROL SYSTEM AND MOVEMENT CONTROL METHOD USING THE SAME - Exemplary embodiments of the present invention relate to a movement control method and system of a terminal input unit having a plurality of protrusions. Exemplary embodiments of the present invention disclose a process and an apparatus for controlling each protrusion so that the input unit forms interface modes used for a function control according to a user function of the terminal. | 03-04-2010 |
20100053079 | REMOTE CONTROL SYSTEM INCLUDING A DISPLAY PANEL AND A TERMINAL FOR REMOTE-CONTROLLING THE DISPLAY PANEL, AND REMOTE CONTROL METHOD IN THE REMOTE CONTROL SYSTEM - Disclosed are a remote control method in a remote control system including a display panel and a terminal for remote-controlling the display panel, and the remote control system. The remote control system includes a terminal for transmitting a light signal to a point on the screen of a display panel according to user input, and a display panel for executing an operation corresponding to the point at which the light signal is received. | 03-04-2010 |
20100060567 | CONTROLLING DEVICE OPERATION RELATIVE TO A SURFACE - Architecture for automatic switching between multiple modes in a handheld device such as a mouse based on the presence of specular light, or lack thereof. When applied to a presenter mouse, the architecture facilitates the automatic switching between mouse mode and presenter mode without manual intervention by the user. An optical approach is well suited since most optical systems include a light source, lenses, and light sensors to detect reflected light from the source (or lack thereof). The approach leverages the existing light source and lenses in a mouse to minimize incremental cost, yet provide a robust technique for detecting lift from the tracking surface thereby automatically switching between modes as the user moves the mouse on and off the tracking surface. A delay circuit and/or image comparison can also be provided that eliminates undesirable triggering to a different mode by preventing unintended switching between the multiple modes. | 03-11-2010 |
20100060568 | CURVED SURFACE INPUT DEVICE WITH NORMALIZED CAPACITIVE SENSING - A curved surface input device with normalized capacitive sensing is disclosed. The input device can normalize capacitive sensing through an overlay having a varying thickness, such as an overlay with a curved surface. The capacitive sensing normalization can be implemented in software, hardware or a combination of software and hardware. A software implementation for normalizing capacitive sensing can comprise adjusting the sensitivity of a sensing operation associated with different sensor elements of the input device. A hardware implementation for normalizing capacitive sensing can comprise adjusting a hardware configuration of the input device associated with one or more physical parameters that can influence the capacitive sensitivity of the sensor elements, such as an area of the sensor elements, a distance between the sensor elements and other conductive input device elements (such as a ground plane), and a dielectric constant associated with the overlay. | 03-11-2010 |
20100060569 | WIRELESS REMOTE CONTROL HAVING MOTION-BASED CONTROL FUNCTIONS AND METHOD OF MANUFACTURE THEREOF - A wireless remote control and a method of manufacturing the same. In one embodiment, the wireless remote control includes: ( | 03-11-2010 |
20100060570 | Control System for Navigating a Principal Dimension of a Data Space - Systems and methods are described for navigating through a data space. The navigating comprises detecting a gesture of a body from gesture data received via a detector. The gesture data is absolute three-space location data of an instantaneous state of the body at a point in time and physical space. The detecting comprises identifying the gesture using the gesture data. The navigating comprises translating the gesture to a gesture signal, and navigating through the data space in response to the gesture signal. The data space is a data-representational space comprising a dataset represented in the physical space. | 03-11-2010 |
20100066662 | IMAGE DISPLAY DEVICE - An image display device makes it possible to easily display a stereoscopically two-dimensional image and to improve its direction effect. A display device includes a first display for displaying a first image on a first screen, an image transmission element that is set in a light path for a display light component of the first image and that transmits the display light component of the first image so that a real image of the first image is displayed on an image forming surface positioned at a space on a side opposite to the first screen as a stray image. Further, the display device includes a controller for controlling the first display so that at least one icon out of a plurality of icons (A-H) is displayed as a stray image disposed along a virtual path with a predetermined shape set in a real space portion including the first mentioned space. | 03-18-2010 |
20100066663 | REMOTE CONTROL POINTING TECHNOLOGY - A system is disclosed comprising a pointing device ( | 03-18-2010 |
20100066664 | WRIST-WORN INPUT APPARATUS AND METHOD - Provided are a wrist-worn input apparatus and method. The apparatus and method can segment meaningful hand gestures using a vibration sound generated by a finger tapping and can issue commands corresponding to the recognized hand gestures, enabling the user to easily input commands to electronic devices. | 03-18-2010 |
20100066665 | Tablet Computer Equipped with Microphones - A tablet PC capable of providing continuous utilization of a sound signal collected from a microphone without requiring any user intervention when a use mode thereof has been changed from a PC use mode to a tablet use mode is disclosed. The tablet PC includes a set of microphones to form a microphone array. The tablet PC is able to operate in a sound emphasis mode wherein sound signals collected from the microphones are processed while forming an emphasis space, and to operate in a non-processing mode wherein the sound signals are processed without forming the emphasis space. When a user changes the chassis orientation of the tablet PC from the PC use mode to the tablet use mode, the tablet PC operates to process the emphasis space so that the sound signals collected by the microphones can be utilized in the tablet use mode. | 03-18-2010 |
20100066666 | DISPLAY APPARATUS AND IMAGE DISPLAY METHOD THEREOF - A display apparatus and an image display method thereof are provided. The display apparatus automatically displays images stored in a memory in a pre-set order, and displays the images in various sequences if a change in position of the display apparatus is detected by a motion sensor. Accordingly, the sequence in which the images are displayed is easily changed. | 03-18-2010 |
20100066667 | ORIENTING A DISPLAYED ELEMENT RELATIVE TO A USER - An element is initially displayed on an interactive touch-screen display device with an initial orientation relative to the interactive touch-screen display device. One or more images of a user of the interactive touch-screen display device are captured. The user is determined to be interacting with the element displayed on the interactive touch-screen display device. In addition, an orientation of the user relative to the interactive touch-screen display device is determined based on at least one captured image of the user of the interactive touch-screen display device. Thereafter, in response to determining that the user is interacting with the displayed element, the initial orientation of the displayed element relative to the interactive touch-screen display device is automatically adjusted based on the determined orientation of the user relative to the interactive touch-screen display device. | 03-18-2010 |
20100066668 | MECHANISM FOR DISPLAYING PAGINATED CONTENT ON ELECTRONIC DISPLAY DEVICES - A computing device is provided that includes a display comprising a plurality of discrete elements. A memory is used to store a data collection of paginated content. A processor of the computing device is configured to retrieve each of the pages from the memory. The processor signals the display to individually present each of the pages. A sensor device is coupled to the processor. The sensor device is deflectable to signal the processor a deflection value that causes the processor to sequentially present at least portions of multiple pages on the display. | 03-18-2010 |
20100073283 | CONTROLLER WITH USER-SELECTABLE DISCRETE BUTTON EMULATION - A user device with a position control device such as a thumbstick may be used to emulate discrete button presses via user selection of a mode switch on the device. | 03-25-2010 |
20100073284 | SYSTEM AND METHOD FOR ANALYZING MOVEMENTS OF AN ELECTRONIC DEVICE - The disclosure relates to a system and method for analyzing movements of a handheld electronic device. The system comprises: memory; a microprocessor; a first module to generate movement data responsive to movements of the device; a second module providing instructions to the microprocessor to map the movement data to a string representation relating to symbols in a spatial coordinate system associated with the device and store the string representation in the memory; and a third module. The third module provides instructions to the microprocessor to analyze data relating to the string representation against data relating to a gesture string representing a gesture related to a command for the device to determine if the gesture has been imparted on the device; and if the string representation sufficiently matches the gesture string, executes a command associated with the gesture on the device. | 03-25-2010 |
20100073285 | DISPLAY DEVICE FOR DISPLAYING A SYSTEM STATE - A display device for displaying a system state has an output unit for reproduction of information characteristic of the system state, a sensor for detecting a distance between a user of the display device and the output unit, and a control unit for processing the reproduction of the information in the output unit as a function of the detected distance. | 03-25-2010 |
20100079369 | Using Physical Objects in Conjunction with an Interactive Surface - An interaction management module (IMM) is described for allowing users to engage an interactive surface in a collaborative environment using various input devices, such as keyboard-type devices and mouse-type devices. The IMM displays digital objects on the interactive surface that are associated with the devices in various ways. The digital objects can include input display interfaces, cursors, soft-key input mechanisms, and so on. Further, the IMM provides a mechanism for establishing a frame of reference for governing the placement of each cursor on the interactive surface. Further, the IMM provides a mechanism for allowing users to make a digital copy of a physical article placed on the interactive surface. The IMM also provides a mechanism which duplicates actions taken on the digital copy with respect to the physical article, and vice versa. | 04-01-2010 |
20100079370 | Apparatus and method for providing interactive user interface that varies according to strength of blowing - An apparatus includes an interactive user interface that varies according to strength of blowing. A sensor detects blowing from a user and outputs a strength value of the detected blowing. A memory stores interactive image data capable of giving an effect of responding to user's blowing strength in real time. A controller loads interactive image data corresponding to the strength value of the blowing from the memory, and configures an interactive image that varies according to the strength value of the blowing, using the loaded interactive image data. A display displays the configured interactive image under control of the controller. | 04-01-2010 |
20100079371 | TERMINAL APPARATUS, DISPLAY CONTROL METHOD, AND DISPLAY CONTROL PROGRAM - A terminal apparatus includes a display displaying a plurality of display elements representing options on a display screen, an image-capturing section capturing an image of an operator who is viewing the display screen, a face position detector detecting the position of a facial image of the operator in a captured image, and a controller controlling the display to move the plurality of display elements in a predetermined direction on the display screen and to sequentially update and display the display elements when it is detected that the facial image of the operator in the captured image is outside of a predetermined range, and to stop the movement of the plurality of display elements when it is detected that the facial image falls within the predetermined range. | 04-01-2010 |
20100085300 | Bendable electronic interface external control system and method - A method includes, but is not limited to: obtaining first information regarding one or more positions of one or more portions of one or more regions of a bendable electronic interface and sending one or more application related information portions to the bendable electronic interface based upon the obtaining of the first information. In addition to the foregoing, other related method/system aspects are described in the claims, drawings, and text forming a part of the present disclosure. | 04-08-2010 |
20100085301 | Bendable electronic interface external control system and method - A system includes, but is not limited to: one or more information obtaining modules configured to direct obtaining first information regarding one or more positions of one or more portions of one or more regions of a bendable electronic interface and one or more application information sending modules configured to direct sending one or more application related information portions to the bendable electronic interface based upon the obtaining of the first information. In addition to the foregoing, other related method/system aspects are described in the claims, drawings, and text forming a part of the present disclosure. | 04-08-2010 |
20100090945 | VIRTUAL INPUT SYSTEM AND METHOD - The invention provides a virtual input system. The virtual input system comprises a trajectory generating apparatus and a receiving apparatus. The receiving apparatus comprises a sensing module, a coding module, a database, and a comparing module. The sensing module is used for sensing a trajectory information of the trajectory generating apparatus. The coding module converts the trajectory information to a specific code series according to a coding rule. The database stores a plurality of reference code series and a plurality of reference symbols corresponding to the plurality of reference code series. The comparing module compares the specific code series with the plurality of reference code series to determine at least one candidate symbol from the plurality of reference symbols. | 04-15-2010 |
20100090946 | System and Method for Gesture Based Control System - The system provides a gestural interface to various visually presented elements, presented on a display screen or screens. A gestural vocabulary includes ‘instantaneous’ commands, in which forming one or both hands into the appropriate ‘pose’ results in an immediate, one-time action; and ‘spatial’ commands, in which the operator either refers directly to elements on the screen by way of literal ‘pointing’ gestures or performs navigational maneuvers by way of relative or “offset” gestures. The system contemplates the ability to identify the user's hands in the form of a glove or gloves with certain indicia provided thereon, or any suitable means for providing recognizable indicia on a user's hands or body parts. A system of cameras can detect the position, orientation, and movement of the user's hands and translate that information into executable commands. | 04-15-2010 |
20100090947 | System and Method for Gesture Based Control System - The system provides a gestural interface to various visually presented elements, presented on a display screen or screens. A gestural vocabulary includes ‘instantaneous’ commands, in which forming one or both hands into the appropriate ‘pose’ results in an immediate, one-time action; and ‘spatial’ commands, in which the operator either refers directly to elements on the screen by way of literal ‘pointing’ gestures or performs navigational maneuvers by way of relative or “offset” gestures. The system contemplates the ability to identify the user's hands in the form of a glove or gloves with certain indicia provided thereon, or any suitable means for providing recognizable indicia on a user's hands or body parts. A system of cameras can detect the position, orientation, and movement of the user's hands and translate that information into executable commands. | 04-15-2010 |
20100090948 | Apparatus, system, method, and program for processing information - An information processing apparatus includes a sensor for generating a sensor output signal responsive to the three-dimensional coordinate position of a detection target in a monitor space by detecting a capacitance in the monitor space, and outputting the sensor output signal, a position detector for detecting the three-dimensional coordinate position of the detection target in the monitor space from the sensor output signal of the sensor, a storage unit for storing coordinate information identifying a three-dimensional space region set in the monitor space, a determining unit for determining whether the three-dimensional coordinate position of the detection target in the monitor space is contained in the set three-dimensional space region, based on the three-dimensional coordinate position of the detection target detected by the position detector and the coordinate information stored in the storage unit, and an output unit for outputting determination results of the determining unit. | 04-15-2010 |
20100097309 | INFORMATION PROCESSING APPARATUS AND COMPUTER-READABLE RECORDING MEDIUM RECORDING INFORMATION PROCESSING PROGRAM - Motion information is obtained, which is information about a motion applied to an input device housing itself that includes a pointing device among a plurality of input means. Next, based on the motion information, a movement amount of the input device housing is calculated. Thereafter, it is determined whether or not the movement amount satisfies predetermined conditions. When the predetermined conditions are satisfied, a position is designated based on an output from the pointing device. | 04-22-2010 |
20100097310 | TERMINAL AND CONTROLLING METHOD THEREOF - A terminal having a video communication function includes a camera configured to capture image data, a memory configured to store the captured image data, a wireless communication unit configured to permit a video communication during which the image data is communicated to a correspondent terminal, a display configured to display a user interface during the video communication, and a controller configured to store in the memory, responsive to termination of the video communication with the correspondent terminal, any portions of configuration setting information for the user interface that have been modified during the video communication. | 04-22-2010 |
20100097311 | Reaction Apparatus, Fuel Cell System and Electronic Device - The invention relates to a reaction apparatus that efficiently heats a reaction portion, and a fuel cell system and an electronic device that include such a reaction apparatus. A reaction apparatus ( | 04-22-2010 |
20100097312 | SYSTEM AND METHOD FOR LIGHT CONTROL - A system for light control, comprising a light source ( | 04-22-2010 |
20100097313 | LID STRUCTURE, APPARATUS AND METHOD FOR DISPLAYING GRAPHICAL INFORMATION - A lid structure, apparatus and method for displaying graphical information use beams of coherent light that are emitted in a scanning manner to project the beams of coherent light onto a display surface to form the graphical information on the display surface. | 04-22-2010 |
20100097314 | DRAWING CONTROL APPARATUS AND DRAWING CONTROL METHOD OF ELECTRONIC PAPER - When a user inputs an instruction to display a subsequent screen from an operation unit | 04-22-2010 |
20100103092 | Video-based handwritten character input apparatus and method thereof - A video-based character input apparatus includes an image capturing unit, an image processing unit, a one-dimensional feature coding unit, a character database, a character recognizing unit, a display unit, and a stroke feature database. The image capturing unit captures an image. The image processing unit extracts the moving track of a fingertip from the image by detecting graphic differences, then detecting skin color, and picking out the moving track that best corresponds to the point of the object. The one-dimensional feature coding unit derives strokes from the moving track and converts each stroke into a coding sequence in a one-dimensional string according to a time sequence. The character recognizing unit compares the one-dimensional coding sequence with the character database to find the most similar character for display on the display unit. | 04-29-2010 |
20100103093 | INFORMATION INPUTTING DEVICE, INFORMATION OUTPUTTING DEVICE AND METHOD - Two bar support portions are provided on opposite sides of a floor mat sensor and a horizontal bar is fixed between these portions to define spaces below into which feet are to be inserted. Then, using the horizontal bar as a reference, an input area, such as "A" or "B", can be accurately stepped on. A signal receiver reads and stores a first signal, and a signal determination unit determines whether the signal that was read was generated by stepping on area A or area B. Thereafter, when a signal indicating a data type is received first, the signal receiver reads the next input signal and the signal determination unit determines whether a signal indicating that data type was received. | 04-29-2010 |
20100103094 | Input Device, Input Control Method, Information Recording Medium, and Program - In an input device ( | 04-29-2010 |
20100103095 | INPUT APPARATUS, CONTROL APPARATUS, CONTROL SYSTEM, AND CONTROL METHOD - An input apparatus, a control apparatus, a control system, and a control method that are capable of making a movement of a pointer on a screen a natural movement that matches an intuition of a user are provided. An input apparatus includes a casing, an acceleration sensor, and an angular velocity sensor. The acceleration sensor detects an acceleration value of the casing in a first direction. The angular velocity sensor detects an angular velocity about an axis in a second direction different from the first direction. Instead of calculating a velocity value of the casing by simply integrating the detected acceleration value, the velocity value of the casing in the first direction is calculated based on the acceleration value and the angular velocity value that have been detected. As a result, a highly-accurate calculation of the velocity value of the casing becomes possible, and a movement of a pointer on a screen becomes a natural movement that matches a sense of a user based on a displacement corresponding to the velocity value. | 04-29-2010 |
20100109998 | System and method for sensing facial gesture - A method and system of sensing facial gestures are disclosed. The method of sensing facial gestures includes determining a basic database (DB) for sensing facial gestures of a user, estimating a head pose of the user, extracting a mesh model corresponding to the estimated head pose using at least one of a personalized DB and a general DB, and sensing the facial gestures using the extracted mesh model, and reflecting the sensed facial gestures to a virtual object. | 05-06-2010 |
20100109999 | HUMAN COMPUTER INTERACTION DEVICE, ELECTRONIC DEVICE AND HUMAN COMPUTER INTERACTION METHOD - A human computer interaction device includes a first input device and a second input device, wherein at least a part of the operation region of all or certain functions of the second input device is located in the key operation region of the first input device. The first input device may be a key-input device and the second input device may be a mouse simulation device. An electronic device and a human computer interaction method using the human computer interaction device are further provided. This invention can reduce the combined operating regions of the key input device and the mouse function, and can realize both the key input and mouse functions without moving the forearm, thus increasing work efficiency. | 05-06-2010 |
20100110000 | VISUAL DISPLAY SYSTEM AND METHOD FOR DISPLAYING A VIDEO SIGNAL - A video display system (100) is provided which comprises a display panel (DP) for displaying a video signal (V), at least one lighting unit (LU) for providing a surround or ambient lighting (LS), a user interface (UI) for receiving external user calibration signals and a lighting control unit (LC) for controlling the color and/or luminance of the lighting unit (LU) in dependence on the calibration signals received by the user interface (UI). | 05-06-2010 |
20100110001 | INPUT DEVICE AND METHOD AND PROGRAM - An input device includes a detection unit, a first acquisition unit, a second acquisition unit, and a compensation unit. The detection unit is configured to detect an operation by a user for controlling an electronic device and output an operation signal corresponding to the operation. The first acquisition unit is configured to acquire the detected operation signal and a differential value of the operation signal. The second acquisition unit is configured to acquire a function defined by the differential value to compensate for a delay in response of the operation signal with respect to the operation by the user. The compensation unit is configured to compensate the operation signal with the acquired function. | 05-06-2010 |
20100117953 | Hand-worn interface device - A Hand-Worn Ambidextrous Interface Device for use with interfacing with a computer or similar device includes in some preferred embodiments a Housing, an Angled Face, a plurality of Switches and a Removable Three-Axis Joystick. In some preferred embodiments the Removable Three-Axis Joystick may control the cursor on the monitor or screen of the computer or related device to which it is communicating. | 05-13-2010 |
20100117954 | E-paper display control based on conformation sequence status - A method for one or more portions of one or more regions of an electronic paper assembly having one or more display layers includes, but is not limited to: obtaining first information regarding one or more positions of one or more portions of one or more regions of the electronic paper assembly and sending one or more electronic paper assembly physical status related information portions to the electronic paper assembly based upon the obtaining of the first information. In addition to the foregoing, other related method/system aspects are described in the claims, drawings, and text forming a part of the present disclosure. | 05-13-2010 |
20100117955 | E-paper display control based on conformation sequence status - A system for one or more portions of one or more regions of an electronic paper assembly having one or more display layers includes, but is not limited to: one or more position obtaining modules configured to direct obtaining first information regarding one or more positions of one or more portions of one or more regions of the electronic paper assembly and one or more physical status sending modules configured to direct sending one or more electronic paper assembly physical status related information portions to the electronic paper assembly based upon the obtaining of the first information. In addition to the foregoing, other related method/system aspects are described in the claims, drawings, and text forming a part of the present disclosure. | 05-13-2010 |
20100117956 | DISPLAY SYSTEM AND METHOD FOR CONTROLLING AN ON-SCREEN DISPLAY WITHIN A DISPLAY SCREEN - A display system is used for controlling an on-screen display within a display device. The on-screen display is text information including a literal string formed by combining a number of letters. The display system includes a memory device for storing pieces of letter information, and a controller to download a coding chart upon initialization of the display device. The coding chart includes string-forming codes and letter-forming codes. The coding chart further includes groups of string-forming codes for encoding different literal strings. The string-forming codes and the letter-forming codes correspond to the letter information in the memory device. Upon receipt of an external command corresponding to a specific string-forming code, the controller fetches a letter-forming code from the coding chart based on the specific string-forming code and letter information from the memory device based on the respective letter-forming code, thereby encoding and displaying the literal string over the display screen. | 05-13-2010 |
20100117957 | Remote control apparatus for vehicle - A remote control apparatus mountable on a vehicle includes an operation input device operable to point to a position in a predetermined operation area, a lighting device for illuminating the operation input device, and a controller for controlling a brightness of the lighting device. The remote control apparatus has a self-diagnostic mode for diagnosing a fault in the operation input device. The controller controls the brightness of the lighting device such that the lighting device produces an illumination pattern corresponding to the position pointed to by the operation input device, when the self-diagnostic mode is set. | 05-13-2010 |
20100123655 | Optical trace detecting module - An optical trace detecting module is disposed in a computer input device. The computer input device is able to move relative to a plane, and has a light-pervious plate for an object to contact and move on a surface thereof. The optical trace detecting module includes a circuit board, a first projection set, a second projection set, and at least one optical path diverting element. An optical sensor is electrically disposed on the circuit board. The first projection set is opposite to the light-pervious plate. The second projection set is opposite to the plane. The optical path diverting element is disposed between the two projection sets and the optical sensor, so as to direct the sensing beams emitted by the two projection sets to reach the optical sensor, thereby generating corresponding control signals. | 05-20-2010 |
20100123656 | METHOD AND DEVICE FOR INPUTTING FORCE INTENSITY AND ROTATION INTENSITY BASED ON MOTION SENSING - Provided is an input device for operating in a three-dimensional space and inputting user instructions. The input device includes a first operation unit that calculates a first rotation angle in a coordinate system independent of the attitude of the device based on the output value of a first sensor, a second operation unit that calculates a second rotation angle in the coordinate system based on the output value of a second sensor, an attitude angle measuring unit that calculates the attitude angle of the input device by combining the first rotation angle and the second rotation angle, and an intensity calculation unit that calculates force intensity in the coordinate system using acceleration of the input device and the attitude angle of the input device obtained in the attitude angle measuring unit. | 05-20-2010 |
20100123657 | INFORMATION PROCESSING APPARATUS, PROCESSING METHOD THEREOF, AND COMPUTER-READABLE STORAGE MEDIUM - An information processing apparatus executes a variety of processing operations in accordance with a user's operation detected by an operation detection device. The apparatus generates a data set from a data group in accordance with a predetermined condition. The apparatus determines the content of processing corresponding to the motion detected by the operation detection device. The apparatus adjusts the data set by increasing or decreasing data included in the data set generated, based on the determined processing. | 05-20-2010 |
20100127967 | MOBILE USER INTERFACE WITH ENERGY HARVESTING - A mobile user interface with energy harvesting is presented. In one embodiment, the mobile user interface comprises a striker, a piezoelectric element to generate electric energy in response to being struck by the striker under control of an elastic mechanism, and a transmitter coupled to use the electric energy to transmit a signal wirelessly. | 05-27-2010 |
20100127968 | MULTI-PROCESS INTERACTIVE SYSTEMS AND METHODS - A multi-process interactive system is described. The system includes numerous processes running on a processing device. The processes include separable program execution contexts of application programs, such that each application program comprises at least one process. The system translates events of each process into data capsules. A data capsule includes an application-independent representation of event data of an event and state information of the process originating the content of the data capsule. The system transfers the data capsules into pools or repositories. Each process operates as a recognizing process, where the recognizing process recognizes in the pools data capsules comprising content that corresponds to an interactive function of the recognizing process and/or an identification of the recognizing process. The recognizing process retrieves recognized data capsules from the pools and executes processing appropriate to contents of the recognized data capsules. | 05-27-2010 |
20100127969 | Non-Contact Input Electronic Device and Method Thereof - A non-contact input electronic device and a method thereof are disclosed. The non-contact input electronic device includes a display module, sensors and a control module. A plurality of sensors is disposed around the display module to sense a non-contact input motion. The control module is coupled to the sensors, and it senses a moving order of the non-contact input according to the sensors, generates an input command corresponding to the moving order and executes the input command. | 05-27-2010 |
20100127970 | Information processing system and information processing method - An information processing system includes: a sensor unit that detects the three-dimensional position of an object according to variations of electrostatic capacitances; and a control unit that performs display dependent on the detected three-dimensional position, at a position on a display unit determined by positions in directions orthogonal to the direction of separation in which the object and the sensor unit are located at a distance from each other. | 05-27-2010 |
20100127971 | Methods of rendering graphical images - Methods for defining the complexity and priority of graphics rendering in mobile devices based upon various physical states and factors related to the mobile system including those measured and sensed by the mobile device, such as position, pointing direction and vibration rate, are disclosed and described. In particular, a handheld computing system having an image type user interface includes graphical images generated in response to the instantaneous position and orientation of the handheld device to improve the value of the presented image and overall speed of the system. | 05-27-2010 |
20100134408 | Fine-motor execution using repetitive force-feedback - An individual's fine-motor skills can be assessed using a force-feedback haptic unit that includes an end-effecter and programmable settings. To assess these skills, a tangible computer readable medium initializes the programmable settings with a set of initial settings. It then presents a 3-D representation of a character or characters to a user. The user in turn is prompted to mimic the character(s) on a work space. While the user is attempting to mimic the character(s) using the end-effecter on the work space, timed stroke data are collected from the force-feedback haptic unit. Using the timed stroke data, an analysis is then generated to determine the user's precision and accuracy of mimicking the character. | 06-03-2010 |
20100134409 | THREE-DIMENSIONAL USER INTERFACE - The instant invention provides an apparatus, method and program storage device enabling a three-dimensional user interface for the movement of objects rendered upon a display device in a more realistic and intuitive manner. A Z distance is set whereupon a user crossing the Z distance is enabled to select an object, i.e. pick it up. As the user breaks the Z distance again, the object selected will move with the user's hand. As the user breaks the Z distance once more, the object will be released, i.e. dropped into a new position. | 06-03-2010 |
20100134410 | IMAGE DISPLAY DEVICE - An image display device easily displays a stereoscopically two-dimensional image and improves its direction effect and interactivity. A display device includes a display element for displaying an image on a screen, and an image transmission element that is set in a light path for a display light component of the image and that transmits the display light component of the image so that a real image of the image is displayed, as a stray image, on an image forming surface positioned in a space on a side opposite to the screen. The display device includes a property specifying element for specifying a property of a detected object positioned in a real space portion including the space where the stray image is displayed, and a control element for controlling the display element so that the stray image changes into a form, set in advance, corresponding to the specified property of the object. | 06-03-2010 |
20100134411 | Information processing apparatus and information processing method - An information processing apparatus is provided which includes an image send unit for displaying a 3D image of a remote operation device as a virtual remote controller, at least one imaging unit for taking an image of a user, a 3D image detection unit for detecting a user's motion based on a video taken by the imaging unit, an instruction detection unit for determining whether the user has pressed a predetermined operation button arranged on the virtual remote controller, based on a detection result by the 3D image detection unit and a position of the predetermined operation button arranged on the virtual remote controller displayed by the image send unit, and an instruction execution unit for performing a predetermined processing corresponding to the user-pressed operation button on the virtual remote controller based on a determination result by the instruction detection unit. | 06-03-2010 |
20100134412 | Information Processing Apparatus and Information Processing Method - An information processing apparatus is provided which includes a first display unit for displaying book data per page, a second display unit provided so as to be adjacent to the first display unit for displaying the book data per page, an axial part provided between the first and second display units, a first detecting unit for detecting rotational angular displacement of the first display unit around the axial part, a second detecting unit for detecting rotational angular displacement of the second display unit around the axial part, and a display page controller for displaying a page N displayed on the first display unit on the second display unit, or displaying the page N displayed on the second display unit on the first display unit, based on the rotational angular displacement of the first display unit and the rotational angular displacement of the second display unit. | 06-03-2010 |
20100141574 | METHOD AND APPARATUS FOR OPERATING MOBILE TERMINAL - A method for operating a mobile terminal is disclosed. The mobile terminal includes a pressure sensor and an orientation sensor. While a pressure event is detected by the pressure sensor, functions related to content classification, content storage, content display, and menu navigation can be executed in response to a direction event detected by the orientation sensor. Hence, the mobile terminal is capable of operating in a dynamic and flexible manner. | 06-10-2010 |
20100141575 | VIDEO DISPLAY DEVICE - A video display device enabling first and second viewers to view two respective different screens and enabling the first viewer to easily view the content displayed to the second viewer in a two-screen display mode. The video display device ( | 06-10-2010 |
20100149090 | GESTURES, INTERACTIONS, AND COMMON GROUND IN A SURFACE COMPUTING ENVIRONMENT - Aspects relate to detecting gestures that relate to a desired action, wherein the detected gestures are common across users and/or devices within a surface computing environment. Inferred intentions and goals based on context, history, affordances, and objects are employed to interpret gestures. Where there is uncertainty in intention of the gestures for a single device or across multiple devices, independent or coordinated communication of uncertainty or engagement of users through signaling and/or information gathering can occur. | 06-17-2010 |
20100149091 | Media Action Script Acceleration Method - Exemplary apparatus, method, and system embodiments provide for accelerated hardware processing of an action script for a graphical image for visual display. An exemplary method comprises: converting a plurality of descriptive elements into a plurality of operational codes which at least partially control at least one processor circuit; and using at least one processor circuit, performing one or more operations corresponding to an operational code to generate pixel data for the graphical image. Another exemplary method for processing a data file which has not been fully compiled to a machine code and comprising interpretable descriptions of the graphical image in a non-pixel-bitmap form, comprises: separating the data file from other data; parsing and converting the data file to a plurality of hardware-level operational codes and corresponding data; and performing a plurality of operations in response to at least some hardware-level operational codes to generate pixel data for the graphical image. Exemplary embodiments also may be performed automatically by a system comprising one or more computing devices. | 06-17-2010 |
20100149092 | IDENTIFYING CONTACTS ON A TOUCH SURFACE - Apparatus and methods are disclosed for simultaneously tracking multiple finger and palm contacts as hands approach, touch and slide across a proximity-sensing, multi-touch surface. Identification and classification of intuitive hand configurations and motions enables unprecedented integration of typing, resting, pointing, scrolling, 3D manipulation and handwriting into a versatile, ergonomic computer input device. | 06-17-2010 |
20100149093 | VIRTUAL REALITY SYSTEM INCLUDING VIEWER RESPONSIVENESS TO SMART OBJECTS - Embodiments of the invention include a virtual reality system that includes an instrumented device used to present a virtual shopping environment to a simulation participant. The participant's interactions with the virtual shopping environment may be used to conduct market research into the consumer decision making process. The virtual shopping environment may include one or more smart objects configured to be responsive to participant interaction. The virtual shopping environment may recreate a real-world shopping environment. | 06-17-2010 |
20100149094 | Snow Globe Interface for Electronic Weather Report - A portable computing device for displaying weather information. The device includes a transceiver configured to send and receive weather information, a display controller configured to generate a weather scene display including the weather information based on a received shaking input provided to the portable computing device, and a display configured to present the generated weather scene display. The display controller is configured to display flitter that obscures the generated weather scene display during the receiving of the weather information and/or the updating of the generated weather scene display. | 06-17-2010 |
20100156781 | EYE GAZE CONTROL DURING AVATAR-BASED COMMUNICATION - An avatar image on a device display, such as a cell phone or laptop computer, maintains natural and realistic eye contact with a user (a human being) while the user is communicating with another human being, whose avatar is displayed on the device. Thus, the user and the avatar have natural eye contact during the communication session (e.g., phone call). Modules within the device ensure that the avatar eyes do not maintain a fixed gaze or stare at the user constantly and that the avatar looks away and changes head and eye angles in a natural manner. An imager in the device captures images of the user and tracks the user's eyes. This data is inputted to an avatar display control module on the device which processes the data, factors in randomness variables, and creates control signals that are sent to the avatar image display component on the device. The control signals instruct how the avatar eyes should be positioned. | 06-24-2010 |
20100156782 | Hand Control Image For Replication - A system and method for a display in the gaze-forward position of a vehicle operator, showing hand position relative to various control inputs or hand positions relative to a control surface. The hand replication system includes a control input device, a display displaying an image of the control input device, an optical input device directed to capturing an actual image of the control input device and providing the actual image output to one of a display or a processor. The method of providing an image of an operator's hand relative to a control panel within the line of sight of the vehicle operator includes the steps of providing a display, providing an optical input device, obtaining an actual image of the control panel with the optical input device, and providing the actual image of the control panel to the display. | 06-24-2010 |
20100156783 | WEARABLE DATA INPUT DEVICE - A wearable data input device. A garment to be worn on a user's hand includes at least one digit portion for receiving a corresponding digit of the user's hand; a palmar portion; and a dorsal portion. A plurality of contact sensors includes at least one sensor on each digit portion of the garment. Each contact sensor is configured to detect contact between a corresponding portion of the user's hand and another object, and to generate contact signals in accordance with the detected contact. A surface movement sensor detects 2-dimensional (2-D) movement of the user's hand across a surface, and generates corresponding 2-D movement signals in accordance with the detected movement. A console is configured to receive signals from each sensor, and to selectively transmit the received signals to a computer. At least the console is selectively removable from the garment. | 06-24-2010 |
20100164861 | IMAGE SYSTEM CAPABLE OF SWITCHING PROGRAMS CORRESPONDING TO A PLURALITY OF FRAMES PROJECTED FROM A MULTIPLE VIEW DISPLAY AND METHOD THEREOF - An image system includes a multiple view display for displaying a plurality of frames simultaneously. The plurality of frames can only be viewed at different visible ranges, respectively. The image system further includes an executing program means for inputting an executing program command, and a control means coupled to the executing program means for executing a program corresponding to a frame of the plurality of frames according to the executing program command transmitted from the executing program means. | 07-01-2010 |
20100164862 | Visual and Physical Motion Sensing for Three-Dimensional Motion Capture - A system includes a visual data collector for collecting visual information from an image of one or more features of an object. The system also includes a physical data collector for collecting sensor information provided by one or more sensors attached to the object. The system also includes a computer system that includes a motion data combiner for combining the visual information and the sensor information. The motion data combiner is configured to determine the position of a representation of one or more of the features in a virtual representation of the object from the combined visual information and sensor information. Various types of virtual representations may be provided from the combined information; for example, one or more poses (e.g., position and orientation) of the object may be represented. | 07-01-2010 |
20100164863 | Systems, Software, Apparatus and Methods for Managing Out-of-Home Displays - Systems, methods, apparatus, and software for monitoring and managing out-of-home ("OOH") displays are provided. In one aspect, a system for monitoring an OOH display includes an OOH display device configured to display content to at least one subject. An interaction detector is configured to detect at least one interaction between the OOH display device and a subject, and provide data about such interaction. An input mechanism accepts input signals from the subject, and a display controller device accepts signals from the subject and OOH display device. A data processing and routing mechanism processes and exchanges the data. | 07-01-2010 |
20100164864 | DIRECTION CONTROLLING SYSTEM AND METHOD OF AN ELECTRONIC DEVICE - A direction controlling system and method of an electronic device provides a fingerprint identification device for a user to touch. The electronic device captures a fingerprint template image of a finger. When the finger moves on the fingerprint identification device, the electronic device captures a sequence of fingerprint images of the finger. Furthermore, the electronic device detects a directional movement of the fingerprint according to the sequence of fingerprint images and the fingerprint template image. A scroll bar of the electronic device is controlled to move according to the movement direction and a movement distance calculated by the electronic device. | 07-01-2010 |
20100164865 | INFORMATION PROVIDING SYSTEM AND INFORMATION PROVIDING METHOD - An information providing system includes a display unit installed in a table and having, on the table, a display screen that displays an image of a product; a detecting unit that detects placement of an object at the position of the displayed image; and a display control unit that causes content related to the product to be displayed on the display screen when placement of the object at the position of the image is detected by the detecting unit. | 07-01-2010 |
20100171691 | VIEWING IMAGES WITH TILT CONTROL ON A HAND-HELD DEVICE - A user interface suitable for use in cellular phones and personal digital assistants (PDAs), PC Tablets, as well as laptops, PCs, office equipment, medical equipment, or any other hand-held electronic device, that allows control of the image on the device display by tilting the device to either change the view in perspective, change the magnification, or both, concurrently, by moving the device. Thus, the tilt of the device controls the angle of view of the image, and moving the device perpendicular to the screen controls the magnification. | 07-08-2010 |
20100171692 | Input device and display device - An input device and a display device are provided. The input device may receive object information, associated with an object displayed on a display device, from the display device, sense at least one motion of a user, analyze the at least one sensed motion of the user based on the object information, and transmit the analysis result to the display device, and the display device may receive the analysis result and control the object based on the analysis result. | 07-08-2010 |
20100171693 | DISPLAY CONTROL DEVICE, DISPLAY CONTROL METHOD, AND PROGRAM - A display control device includes a cabinet, a detection unit configured to detect an orientation of the cabinet, a display screen, and a display control unit configured to control display of the display screen in accordance with the orientation of the cabinet detected by the detection unit. | 07-08-2010 |
20100177033 | BUTTON WITH EDGE MOUNTED PIVOT POINT - A device disclosed herein reduces inadvertent activation of buttons mounted on edges of an electronic device. The button moves around an axis of rotation proximate to, and parallel with, the adjacent edge of the device. Thus, inadvertent pressure on an edge of the button adjacent to the edge of the device will not inadvertently activate the button. Additionally, the presence of the axis of rotation proximate to the edge maintains a desired reveal. | 07-15-2010 |
20100177034 | PORTABLE STORAGE DEVICE HAVING USER INTERFACE AND METHOD OF CONTROLLING THE USER INTERFACE - A portable storage device and a method of controlling a user interface (UI) using the same. The method includes obtaining first UI information using a UI when the portable storage device is connected to a host, transmitting the obtained first UI information to an application of the host, and displaying second UI information provided by the application of the host using the UI. | 07-15-2010 |
20100177035 | Mobile Computing Device With A Virtual Keyboard - The subject matter disclosed herein provides methods and apparatus, including computer program products, for mobile computing. In one aspect there is provided a system. The system may include a processor configured to generate at least one image including a virtual keyboard and a display configured to project the at least one image received from the processor. The at least one image of the virtual keyboard may include an indication representative of a finger selecting a key of the virtual keyboard. Related systems, apparatus, methods, and/or articles are also described. | 07-15-2010 |
20100177036 | COVER FOR PORTABLE TERMINAL - A cover for a portable terminal having a fixing part for fixing the portable terminal and a folder rotating at the fixing part to open and close the portable terminal. The cover includes an input unit and/or a display unit, and a charging unit. The input unit inputs data to the portable terminal. The display unit displays data from the portable terminal. The input unit and display unit are constructed in the folder. The charging unit is constructed in at least one of the fixing part and folder, and supplies power, generated using a solar cell, to the portable terminal. | 07-15-2010 |
20100177037 | APPARATUS AND METHOD FOR MOTION DETECTION IN A PORTABLE TERMINAL - An apparatus and method for motion detection in a portable terminal preferably includes a state determination unit for receiving first sensing information for determining a motion of the portable terminal by a sensor unit. After determining the motion of the portable terminal, second sensing information is received for determining whether or not a normal motion applying a motion function has occurred. The portable terminal determines whether the motion is a normal motion associated with a motion-related function or an abnormal motion that is not associated with a motion-related function. | 07-15-2010 |
20100177038 | PORTABLE DEVICE - According to the portable device of an aspect of the present invention, the first enclosure and the second enclosure are brought into movable linkage between the first position, where the silhouettes of the first enclosure and the second enclosure are overlapped, and the second position, where the second enclosure is moved in parallel from the first position. In addition, the first enclosure and the second enclosure are rotatably and movably linked between the first position and the third position, where the second enclosure is rotated from the first position by a predetermined angle. | 07-15-2010 |
20100182228 | DISPLAY CONTROLLING PROGRAM AND DISPLAY CONTROLLING APPARATUS - An information processing apparatus includes a computer. The computer successively images a user, makes an evaluation of first image data indicating an image obtained by the successive imaging, and displays the evaluation result on an LCD by successively updating it. | 07-22-2010 |
20100182229 | TERMINAL, DISPLAY APPARATUS, AND METHOD FOR TRANSMITTING AND RECEIVING DATA THEREOF - A terminal, a display apparatus, and a method of transceiving data are provided. The terminal transmits or receives data corresponding to a specific point to which light is emitted through a communication unit if a data transmission command or a data receiving command is input. Accordingly, the display apparatus and the terminal transceive data by directly communicating with each other. | 07-22-2010 |
20100182230 | TERMINAL, FUNCTION STARTING-UP METHOD AND PROGRAM FOR TERMINAL - A terminal includes a display unit and a character conversion unit that recognizes a function related to entered characters in a character acceptable state, converts the entered characters to a symbol to be displayed on the display unit for starting the recognized function, and outputs the symbol. The terminal further includes a control unit that starts the function corresponding to the symbol displayed on the display unit. | 07-22-2010 |
20100182231 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND INFORMATION PROCESSING PROGRAM - An information processing apparatus includes a sensor portion, a judgment portion, and an output portion. The sensor portion detects three-dimensional coordinates designated by a spatially-apart detection target object. The judgment portion determines an area designated in advance, that includes the three-dimensional coordinates detected by the sensor portion. The output portion outputs, based on a result of the judgment by the judgment portion, audio corresponding to audio information from a position corresponding to at least two-dimensional coordinates out of the three-dimensional coordinates. | 07-22-2010 |
20100188325 | IMAGE DISPLAY DEVICE AND IMAGE DISPLAY METHOD - An image display device adjusts display luminance in accordance with a surrounding illuminance. Since the display luminance considered to be appropriate differs depending on a user, the image display device includes a display unit operating according to a display luminance calculated based on the surrounding illuminance. For example, in the image display device, a user can set a range of the display luminance as a predetermined rule. The image display device provides a user interface enabling a user to easily input the predetermined rule or perform the setting input without being aware of it. That is, when the surrounding illuminance is above a predetermined illuminance, an input bar indicates only the range where the maximum value can be set as an input screen and when the surrounding illuminance is below the predetermined illuminance, the input bar indicates only the range where the minimum value can be set. | 07-29-2010 |
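The constrained input bar in the abstract above can be sketched in a few lines: depending on whether the surrounding illuminance is above or below a predetermined illuminance, only the upper (maximum-setting) or lower (minimum-setting) part of the luminance range is offered to the user. All numeric bounds here are illustrative assumptions, not values from the patent.

```python
def selectable_range(illuminance, threshold=200,
                     full_range=(0, 100), split=50):
    """Return the (low, high) luminance range the input bar exposes.

    Above the illuminance threshold, only the range containing the
    maximum value can be set; below it, only the range containing
    the minimum value. Threshold and split values are hypothetical.
    """
    low, high = full_range
    if illuminance > threshold:
        return (split, high)   # bright surroundings: set the maximum
    return (low, split)        # dim surroundings: set the minimum
```

A settings screen would call this with the ambient-light sensor reading and clamp the slider to the returned bounds.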
20100188326 | Ornamental thumb or finger ring with secured hidden contact interface input device - An ornamental thumb or finger secured contact interface input device includes a thumb or finger ring that has a rotatable stylus operatively attached to the ring. The stylus includes an elongated retractable interface contact member including a text tap portion for contacting an interface and entering data. The contact member is retractable into a rotatable housing in a hidden ornamental mode and is fully extendable in an interface engaging mode for entering input into an electronic interface device. The ring is non-continuous, includes an opening, and is bendable so that it may be appropriately sized to the thumb or finger of a user. The device can function as a stylus for providing input to an interface in the interface engaging mode or be worn as an item of jewelry in the hidden ornamental mode. | 07-29-2010 |
20100188327 | ELECTRONIC DEVICE WITH HAPTIC FEEDBACK - Haptic feedback may be provided to a user of an electronic device, such as an electronic book reader device, to confirm receipt of user input or otherwise convey information to the user. The haptic feedback may be provided more quickly than a display update time of a display of the electronic device. Different patterns, durations, and/or intensities of haptic feedback may be used in response to different events. | 07-29-2010 |
20100188328 | ENVIRONMENTAL GESTURE RECOGNITION - A data-holding subsystem. The data-holding subsystem includes instructions stored thereon that when executed by a logic subsystem in communication with the data-holding subsystem: receive one or more signals, determine a sensor type for each signal of the one or more signals, identify a sensor type specific pattern corresponding to a motion gesture in at least one of the signals, and generate a gesture message based on the motion gesture. The gesture message may be usable by an operating system of a computing device that includes the data-holding subsystem to provide a system-wide function usable by one or more application programs of the computing device to provide an application specific function. | 07-29-2010 |
20100188329 | PUSH BUTTON SWITCH DEVICE WITH AN OLED DISPLAY - A push button switch device with an OLED display includes a support casing for a switch, an actuating member movably assembled in the casing so as to actuate the switch, and a display assembled at an operating end of the actuating member viewable from the outside. The display is programmable so as to display images and is formed by an assembly of organic light emitting diodes, or OLEDs. The device furthermore includes a local control unit with a memory storing predetermined images and/or portions of video. The local control unit includes a field-programmable gate array, or FPGA, connected to the memory and to the display so as to control the selective switching on/off of the OLEDs and to thus display in the display at least part of the images and/or portions of video according to a certain sequence. | 07-29-2010 |
20100188330 | INPUT DEVICE - The image sensor | 07-29-2010 |
20100188331 | METHODS AND APPARATUSES FOR OPERATING A PORTABLE DEVICE BASED ON AN ACCELEROMETER - Methods and apparatuses for operating a portable device based on an accelerometer are described. According to one embodiment of the invention, a movement of a portable device is detected using an accelerometer attached to the portable device. An orientation of the portable device after the movement is determined based on movement data provided by the accelerometer. It is determined whether the portable device is held by a user after the movement based on the movement data provided by the accelerometer. Locations of the hands of the user for holding the portable device are determined based on the orientation of the portable device. At least one interface that is not within the predicted locations of the hands of the user is activated. | 07-29-2010 |
20100194676 | KVM switch and computer readable medium - A KVM switch that is connected between servers and at least one set of a keyboard, a mouse and a monitor, comprising: an acquiring portion that acquires information showing a screen resolution to which the monitor is capable of adapting, from the monitor; an analysis portion that analyzes a screen resolution of a video signal output from a corresponding server, based on a horizontal synchronizing signal and a vertical synchronizing signal received from each of the servers; a determination portion that determines whether the analyzed screen resolution exceeds the screen resolution shown by the acquired information; a conversion portion that, when the analyzed screen resolution exceeds the screen resolution shown by the acquired information, converts the analyzed screen resolution into the screen resolution shown by the acquired information; and an output portion that outputs the video signal having the converted screen resolution to the monitor. | 08-05-2010 |
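The decision the KVM abstract describes reduces to a simple comparison: pass the server's analyzed resolution through unchanged, or convert it down when it exceeds what the monitor supports. This is a minimal sketch under the assumption that resolutions are (width, height) tuples and that "exceeds" means either dimension is larger.

```python
def output_resolution(analyzed, supported_max):
    """Return the resolution to output to the monitor.

    analyzed: (width, height) of the server's video signal.
    supported_max: (width, height) the monitor can adapt to.
    """
    w, h = analyzed
    max_w, max_h = supported_max
    if w > max_w or h > max_h:    # analyzed resolution exceeds the monitor's
        return supported_max       # convert down to the supported resolution
    return analyzed                # within range: output unchanged
```

A real KVM would read `supported_max` from the monitor's EDID and perform actual scaling; only the selection logic is shown here.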
20100194677 | MAPPING OF PHYSICAL CONTROLS FOR SURFACE COMPUTING - Physical controls on a physical controller device (PCD) are dynamically mapped to application controls for an application being executed on a computer having a touch-sensitive display surface. The computer identifies a PCD which has been placed by a user on the display surface and displays a mapping aura for the PCD. When the user touches an activate direct-touch button displayed within the mapping aura, the computer activates a mapping procedure for the PCD and displays a highlighted direct-touch button over each application control which is available to be mapped to the physical controls on the PCD. When the user selects a particular application control which is available to be mapped by touching the highlighted button residing over the control, the computer creates a dynamic mapping between the selected application control and a user-selected physical control on the PCD. | 08-05-2010 |
20100194678 | DIAGONAL MOVEMENT OF A TRACKBALL FOR OPTIMIZED NAVIGATION - A handheld communication device including a trackball-based cursor navigation tool has a display screen and a plurality of trackball roll-direction detectors, each engaging the trackball and primarily actuated by one of a vertical, horizontal, and diagonal rotation of the trackball relative to the display screen. The roll-direction detectors can each have a roller in contact with the trackball and rotatable about a corresponding rotational axis, and can each also have a sensor for sensing the rotation of the trackball and outputting a signal representative of the amount of rotation of the trackball. A microprocessor receives the signals output from the roll-direction detectors, processes the signals to determine whether primarily vertical, horizontal, or diagonal movement has been detected, and outputs corresponding control signals to a screen cursor control to affect the movement of a cursor on the display screen. | 08-05-2010 |
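The microprocessor step above — deciding whether movement is primarily vertical, horizontal, or diagonal from the detector signals — can be illustrated with a small classifier. The ratio threshold and signal shape are assumptions for the sketch, not from the patent.

```python
def classify_roll(vertical, horizontal, diagonal_ratio=0.5):
    """Classify trackball movement from signed roller rotation amounts.

    Movement counts as 'diagonal' when the minor-axis rotation is at
    least `diagonal_ratio` of the major-axis rotation (hypothetical
    rule); otherwise the dominant axis wins.
    """
    v, h = abs(vertical), abs(horizontal)
    if v == 0 and h == 0:
        return "none"
    minor, major = sorted((v, h))
    if minor / major >= diagonal_ratio:
        return "diagonal"
    return "vertical" if v > h else "horizontal"
```

The cursor control would then move the cursor along the classified axis by the signaled rotation amount.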
20100194679 | GESTURE RECOGNITION SYSTEM AND METHOD THEREOF - A gesture recognition system includes an image pick-up device, a processor, an operation engine, an optimal template selection means, and a display terminal. The image pick-up device is for capturing an image containing a natural gesture. The processor is for finding out a skin edge of a skin part from the image, and then classifying the skin edge into multiple edge parts at different angles. The operation engine has multiple parallel operation units and multiple gesture template libraries of different angle classes. These parallel operation units respectively find out gesture templates most resembling the edge parts in the gesture template libraries of different angle classes. The optimal template selection means selects an optimal gesture template from the resembling gesture templates found out by the parallel operation units. The display terminal is for displaying an image of the optimal gesture template. Thereby, marker-less and real-time gesture recognition is achieved. | 08-05-2010 |
20100194680 | INFORMATION PROCESSING APPARATUS - According to an aspect of the invention, an information processing apparatus includes: a main body comprising a top face; a display module configured to be connected to the main body and configured to be rotatable between a closed position and an open position, the display module covering the top face at the closed position and exposing the top face at the open position; a wireless communication module in the main body; a plurality of input devices in the main body; a processing module configured to perform a process corresponding to a manipulation signal that is sent from the input devices to the processing module; an input device setting module configured to select one of the input devices as a selected input device; and an input manipulation control module configured to control the processing module so as to prevent the processing module from performing a process corresponding to a manipulation signal received by the selected input device while the wireless communication module performs a wireless communication. | 08-05-2010 |
20100194681 | MANIPULATION DEVICE FOR NAVIGATING VIRTUAL MICROSCOPY SLIDES/DIGITAL IMAGES AND METHODS RELATED THERETO - Featured is a manipulation device having the ability to navigate virtual microscopy slides. The device includes an inverted light emitting diode (LED) reflecting light off a textured slide to a complementary metal oxide semiconductor (CMOS) sensor that indicates the movement of the slide. The slide is freely moved by hand or traditional X-Y-mechanical stage on a raised platform akin to a slide stage. Finger touch controls are provided to zoom to higher or lower power images. The device plugs into a standard computer system via a USB port running software to image a virtual microscope slide. Also featured are systems and methods related thereto. | 08-05-2010 |
20100194682 | METHOD FOR TAP DETECTION AND FOR INTERACTING WITH A HANDHELD ELECTRONIC DEVICE, AND A HANDHELD ELECTRONIC DEVICE CONFIGURED THEREFOR - A method for tap detection and for interacting with a handheld electronic device, and a handheld electronic device configured therefor, are described. In accordance with one embodiment, there is provided a method for tap detection on a handheld electronic device, comprising: measuring acceleration using an accelerometer of the handheld electronic device; determining when measured acceleration exceeds an upper limit threshold and a lower limit threshold within a predetermined duration of each other; and, when the upper limit threshold and lower limit threshold have been exceeded, determining a rate of change of acceleration between the upper limit threshold and lower limit threshold and registering a tap input when the rate of change of acceleration exceeds a predetermined tap threshold. | 08-05-2010 |
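The tap-detection method above is concrete enough to sketch: a tap registers only when acceleration crosses both an upper and a lower threshold within a short window, and the rate of change (jerk) between the two crossings exceeds a tap threshold. All names and numeric values here are illustrative assumptions.

```python
def detect_tap(samples, upper=2.0, lower=-2.0, max_gap=5, min_jerk=0.8):
    """Detect a tap in accelerometer data.

    samples: iterable of (time_index, acceleration) pairs in order.
    A tap needs an upper-threshold and a lower-threshold crossing
    within `max_gap` samples of each other, with a rate of change of
    acceleration between them of at least `min_jerk`.
    """
    upper_hit = lower_hit = None
    for i, a in samples:
        if a >= upper:
            upper_hit = (i, a)
        elif a <= lower:
            lower_hit = (i, a)
        if upper_hit and lower_hit:
            dt = abs(upper_hit[0] - lower_hit[0])
            if dt and dt <= max_gap:
                jerk = abs(upper_hit[1] - lower_hit[1]) / dt
                if jerk >= min_jerk:
                    return True
            upper_hit = lower_hit = None  # pair rejected; start over
    return False
```

Slow tilts that cross both thresholds far apart in time are rejected by the `max_gap` window, which is the point of the two-stage test.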
20100194683 | MULTIPLE SCREEN DISPLAY DEVICE AND METHOD - Image browsing method and display device having a body with a plurality of display faces according to different planes, a plurality of display screens able to simultaneously display different digital images, the screens being respectively on different display faces of the body, image selection means for selecting a plurality of digital images in an image collection to be displayed on the screens; and motion sensors connected to the image selection means to trigger a display change, the display change comprising the replacement of the display of at least one image on at least one of the display screens by another image from the image collection, as a function of the device motion. | 08-05-2010 |
20100194684 | METHODS AND SYSTEMS FOR PROVIDING PROGRAMMABLE COMPUTERIZED INTERACTORS - A computerized interactor system uses physical, three-dimensional objects as metaphors for input of user intent to a computer system. When one or more interactors are engaged with a detection field, the detection field reads an identifier associated with the object and communicates the identifier to a computer system. The computer system determines the meaning of the interactor based upon its identifier and upon a semantic context in which the computer system is operating. | 08-05-2010 |
20100201614 | Method and Apparatus for Computer Interface - A physical user interface is provided for a microprocessor device that runs an operating system. The interface has an array of sensors located below a workspace having a housing. The workspace is divided into regions that are visible to a user, each region signifying a command to or an action performed by the operating system. Counters are provided with stored information such that the counter can be interpreted by the physical user interface so as to indicate a location of the counter as well as the stored information. The information may be directly or indirectly indicative of a request for a URL. | 08-12-2010 |
20100201615 | Touch and Bump Input Control - A touch and motion sensitive input control configured to use a combination of touch sensor output and motion sensor output to determine if an input event has occurred at an input area. The touch and motion sensitive input control can detect a particular input event (e.g., a button press) when a touch sensor detects a touch at a particular input area at around the same time as a motion sensor detects a change in motion. Based on the amount and nature of the motion detected, this can indicate that a user intended to cause an input event other than one caused by a mere touching of the input area. | 08-12-2010 |
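The touch-and-bump control above fuses two sensors: a press is reported only when a touch at the input area coincides, within a small window, with a detected change in motion. This sketch assumes timestamped events; the window size and data shapes are illustrative.

```python
def is_press(touch_time, motion_times, window=0.05):
    """Decide whether a touch event was an intentional press.

    touch_time: timestamp (seconds) of the touch at the input area.
    motion_times: timestamps of motion-sensor change events.
    Returns True when any motion event falls within `window` seconds
    of the touch, i.e. the user pressed rather than merely touched.
    """
    return any(abs(touch_time - t) <= window for t in motion_times)
```

A mere graze produces a touch with no coincident motion event, so `is_press` stays False and no button press is generated.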
20100201616 | SYSTEMS AND METHODS FOR CONTROLLING A DIGITAL IMAGE PROCESSING APPARATUS - A digital image processing apparatus includes a sensing unit configured to sense a user's gesture to perform a specific function and generate a signal representing the user's gesture. The digital image processing apparatus also includes a digital signal processing unit which receives the signal representing the user's gesture and recognizes a plurality of discontinuous gestures as one gesture when a temporal proximity threshold between the discontinuous gestures is met. The one gesture may represent an input command from the user. | 08-12-2010 |
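Recognizing temporally close, discontinuous gesture segments as one gesture, as the abstract describes, amounts to merging adjacent time spans whose gap is within a proximity threshold. The threshold value and data shape below are assumptions for illustration.

```python
def merge_gestures(segments, proximity=0.5):
    """Merge discontinuous gesture segments into single gestures.

    segments: list of (start_time, end_time) spans sorted by start.
    Segments whose gap to the previous merged span is within
    `proximity` seconds are recognized as one gesture.
    """
    merged = []
    for start, end in segments:
        if merged and start - merged[-1][1] <= proximity:
            prev_start, prev_end = merged[-1]
            merged[-1] = (prev_start, max(prev_end, end))  # same gesture
        else:
            merged.append((start, end))                     # new gesture
    return merged
```

Each merged span would then be matched against the command set as a single gesture.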
20100207869 | SECURE MAN-MACHINE INTERFACE FOR MANAGING GRAPHICAL OBJECTS ON A VIEWING SCREEN - The field of the invention is that of man-machine interface devices for viewing screens. | 08-19-2010 |
20100207870 | DEVICE AND METHOD FOR INPUTTING SPECIAL SYMBOL IN APPARATUS HAVING TOUCH SCREEN - A device to input a special symbol in an apparatus having a touch screen includes a touch sensor to detect a user touch at a first position, a text input controller to switch from a general character input mode to a special symbol input mode if the user touch is maintained for at least a predetermined time, and a special symbol display unit to display a special symbol at the first position. A method for inputting a special symbol includes a touch detecting step of detecting the user touch at the first position, switching to the special symbol input mode if the user touch is maintained for at least a predetermined time, and displaying a special symbol at the first position in the special symbol input mode. | 08-19-2010 |
20100207871 | METHOD AND PORTABLE APPARATUS - A method for a portable apparatus includes detecting a movement of the portable apparatus, and determining that the movement is associated with a user input for retrieving a value of a status of the portable apparatus; determining a value of the status; and presenting the value to the user. Corresponding portable apparatuses and a computer program product are also presented. | 08-19-2010 |
20100207872 | OPTICAL DISPLACEMENT DETECTING DEVICE AND OPERATING METHOD THEREOF - An operating method of an optical displacement detecting device includes the steps of: capturing a plurality of images; obtaining a displacement according to the images; obtaining a characteristic variation of the images; and suppressing the output of the displacement when the characteristic variation matches a predetermined rule. The present invention further provides an optical displacement detecting device. | 08-19-2010 |
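The suppression step in the optical-displacement method above can be sketched directly: compute displacement from successive images as usual, but output zero when a frame-quality metric (the "characteristic variation" of the images) matches a predetermined rule. Here the rule is assumed to be a simple limit check; the metric and values are hypothetical.

```python
def filtered_displacement(displacement, variation, limit=0.3):
    """Suppress unreliable displacement output.

    displacement: (dx, dy) computed from successive images.
    variation: characteristic variation of the images (e.g. a blur or
    lift-off metric; hypothetical here).
    Returns (0, 0) when the variation matches the predetermined rule,
    otherwise the displacement unchanged.
    """
    if variation > limit:   # rule matched: frames are unreliable
        return (0, 0)       # suppress the displacement output
    return displacement
```

In a mouse sensor this prevents cursor jumps when the device is lifted or the surface image degrades.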
20100207873 | METHOD, SYSTEM AND SERVER PLAYING MEDIA USING USER EQUIPMENT WITH MOTION SENSOR - Disclosed is a method for receiving playable media stored in a server by user equipment having a motion sensor and playing the media according to the motion of the user equipment. The method includes the steps of (a) designating a part of the media as a first receiving area by the server and transmitting it to the user equipment; (b) changing a playback area, in which the media are played, according to the motion of the user equipment and playing a part of the media; and (c) requesting by the user equipment that the server provide a second receiving area, which is a part of the media including the playback area, when the playback area is outside the first receiving area of the media, receiving the second receiving area, and playing the media. When media stored in the server are to be played initially, it is unnecessary to download the entire media, so that the user equipment can play the media more efficiently. | 08-19-2010 |
20100207874 | Interactive Display System With Collaborative Gesture Detection - An interactive content delivery system is described. | 08-19-2010 |
20100207875 | COMMAND CONTROL SYSTEM AND METHOD THEREOF - The invention discloses a command control system including a light emitting unit, an image capturing unit, a storage unit, and a processing unit. The processing unit is coupled with the image capturing unit and the storage unit. The light emitting unit emits light to form an illumination area. The image capturing unit captures a plurality of pieces of image information in the illumination area. The storage unit stores different commands corresponding to the image information. The processing unit performs functions according to the commands corresponding to the image information. | 08-19-2010 |
20100207877 | Image Generation System - An image generation system for generating an image on a display screen. The image generation system includes an eye-tracking system capable of determining a user's eye orientation and outputting a signal indicative of same. The image generation system also includes a bio-feedback sensor capable of detecting activity of one or more physiological functions of the user and outputting a signal indicative of the level of activity. A processor is included and is adapted to receive and process the output signals from the eye-tracking system and bio-feedback sensor. The processor determines an image to be generated on the display screen indicative of the signals from the eye-tracking system and bio-feedback sensor. | 08-19-2010 |
20100207878 | ILLUMINATION DEVICE, METHOD FOR FABRICATING THE SAME, AND SYSTEM FOR DISPLAYING IMAGES UTILIZING THE SAME - The invention provides an illumination device, method for fabricating the same, and system for displaying images utilizing the same. The illumination device includes a substrate, a first electrode, an illumination layer, and a second electrode. The substrate has a plurality of illumination regions. The first electrode overlies the substrate and has a first bump disposed in a first illumination region of the plurality of the illumination regions. The illumination layer overlies the first electrode. The second electrode overlies the illumination layer. | 08-19-2010 |
20100207879 | Integrated Proximity Sensor and Light Sensor - Apparatuses and methods to sense proximity and to detect light. In one embodiment, an apparatus includes an emitter of electromagnetic radiation and a detector of electromagnetic radiation; the detector has a sensor to detect electromagnetic radiation from the emitter when sensing proximity, and to detect electromagnetic radiation from a source other than the emitter when sensing visible light. The emitter may be disabled at least temporarily to allow the detector to detect electromagnetic radiation from a source other than the emitter, such as ambient light. In one implementation, the ambient light is measured by measuring infrared wavelengths. Also, a fence of non-IR transmissive material may be disposed between the emitter and the detector to remove electromagnetic radiation emitted by the emitter. Other apparatuses and methods and data processing systems and machine readable media are also described. | 08-19-2010 |
20100214211 | HANDHELD ELECTRONIC DEVICE HAVING GESTURE-BASED CONTROL AND A METHOD OF USING SAME - The present disclosure describes a handheld electronic device having a gesture-based control and a method of using the same. In one embodiment, there is provided a method of controlling a handheld electronic device, comprising: receiving a motion signal as input from a motion detection subsystem in response to a movement of the device; determining from the motion signal a cadence parameter associated with the movement of the electronic device; determining whether the cadence parameter is greater than or equal to a cadence reference level; performing a first command when the cadence parameter is greater than or equal to the cadence reference level; and performing a second command when the cadence parameter is less than the cadence reference level. | 08-26-2010 |
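The cadence-gated dispatch described above maps one motion to one of two commands depending on whether its cadence parameter reaches the reference level. A minimal sketch, with all names assumed for illustration:

```python
def dispatch(cadence, reference, fast_command, slow_command):
    """Perform `fast_command` when the cadence parameter is at or
    above the cadence reference level, otherwise `slow_command`.
    Returns the chosen command's result."""
    if cadence >= reference:
        return fast_command()
    return slow_command()
```

A quick shake and a slow tilt of the same device could thus trigger, say, page-flip versus scroll behavior from a single motion axis.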
20100214212 | Display Device, Controlling Method and Display System Thereof - A display device is provided. The display device includes a command processing unit, a command converting unit, a universal serial bus (USB) interface and a display module. The command processing unit processes a remote-control command from a remote-controller. The command converting unit generates a human interface device (HID) command corresponding to the remote-control command. The USB interface outputs the HID command to an external host which generates an image in response to the HID command. Then the display module displays the image. | 08-26-2010 |
20100220053 | INPUT APPARATUS FOR IN-VEHICLE DEVICES - An input apparatus for in-vehicle devices is easy to use and facilitates recognizing the position of fingertips. For this, the input apparatus includes a control unit including a recess allowing fingers to be inserted thereinto and having a control surface on an inner side wall thereof, a control switch disposed on the control surface, and a camera horizontally photographing the fingers inserted into the recess; and a display unit displaying an image of the fingers photographed by the camera as an overlay on a control screen. | 09-02-2010 |
20100220054 | Wearable electrical apparatus - A wearable input device includes a pair of ring-shaped signal electrodes and a current sensor arranged in parallel in the direction of the axis of a finger. The current sensor is provided outside an area sandwiched between the signal electrodes. An alternating current signal is applied between the signal electrodes. When the top end of the finger with this device worn thereon is brought into contact with any other body site, a current flows through the measurement point of the current sensor. When the top end of the finger is not in contact with any other body site, no current flows through the measurement point of the current sensor. Based on the measured current, it is determined whether the finger is in contact with any other body site. A command is outputted to an external device according to the result of the determination. | 09-02-2010 |
20100220055 | MOBILE ANTENNA UNIT AND ACCOMPANYING COMMUNICATION APPARATUS - An antenna unit is provided with an inverted F-type antenna element provided with a feeding point and a ground connection point, and a non-feed antenna element configured so as to resonate with the inverted F-type antenna element through electrical coupling. In addition, the antenna unit may also be provided with a ground part which is grounded to the earth and connected to the ground connection point provided on one edge of the inverted F-type antenna element, and a resonance element, one edge of which is connected to the ground part, resonated by the non-feed antenna element through electrical coupling. | 09-02-2010 |
20100225576 | THREE-DIMENSIONAL INTERACTIVE SYSTEM AND METHOD - The present invention relates to a method for providing an intuitive interactive control object in stereoscope comprising the steps of: (a) providing a display capable of displaying in stereoscope; (b) providing a system capable of motion tracking; (c) tracking a visual signal motion performed by a user; (d) providing a stereoscopic image of a remote control, on said display in response to said signal performed by said user; (e) tracking user's motion aimed at interacting with said displayed stereoscopic image of said remote control; (f) analyzing said user's interactive motion; and (g) performing in accordance with said user's interactive motion. | 09-09-2010 |
20100225577 | Devices And Associated Hinge Mechanisms - A device comprising first and second housings arranged to swivel about a swivel axis, wherein the device is arranged such that relative turning of the first and second housings about the swivel axis in a first direction reveals a first device user operational area, and relative turning of the first and second housings in the second opposing direction reveals a second device user operational area. | 09-09-2010 |
20100225578 | Method for Switching Multi-Functional Modes of Flexible Panel and Calibrating the Same - A method for switching multi-functional modes and calibrating an electronic device and the electronic device using the method are disclosed. The method for switching multi-functional modes comprises: detecting at least one sensing device to identify a specific shape of a flexible panel of the electronic device; and matching the specific shape with the multi-functional modes according to a corresponding table so as to execute one specific functional mode. The corresponding table comprises a corresponding relationship between the specific shapes of the flexible panel and the specific functional modes, and a specific functional mode executed by the electronic device corresponds to the specific shape according to the corresponding relationship. | 09-09-2010 |
20100225579 | IMAGE SENSOR AND OPTICAL POINTING SYSTEM - Provided is an optical pointing system. The optical pointing system has an image sensing unit for receiving light and generating a plurality of first analog signals corresponding to a quantity of the received light in response to a reset signal and a pixel selection signal, at least one pixel for shutter control for receiving light, each of the pixels for shutter control generating a second analog signal corresponding to a quantity of the received light, a first comparator for comparing voltages of the plurality of first analog signals to generate a first comparison digital signal in response to a shutter control signal, an image processor for receiving the first comparison digital signal to calculate a movement value of the optical pointing system, and a shutter control unit for generating the shutter control signal based on the at least one second analog signal. | 09-09-2010 |
20100231503 | CHARACTER INPUT SYSTEM, CHARACTER INPUT METHOD AND CHARACTER INPUT PROGRAM - The grouping unit assigns an m-digit value expressed by an n-ary notation to each input candidate character one-for-one to classify the respective input candidate characters into n groups on a basis of each of the m digits. The group displaying unit causes the display device to display, collectively on a group basis, the input candidate characters classified on a digit basis. Among n selection keys corresponding to the respective groups, the input device has one key operated by a user to output information indicating which selection key is operated to the character structuring unit. The character structuring unit determines a character according to information input from the input device. | 09-16-2010 |
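The n-ary grouping above amounts to giving each candidate character an m-digit base-n code and resolving one digit per key press. A minimal sketch, assuming a flat candidate list and zero-based key indices (both illustrative assumptions):

```python
# Sketch of the grouping scheme in 20100231503: each candidate character gets
# an m-digit base-n index; one selection-key press per digit narrows the
# candidates to a single character. Function names are assumptions.

def build_groups(characters, n):
    """Assign each character an m-digit base-n code (m = minimum needed)."""
    m = 1
    while n ** m < len(characters):
        m += 1
    codes = {}
    for i, ch in enumerate(characters):
        digits = []
        for _ in range(m):
            digits.append(i % n)
            i //= n
        codes[ch] = tuple(reversed(digits))  # most significant digit first
    return m, codes

def select(characters, n, key_presses):
    """Resolve a character from one selection-key press per digit."""
    m, codes = build_groups(characters, n)
    assert len(key_presses) == m
    matches = [c for c, code in codes.items() if code == tuple(key_presses)]
    return matches[0] if matches else None

chars = "abcdefgh"                   # 8 candidates
print(select(chars, 2, [0, 1, 1]))   # base-2, 3 digits -> 'd' (index 3 = 011)
```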
20100231504 | HOTSPOTS FOR EYE TRACK CONTROL OF IMAGE MANIPULATION - The invention relates to a system for manipulating images or graphics where the manipulations to be performed are selected or activated using eye-tracking. A number of hotspots are arranged around and bordering a region of interest (ROI), each of which is associated with an image parameter adjustment (e.g. contrast or brightness). An eye-tracking device determines a fixation point of the user and the system automatically performs the associated adjustment whenever the fixation point falls within a hotspot. The close bordering of the hotspots around the ROI allows the user to keep focus on the ROI and only slightly shift his/her gaze to or towards a hotspot for adjusting an image parameter. Thereby, the adjustment can be controlled while still keeping the ROI in focus so that the effect of the adjustment can be followed simultaneously. | 09-16-2010 |
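The hotspot mechanism above can be sketched as a point-in-rectangle test per gaze sample, with each hotspot mapped to a parameter delta. The hotspot geometry and step sizes below are assumptions, not values from the abstract:

```python
# Illustrative sketch of 20100231504: hotspots border a region of interest,
# each tied to an image-parameter adjustment; whenever the gaze fixation point
# falls inside a hotspot the adjustment is applied. Geometry is assumed.

def inside(rect, point):
    x0, y0, x1, y1 = rect
    x, y = point
    return x0 <= x <= x1 and y0 <= y <= y1

# hotspot rectangle -> (parameter name, per-sample delta); assumed layout
HOTSPOTS = {
    (0, 0, 100, 10):   ("brightness", +1),   # strip above the ROI
    (0, 90, 100, 100): ("brightness", -1),   # strip below the ROI
}

def apply_fixation(params, fixation):
    """Adjust image parameters for one gaze sample; ROI fixations do nothing."""
    for rect, (name, delta) in HOTSPOTS.items():
        if inside(rect, fixation):
            params[name] += delta
    return params

print(apply_fixation({"brightness": 50}, (50, 5)))  # {'brightness': 51}
```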
20100231505 | INPUT DEVICE USING SENSORS MOUNTED ON FINGER TIPS - In order to allow key input without operation of a keyboard or allow click and drag operations by finger tips of the hands, the present invention is primarily characterized in that a user wears a glove ( | 09-16-2010 |
20100231506 | CONTROL OF APPLIANCES, KITCHEN AND HOME - The disclosed invention is generally in the field of control of appliances in the home, and in their networking and connectivity also with audio systems and internet sources and the integration of these elements in a connected manner. Preferred apparatus generally employs a video projection system and one or more TV cameras. Embodiments of the invention may be used to enhance the social interaction and enjoyment of persons in the kitchen and reduce the work of food preparation. The invention may be used in many rooms of the house, and contribute to the well-being of seniors and others living therein. | 09-16-2010 |
20100231507 | METHOD AND APPARATUS FOR PROVIDING CONTENT AND METHOD AND APPARATUS FOR DISPLAYING CONTENT - Provided are a method and apparatus for providing digital content and a method and apparatus for displaying digital content. In the displaying method, method of displaying content in a terminal, situational information including at least one of information regarding a user of the terminal and information regarding an external environment is collected, whether the collected situational information conforms to display conditions of the content is determined, and then, the content is selectively displayed based on a result of the determining. | 09-16-2010 |
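The conditional display step above is essentially a match between the collected situational information and per-content display conditions. A tiny sketch with assumed field names:

```python
# Sketch of the selective display in 20100231507: collected situational
# information (user and environment) is checked against a content item's
# display conditions, and the content is shown only when every condition
# holds. Field names are illustrative assumptions.

def should_display(situation, conditions):
    """True when all display conditions match the collected situation."""
    return all(situation.get(key) == value for key, value in conditions.items())

situation = {"location": "home", "time_of_day": "evening"}
print(should_display(situation, {"location": "home"}))    # True
print(should_display(situation, {"location": "office"}))  # False
```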
20100231508 | Systems and Methods for Using Multiple Actuators to Realize Textures - Systems and methods for using multiple actuators to realize textures are disclosed. For example, one disclosed system includes: a first actuator configured to receive a first haptic signal and output a first haptic effect based at least in part on the first haptic signal; a second actuator configured to receive a second haptic signal and output a second haptic effect based at least in part on the second haptic signal; and a processor configured to: determine the first haptic effect and the second haptic effect, the first and second haptic effects configured when combined to output a texture; and transmit the first haptic signal to the first actuator and transmit the second haptic signal to the second actuator. | 09-16-2010 |
20100231509 | Sterile Networked Interface for Medical Systems - One embodiment of a sterile networked interface system is provided comprising a hand-held surgical tool and a data processing system. The surgical tool includes a sensor for sensing a physical variable related to the surgery, a wireless communication unit to transmit the physical variable to the data processing system, and a battery for powering the hand-held surgical tool. The surgical tool sends the physical variable and orientation information responsive to a touchless gesture control and predetermined orientation of the surgical tool. Other embodiments are disclosed. | 09-16-2010 |
20100231510 | DUAL FILM LIGHT GUIDE FOR ILLUMINATING DISPLAYS - A front light guide panel including a plurality of embedded surface features is provided. The front light panel is configured to deliver uniform illumination from an artificial light source disposed at one side of the front light panel to an array of display elements located behind the front light guide while allowing for the option of illumination from ambient lighting transmitted through the light guide panel. The embedded surface relief features create air pockets within the light guide panel. Light incident on the side surface of the light guide propagates through the light guide until it strikes an air/light-guide-material interface at one of the air pockets. The light is then turned by total internal reflection through a large angle such that it exits an output face disposed in front of the array of display elements. | 09-16-2010 |
20100238107 | Information Processing Apparatus, Information Processing Method, and Program - There is provided an information processing apparatus, including: a display panel for displaying at least one object in a selected state or in an unselected state; a detection area setting unit for setting, per object on the display panel, a first detection area covering a display area of the object and a second detection area covering the first detection area and being larger than the first detection area; an operating tool detecting unit for detecting an operating tool which is in proximity to the display panel; and a state managing unit for changing the object into the selected state when the operating tool is detected within the first detection area of the object in the unselected state, and for changing the object into the unselected state when the operating tool is not detected within the second detection area of the object in the selected state. | 09-23-2010 |
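The two-area scheme above is a selection hysteresis: entering a small inner area selects the object, and only leaving a larger outer area deselects it. A minimal sketch with assumed rectangles and names:

```python
# Sketch of the two-area hysteresis in 20100238107: selection happens when the
# operating tool enters a small first area, deselection only when it leaves a
# larger second area containing the first. Rectangles are assumptions.

def contains(rect, p):
    x0, y0, x1, y1 = rect
    return x0 <= p[0] <= x1 and y0 <= p[1] <= y1

class HysteresisObject:
    def __init__(self, first_area, second_area):
        self.first = first_area    # covers the object's display area
        self.second = second_area  # larger area covering the first
        self.selected = False

    def update(self, pointer):
        if not self.selected and contains(self.first, pointer):
            self.selected = True   # entered the inner detection area
        elif self.selected and not contains(self.second, pointer):
            self.selected = False  # left the outer detection area
        return self.selected

obj = HysteresisObject((40, 40, 60, 60), (30, 30, 70, 70))
print([obj.update(p) for p in [(50, 50), (65, 65), (80, 80)]])
# [True, True, False]: a small drift outside the object keeps it selected
```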
20100238108 | LIGHT-TACTILITY CONVERSION SYSTEM, AND METHOD FOR PROVIDING TACTILE FEEDBACK - A light-tactility conversion system is provided which includes a light emitting device including an illumination unit capable of emitting light at a same time to a plurality of illumination areas in different illumination patterns, and an illumination control unit for controlling the illumination unit and making the illumination unit project an image, and also for controlling the illumination patterns in units of pixels of the projected image and making the illumination unit emit light to specific illumination areas in specific illumination patterns, and a vibration device including an illumination pattern detection unit for detecting an illumination pattern of light received from the light emitting device, and a vibration control unit for generating a vibration pattern corresponding to the illumination pattern detected by the illumination pattern detection unit and vibrating an oscillator in the vibration pattern. | 09-23-2010 |
20100238109 | USER INTERFACE FOR SET TOP BOX - A method for control comprises a set top box receiving coordinates from a touch sensing screen. The coordinates are interpreted for controlling the set top box, and in accordance with the interpreted coordinates an action is performed. A further method for control comprises a set top box receiving a signal representative of displacement. A control function is determined from the displacement representative signal and the control function is activated. In accordance with the control function a signal is formed for communication. | 09-23-2010 |
20100238110 | LOCOMOTION INTERFACE DEVICE FOR INVOLVING BIPEDAL MOVEMENT IN CONTROL OVER COMPUTER OR VIDEO MEDIA - A locomotion interface includes a first section for a user contact with lower extremities and a second section proximate to the first section. The second section includes a first action region for the user to contact and move a lower extremity over. The locomotion interface also includes a first plurality of sensors for detecting this motion in the vicinity of the first action region. The locomotion interface typically includes a second action region and a second plurality of sensors. During operation the user contacts and moves a lower extremity over the second action region. The second plurality of sensors is positioned to detect this motion in the vicinity of the second action region. | 09-23-2010 |
20100245230 | HANDWRITING RECOGNITION IN ELECTRONIC DEVICES - The present invention provides a method of inputting characters into a handheld device, comprising steps of: reading handwriting information; recognizing said handwriting information in one active recognition mode and at least one inactive recognition mode; displaying at least one character candidate obtained in said active recognition mode and at least one character candidate obtained in said at least one inactive recognition mode; and inputting into said handheld device a desired character candidate selected by a user among said character candidates being displayed. The present invention also provides a corresponding apparatus for inputting characters into a handheld device, and a related handheld device. A user no longer needs to designate handwriting recognition modes, and recognition accuracy is greatly improved. | 09-30-2010 |
20100245231 | INPUT DEVICE - Provided is an input device which has excellent operability by not requiring a finger to be released from a key during operation requiring the finger to shift and preventing the finger from touching a plurality of keys at the same time. A key ( | 09-30-2010 |
20100245232 | Handheld Computer Interface With Haptic Feedback - A handheld computer interface includes an enclosure, a mass coupled to the enclosure, and an actuator coupled to the mass to change a position of the mass relative to the enclosure. When the actuator receives a signal indicating a change in the center of mass of the interface, it changes the position of the mass. | 09-30-2010 |
20100245233 | MOVING AN OBJECT WITHIN A VIRTUAL ENVIRONMENT - A method of moving an object within a virtual environment, the method comprising: providing a reference line within the virtual environment; based on a position of the object relative to the reference line, determining a target position for the object within the virtual environment; and controlling the object so as to guide the object within the virtual environment towards the target position. | 09-30-2010 |
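One way to read the reference-line guidance above is to take the target as the object's projection onto the line and move a bounded step toward it each update. The projection choice and step size are assumptions, since the abstract only says the target depends on the object's position relative to the line:

```python
# Hypothetical sketch of the guidance step in 20100245233: target position =
# projection of the object onto the reference line; the object is guided a
# bounded step toward it per update. Projection and step size are assumed.
import math

def step_toward_line(pos, line_a, line_b, max_step):
    """Move pos one bounded step toward its projection on line a-b."""
    ax, ay = line_a; bx, by = line_b; px, py = pos
    dx, dy = bx - ax, by - ay
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    tx, ty = ax + t * dx, ay + t * dy          # target: projection onto line
    vx, vy = tx - px, ty - py
    dist = math.hypot(vx, vy)
    if dist <= max_step:
        return (tx, ty)
    return (px + vx / dist * max_step, py + vy / dist * max_step)

print(step_toward_line((0.0, 4.0), (0.0, 0.0), (10.0, 0.0), 1.0))  # (0.0, 3.0)
```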
20100245234 | Portable Electronic Device with Low Dexterity Requirement Input Means - A handheld-size electronic device is provided with a display screen, or front housing or bezel surrounding the display screen, that is moveably mounted to the remainder of the electronic device. Detectors are provided for detecting movement of the display screen, or surrounding front housing part, and a microprocessor coupled to the detectors controls the electronic device based on that movement. This mode of user input allows users to control the electronic device with a quick hand motion and without requiring the level of focus that is typically required to operate the small buttons of a wireless device. This mode of user input can be used for various purposes including, for example, controlling digital music playback. | 09-30-2010 |
20100245235 | ELECTRONIC DEVICE WITH VIRTUAL KEYBOARD FUNCTION - An electronic device includes a body, a rotatable member, a projecting unit configured for projecting an image, a sensing unit and a processing unit. The body includes a connecting end, and a plurality of conductive contacts is formed at the connecting end. The rotatable member is rotatably connected to the connecting end. The rotating member includes a conductive ball configured for being electrically connected to one of the plurality of the conductive contacts, and the plurality of conductive contacts are formed along a displacement path of the conductive ball. The sensing unit is configured for sensing interactions at specific locations corresponding to the projected image. The processing unit stores a plurality of images, and is configured for controlling the projecting unit to project a corresponding image according to the conductive contact which is electrically connected to the conductive ball. | 09-30-2010 |
20100245236 | COMPUTER-READABLE STORAGE MEDIUM AND INFORMATION PROCESSING APPARATUS - When a gravity center position enters a right input area, a motion of a character in a virtual game space pushing down the right pedal of a bicycle is started, and the bicycle is accelerated. Further, a reference position is moved so as to gradually approach the gravity center position, and the boundary between a non-input area and a left input area is also moved so as to gradually approach the gravity center position in conjunction with the reference position. When the gravity center position enters the left input area, a motion of the character in the virtual game space pushing down the left pedal of the bicycle is started, and the bicycle is accelerated. Further, the reference position is moved so as to gradually approach the gravity center position, and the boundary between the non-input area and the right input area is also moved so as to gradually approach the gravity center position in conjunction with the reference position. | 09-30-2010 |
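The adaptive boundary above can be sketched as a reference position that drifts toward the centre of gravity after each pedal event, so the opposite boundary follows the player's posture. The gain and dead zone below are illustrative assumptions:

```python
# Rough sketch of 20100245236: when the centre of gravity enters the right
# (or left) input area a pedal stroke fires and the bicycle accelerates; the
# reference position then approaches the gravity centre, moving the opposite
# boundary with it. Gain and dead-zone values are assumptions.

def pedal_step(state, gravity_x, gain=0.5, dead_zone=2.0):
    """Advance one frame; return the pedal event fired, if any."""
    event = None
    if gravity_x > state["reference"] + dead_zone:
        event = "right_pedal"          # accelerates the bicycle
    elif gravity_x < state["reference"] - dead_zone:
        event = "left_pedal"
    if event:
        # reference (and with it the boundaries) approaches the gravity centre
        state["reference"] += gain * (gravity_x - state["reference"])
    return event

state = {"reference": 0.0}
print(pedal_step(state, 10.0), state["reference"])   # right_pedal 5.0
print(pedal_step(state, -10.0), state["reference"])  # left_pedal -2.5
```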
20100245237 | Virtual Reality Environment Generating Apparatus and Controller Apparatus - The present invention provides a virtual reality environment generating apparatus including a non-base type interface configured to allow a user to haptically touch virtual objects and game characters. The virtual reality environment generating apparatus includes a content creating device | 09-30-2010 |
20100245238 | INPUT DEVICE AND METHOD, INFORMATION PROCESSING DEVICE AND METHOD, INFORMATION PROCESSING SYSTEM, AND PROGRAM - An input device includes an operating unit that a user grasps and operates in a three-dimensional free space in order to remotely operate an information processing device; and a transmitting unit to transmit a signal for a first gesture in the free space of the operating unit to set a mode, and a signal for a second gesture in the free space of the operating unit which differs from the first gesture to execute processing in the mode set based on the first gesture. | 09-30-2010 |
20100245239 | PRESSURE SENSING CONTROLLER - Embodiments of a pressure sensing controller implement grip and pressure sensing, as well as standard input control actuation, to provide control input by a user. The disclosed grip and pressure sensing control can be implemented in hand-held game controllers, control devices for appliances, cellular telephones, and any other type of device that requires control input. In the case of an existing control device with predefined control output, user programming of input settings to define command extensions allows extended grip and pressure control input to be combined with the device's existing control outputs. | 09-30-2010 |
20100245240 | Electronic Device with a Display Unit Being Movable in Relation to a Base Unit - The invention relates to an electronic device and a method for controlling functionality of such an electronic device in relation to a display unit. The electronic device includes a base unit ( | 09-30-2010 |
20100245241 | Apparatus and method for controlling functions of mobile terminal - A portable terminal includes an apparatus for controlling an operation of a mobile terminal. More particularly, an apparatus and a method sets a menu to control a mobile terminal in a specific region of an image stored in advance. A relevant function is performed by selecting the specific region of the image in which the menu has been set in the mobile terminal, thereby performing the relevant function fast and conveniently without separately selecting a menu (entering a menu). The apparatus includes a controller. The controller selects a specific region from an output image, sets a menu for controlling a function of the mobile terminal at the specific region, and stores the image where the menu has been set. | 09-30-2010 |
20100253617 | Portable Electronic Apparatus and Control Method of Portable Electronic Apparatus - A portable electronic apparatus and a control method of the portable electronic apparatus with excellent operability capable of surely reflecting a user's intended input operation are provided. The portable electronic apparatus and the control method comprise: a first sensor group G | 10-07-2010 |
20100253618 | DEVICE AND METHOD FOR DISPLAYING AN IMAGE - A projector producing an imaginary input plane with high operability is provided. A projector according to an embodiment projects a VUI screen picture onto a desk, and projects a main projection screen picture to a wall. The projector includes a light receiving element. The light receiving element is arranged in a position where light emitted toward the desk (VUI screen picture) and reflected (or scattered) by an object near the desk enters. The projector calculates a position of the object based on light sensing timing by the light receiving element and light scan positions at various points in time. The projector changes a projected screen picture when it determines that the object is simultaneously in contact with a plurality of portions of the VUI screen picture and at least one of the contact positions moves. | 10-07-2010 |
20100259471 | CONTROL DEVICE, HEAD-MOUNT DISPLAY DEVICE, PROGRAM, AND CONTROL METHOD - It is possible to provide a technique for accurately performing an operation desired by a user. A user's head operation is identified according to information detected by a head motion detection unit. A process desired by the user is executed according to an angular velocity of the head motion. Moreover, the technique uses a control unit which can accurately execute an operation by the user's head operation without reflecting the return motion of the user's head in the process. The control unit executes a process for a start and an end of each process corresponding to the detected angular velocity according to a predetermined threshold value. | 10-14-2010 |
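The start/end thresholding above, with the head's return motion kept out of the process, can be sketched as a one-shot trigger that re-arms only once the head settles. The threshold values are assumptions:

```python
# Sketch of 20100259471: a process starts when head angular velocity exceeds
# a threshold, and the slower return swing of the head is suppressed so it
# does not retrigger the process. Thresholds are illustrative assumptions.

def detect_commands(angular_velocities, start_threshold=2.0, rest_threshold=0.2):
    """Emit one command per fast head motion; ignore the slow return swing."""
    commands = 0
    armed = True
    for w in angular_velocities:
        if armed and abs(w) >= start_threshold:
            commands += 1
            armed = False                 # suppress until the head settles
        elif abs(w) < rest_threshold:     # near rest: re-arm for next gesture
            armed = True
    return commands

# one quick nod (fast out, slow return) counts as a single command
print(detect_commands([0.0, 3.5, 1.0, -1.0, 0.1, 0.0]))  # 1
```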
20100259472 | INPUT DEVICE - An input device ( | 10-14-2010 |
20100259473 | USER INTERFACE DEVICE, USER INTERFACE METHOD, AND RECORDING MEDIUM - A user interface device ( | 10-14-2010 |
20100259474 | ENHANCED HANDHELD SCREEN-SENSING POINTER - Enhanced handheld screen-sensing pointing, in which a handheld device captures a camera image of one or more fiducials rendered by a display device, and a position or an angle of the one or more fiducials in the captured camera image is determined. A position on the display device that the handheld device is aimed towards is determined based at least on the determined position or angle of the one or more fiducials in the camera image, and an application is controlled based on the determined position on the display device. | 10-14-2010 |
20100265169 | Helmet Comprising Visor Position Detection and Associated Helmet Position Detection - The general field of the invention is that of helmet position detection systems for aircraft including a helmet comprising a substantially spherical mobile visor that is able to occupy a position of use and a stowage position, the visor being arranged in front of the pilot's eyes in the position of use. The helmet according to the invention comprises detection means for detecting the position of the visor and the helmet position detection system comprises means making it possible to introduce an angular correction due to the mobile visor in the orientation measurements when the visor is detected in the position of use by the detection means. The detection means are preferably optical sensors. | 10-21-2010 |
20100265170 | INPUT DEVICE, INFORMATION TERMINAL PROVIDED WITH THE SAME AND INPUT METHOD - In an input device used in mobile apparatuses in which portability is considered important and in mobile apparatuses in which a display unit such as a display is considered important, even if an input unit of the apparatus is made small in size, the input device is configured so that an input can be carried out without requiring skill of the operator. The input device is provided with the input unit including a detecting unit that, when a part of a living body in contact with the input device is pushed, detects a force transmitted through the living body and outputs detection data, and an input information specifying module that, when receiving the detection data, refers to stored data at a database, specifies a position where the living body is pushed, and outputs data allotted to the position as input information of electronic data. | 10-21-2010 |
20100271295 | Force feedback system including multi-tasking graphical host environment and interface device - A force feedback system provides components for use in a force feedback system including a host computer and a force feedback interface device. An architecture for a host computer allows multi-tasking application programs to interface with the force feedback device without conflicts. One embodiment of a force feedback device provides both relative position reporting and absolute position reporting to allow great flexibility. A different device embodiment provides relative position reporting, allowing maximum compatibility with existing software. Information such as ballistic parameters and screen size sent from the host to the force feedback device allow accurate mouse positions and cursor positions to be determined in the force feedback environment. Force feedback effects and structures are further described, such as events and enclosures. | 10-28-2010 |
20100271296 | USER INTERFACE POWERED VIA AN INDUCTIVE COUPLING - An operating machine, such as a medical machine or a dialysis machine, includes a housing for operating components of the machine and a moveable user interface or display for viewing and entering information concerning operation of the machine. Signals concerning operating information are wirelessly transmitted between the machine and the display using one of several techniques. Power is also transmitted wirelessly from the operating machine to the screen, or from a separate power source to the display. The wireless signals may be transmitted via induction, radio, infrared or optical means. | 10-28-2010 |
20100271297 | NON-CONTACT TOUCHPAD APPARATUS AND METHOD FOR OPERATING THE SAME - A non-contact touchpad method uses an image sensor ( | 10-28-2010 |
20100271298 | HAPTIC AUTOMATED COMMUNICATION SYSTEM - A haptic communication system having a range of sensors embedded with an operator's attire. Data collected by the sensors is processed by a computing device local to the operator and is communicated via a haptic modality in real-time to other team members and robotic assets in the system. | 10-28-2010 |
20100271299 | SELECTIVE INPUT SYSTEM AND PROCESS BASED ON TRACKING OF MOTION PARAMETERS OF AN INPUT OBJECT - A selective input system and associated method are provided which track the motion of a pointing device over a region or area. The pointing device can be a touchpad, a mouse, a pen, or any device capable of providing two or three-dimensional location. The region or area is preferably augmented with a printed or actual keyboard/pad. Alternatively, a representation of the location of the pointing device over a virtual keyboard/pad can be dynamically shown on an associated display. The system identifies selections of items or characters by detecting parameters of motion of the pointing device, such as length of motion, a change in direction, a change in velocity, and/or a lack of motion at locations that correspond to features on the keyboard/pad. The input system is preferably coupled to a text disambiguation system such as a T9® or Sloppytype™ system, to improve the accuracy and usability of the input system. | 10-28-2010 |
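Of the motion parameters listed above, the lack-of-motion trigger is the simplest to sketch: a selection registers when the pointer dwells within a tolerance for a minimum number of samples. Dwell length and tolerance below are assumptions:

```python
# Sketch of one motion parameter from 20100271299: a selection is registered
# when the pointing device pauses (lack of motion) over a key location. The
# abstract also lists direction and velocity changes as triggers; only the
# pause case is shown. Dwell and tolerance values are assumptions.

def detect_pauses(trace, min_dwell=3, tol=1.0):
    """Return positions where the pointer stayed within tol for min_dwell samples."""
    selections = []
    run_start = 0
    for i in range(1, len(trace) + 1):
        moved = i < len(trace) and (
            abs(trace[i][0] - trace[run_start][0]) > tol or
            abs(trace[i][1] - trace[run_start][1]) > tol)
        if moved or i == len(trace):
            if i - run_start >= min_dwell:
                selections.append(trace[run_start])
            run_start = i
    return selections

trace = [(0, 0), (0.2, 0.1), (0.3, 0.2), (5, 5), (9, 9)]
print(detect_pauses(trace))  # [(0, 0)]
```

A full system would map each pause position to the nearest keyboard/pad feature and pass the resulting character stream to a disambiguation engine.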
20100277411 | USER TRACKING FEEDBACK - Technology is presented for providing feedback to a user on an ability of an executing application to track user action for control of the executing application on a computer system. A capture system detects a user in a capture area. Factors in the capture area and the user's actions can adversely affect the ability of the application to determine if a user movement is a gesture which is a control or instruction to the application. One example of such factors is a user being out of the field of view of the capture system. Some other factor examples include lighting conditions and obstructions in the capture area. Responsive to a user tracking criteria not being satisfied, feedback is output to the user. In some embodiments, the feedback is provided within the context of an executing application. | 11-04-2010 |
20100277412 | Camera Based Sensing in Handheld, Mobile, Gaming, or Other Devices - Method and apparatus are disclosed to enable rapid TV camera and computer based sensing in many practical applications, including, but not limited to, handheld devices, cars, and video games. Several unique forms of social video games are disclosed. | 11-04-2010 |
20100283722 | ELECTRONIC APPARATUS INCLUDING A COORDINATE INPUT SURFACE AND METHOD FOR CONTROLLING SUCH AN ELECTRONIC APPARATUS - An electronic apparatus includes a coordinate input surface on which at least a finger of a user can be placed, a first position estimating unit and a second position obtaining unit. The first position estimating unit is for estimating the position, here referred to as first position, of at least one object placed on the coordinate input surface. The second position obtaining unit is for obtaining an estimation of the position, here referred to as second position, at which a user is looking on the coordinate input surface. The apparatus is configured to be controlled at least based on the combination of the estimated first position and the estimated second position. The invention also relates to a system including such an apparatus, a method for controlling such an apparatus, and a computer program therefor. | 11-11-2010 |
20100283723 | STORAGE MEDIUM STORING INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD - A game apparatus includes a CPU, the CPU judges a motion of a player on the basis of a cycle of a load value input from a load controller, and selectively displays an animation of a player object according to the motion. Furthermore, the CPU controls a moving amount or a moving velocity of the player object on the basis of the load value input from the load controller, and controls a moving direction of the player object according to a barycentric position of the player detected by the load controller. | 11-11-2010 |
20100283724 | DAMPING DEVICE CAPABLE OF PROVIDING INCREASED STIFFNESS - Damping device to impose a reaction to the displacement of a manual operating device ( | 11-11-2010 |
20100283725 | MANIPULATING DEVICE AND PORTABLE ELECTRONIC APPARATUS - A manipulating device suitable for connecting to a body of a portable electronic apparatus is provided, wherein the body has a first connecting unit. The manipulating device includes a manipulating body, a pivotal shaft, a fastener and a second connecting unit. The manipulating body has a button unit including a plurality of keys. The pivotal shaft connects the manipulating body. The fastener connects the pivotal shaft and is pivoted on the manipulating body through the pivotal shaft and is connected to the body. The second connecting unit is disposed at the fastener. When the fastener is connected to the portable electronic apparatus, the second connecting unit connects the first connecting unit. | 11-11-2010 |
20100283726 | USER INTERFACES AND ASSOCIATED APPARATUS AND METHODS - Apparatus for a portable electronic device, the apparatus arranged to detect substantially simultaneous user input from device user interface elements arranged to detect opposing user input, and to provide signalling, upon the detection of said opposing user input, for transferring electronic information content associated with the electronic device to a remote apparatus which is associatable with the electronic device. Apparatus for a portable electronic device, the apparatus arranged to detect movement of the device, relative to a remote apparatus which is associatable with the electronic device, as user input from the device, and to provide signalling, upon the detection of said relative movement, for transferring electronic information content associated with the electronic device to the remote apparatus which is associatable with the electronic device. | 11-11-2010 |
20100283727 | SYSTEM AND METHOD FOR SHAPE DEFORMATION AND FORCE DISPLAY OF DEVICES - Various systems, devices, and methods for shape deformation of a haptic deformation display device are provided. For example, the haptic deformation display device may receive an input signal when the shape of the haptic deformation display device is in a first shape configuration. In response to the input signal, the haptic deformation display device may activate an actuator of the haptic deformation display device. The actuator may move a deformation component of the haptic deformation display device. The deformation component may at least partially defining a shape of the haptic deformation display device, thereby causing the shape of the haptic deformation display device to deform into a second shape configuration different from the first shape configuration. The second shape configuration may be substantially maintained. | 11-11-2010 |
20100283728 | OBJECT, METHOD AND SYSTEM FOR TRANSMITTING INFORMATION TO A USER - An object ( | 11-11-2010 |
20100289737 | PORTABLE ELECTRONIC APPARATUS, OPERATION DETECTING METHOD FOR THE PORTABLE ELECTRONIC APPARATUS, AND CONTROL METHOD FOR THE PORTABLE ELECTRONIC APPARATUS - A portable electronic apparatus includes: plural sensor elements L | 11-18-2010 |
20100289738 | Stone, Portable Hand Held Device for Inputting Characters Into a Computer, Cell Phone, or any Programmable Device - This is an input device that allows any type of character to be quickly and easily entered into any computing device with one hand, while being held by the same hand. The characters are arranged in arrays that can be displayed in a format of rows and columns. The user maneuvers the cursor through the tables using wheel type directional switches that can index continuously if desired. Normally, the thumb moves one wheel, or a ball, which can move the cursor over both rows and columns, and the index finger moves the other wheel or ball, which changes the characters in the rows or columns, allowing access to an unlimited number of characters. When the user arrives at the desired character, a switch controlled by one of the other fingers on the same hand then enters that character into the device. Double clicking, or pressing the switch quickly twice, can make necessary adjustments to the character as desired, such as capitalizing a letter or adding an umlaut. This device can also be used for accessing window style icons, allowing the user to navigate through the operating system. | 11-18-2010 |
20100289739 | STORAGE MEDIUM STORING INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD - A game apparatus saves, in accordance with an instruction of a user, a photographing image taken by an inward camera or an outward camera, or a handwriting image of a handwritten note inputted using a touch panel. Moreover, in accordance with an instruction of the user, position information is saved together with the photographing image or the handwriting image. That is, a photographing place or a creation place of the handwritten note is registered on a map. When the photographing image or the handwriting image is reproduced, an image of a landmark set near a position on the map indicated by the position information is reproduced before the photographing image or the handwriting image. | 11-18-2010 |
20100295769 | Device for Controlling an External Unit - A device for controlling a click command controlled external unit including a portable head mounted frame, a click command detector mounted on the head mounted frame and adapted to sense tension changes of at least one muscle in the face of the user in order to detect when the user provides a click command, and a click command transmitter adapted to transmit information about detected click commands to the external unit. | 11-25-2010 |
20100295770 | CONTROL METHOD FOR CONTROLLING REMOTE COMPUTER - The present invention relates to a control method of efficiently controlling a computer at a remote place using a limited input/output device and a remote communication terminal with limited memory capacity even in a communication network environment where the data transmission rate is limited and the transmission cost is high. The control method in accordance with the present invention includes an input method optimized for a limited input device of a terminal, a screen display method optimized for a small screen of a terminal, and a screen data transmission function optimized for a communication network speed, a transmission cost, and a limited memory capacity of a terminal. | 11-25-2010 |
20100295771 | CONTROL OF DISPLAY OBJECTS - Disclosed herein are systems and methods for controlling display objects. Particularly, a body part of a user may move, and the movement is detected by a capture device. The capture device may capture images or frames of the body part at different times. Based on the captured frames, velocities of the body part may be determined or at least estimated at the different times. A blend velocity for the body part may be determined based on the different velocities. Particularly, for example, the blend velocity may be an average of the velocities of the body part over a period of time. A display object may then be controlled or moved in accordance with the blend velocity. For example, an avatar's body part may be moved in the same direction as a recent captured frame of the user's body part, and at the blend velocity. | 11-25-2010 |
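The blend-velocity scheme in the abstract above (average the per-frame velocities of a tracked body part, then drive the display object at that average) can be sketched as follows. This is a minimal illustration, not the patented implementation; the function names and the simple arithmetic mean are assumptions.

```python
def frame_velocities(positions, dt):
    """Estimate per-frame velocity of a tracked body part from
    successive captured positions (x, y) taken dt seconds apart."""
    return [((x1 - x0) / dt, (y1 - y0) / dt)
            for (x0, y0), (x1, y1) in zip(positions, positions[1:])]

def blend_velocity(velocities):
    """Blend velocity: the average of the per-frame velocities,
    smoothing capture noise before driving the display object."""
    n = len(velocities)
    vx = sum(v[0] for v in velocities) / n
    vy = sum(v[1] for v in velocities) / n
    return vx, vy

def move_object(pos, blend, dt):
    """Advance a display object (e.g. an avatar's hand) at the blend
    velocity for one time step."""
    return pos[0] + blend[0] * dt, pos[1] + blend[1] * dt
```

Averaging over a window trades responsiveness for stability: a longer window smooths sensor jitter but lags fast motions.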
20100295772 | Electronic Device with Sensing Assembly and Method for Detecting Gestures of Geometric Shapes - A method for detecting a gesture in a geometric shape and controlling an electronic device includes providing a sensing assembly including at least one photoreceiver and a plurality of phototransmitters, wherein each phototransmitter emits infrared light away from the electronic device about a corresponding central transmission axis, wherein each central transmission axis is oriented in a different direction with respect to the others; and controlling the emission of infrared light by each of the phototransmitters during each of a plurality of time periods during movement of an external object in a geometric shape relative to the electronic device. For each of the plurality of phototransmitters and for each of the plurality of sequential time periods, a corresponding measured signal is generated which is indicative of a respective amount of infrared light which originated from that phototransmitter during that time period and was reflected by the external object prior to being received by the photoreceiver. The measured signals are evaluated over time to identify the geometric shape; and the electronic device is controlled in response to the identification of the geometric shape. | 11-25-2010 |
20100295773 | ELECTRONIC DEVICE WITH SENSING ASSEMBLY AND METHOD FOR INTERPRETING OFFSET GESTURES - A method for controlling an electronic device includes providing as part of the electronic device a display screen for displaying content and a sensing assembly including at least one photoreceiver and a plurality of phototransmitters, wherein each phototransmitter is positioned to emit infrared light away from the electronic device about a corresponding central transmission axis, wherein each central transmission axis is oriented in a different direction with respect to the others. Emission of infrared light by each of the phototransmitters is controlled during each of a plurality of time periods as an external object moves in a first specified pattern of movement and then moves in a second specified pattern of movement which is offset from a generally centered position with respect to the sensing assembly, and measured signals are generated. The measured signals are evaluated to identify the first specified pattern of movement of the object, to detect a reference offset location corresponding to an end of the first specified pattern of movement of the object, and to determine, for each of a group of time periods when the object is moving in the second specified pattern of movement, a corresponding location of the object during that time period. A centering operation is performed in response to the identification of the first specified pattern of movement, wherein the centering operation moves an indicator to an initial predetermined reference location on the display screen, wherein the predetermined reference location is then associated with the reference offset location; and sequential locations of the indicator on the display screen are controlled in accordance with the corresponding determined locations of the object relative to the reference offset location. | 11-25-2010 |
20100295774 | Method for Automatic Mapping of Eye Tracker Data to Hypermedia Content - A system for automatic mapping of eye-gaze data to hypermedia content utilizes high-level content-of-interest tags to identify regions of content-of-interest in hypermedia pages. Users' computers are equipped with eye-gaze tracker equipment that is capable of determining the user's point-of-gaze on a displayed hypermedia page. A content tracker identifies the location of the content using the content-of-interest tags, and a point-of-gaze to content-of-interest linker directly maps the user's point-of-gaze to the displayed content-of-interest. A visible-browser-identifier determines which browser window is being displayed and identifies which portions of the page are being displayed. Test data from plural users viewing test pages is collected, analyzed and reported. | 11-25-2010 |
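The core point-of-gaze-to-content linking described above is a coordinate hit test against tagged regions. A minimal sketch follows; the region-tag names, rectangle layout, and the scroll-offset handling for the visible portion of the page are illustrative assumptions.

```python
def map_gaze_to_content(gaze, regions, scroll_offset=(0, 0)):
    """Map an on-screen point-of-gaze to a tagged content-of-interest
    region. `regions` maps tag -> (left, top, width, height) in page
    coordinates; `scroll_offset` converts screen coordinates to page
    coordinates for the currently visible browser viewport."""
    px = gaze[0] + scroll_offset[0]
    py = gaze[1] + scroll_offset[1]
    for tag, (left, top, w, h) in regions.items():
        if left <= px < left + w and top <= py < top + h:
            return tag
    return None   # gaze fell outside all tagged content
```

The scroll offset plays the role of the visible-browser-identifier: the same page-coordinate regions can be reused no matter which portion of the page is on screen.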
20100295775 | INPUT SYSTEM AND METHOD FOR ELECTRONIC DEVICE - An input system and method for enhancing an input interface of an electronic device includes a stylus with a signal transmitting module that sends an interrupt signal to an interrupt module of the electronic device when triggered. The interrupt module identifies the interrupt signal and relays the interrupt signal accordingly in order to perform operations of the electronic device. | 11-25-2010 |
20100295776 | ePaper Stamp - A method and apparatus are provided for stamping a piece of ePaper. A grid is positioned within a selected distance of a first side of the piece of ePaper. A grounding pin conductively connects to a conductive backing plate located on a second side of the piece of ePaper. The grounding pin completes a voltage path from the grid through the piece of ePaper to the conductive backing plate. A voltage is supplied to the grid, and supplying the voltage changes the appearance of the piece of ePaper to form a stamped image. | 11-25-2010 |
20100302136 | Method and apparatus for displaying three-dimensional stereo images viewable from different angles - A method and apparatus for displaying three-dimensional stereo images using a screen that displays multiple images, each representing objects seen from a particular angle, and a mask placed in front of the screen, containing holes or transparent areas, that allows multiple images to be viewed simultaneously, but only one image from any given direction. | 12-02-2010 |
20100302137 | Touch Sensitive Display Apparatus using sensor input - Described herein is a system that includes a receiver component that receives gesture data from a sensor unit that is coupled to a body of a gloveless user, wherein the gesture data is indicative of a bodily gesture of the user, wherein the bodily gesture comprises movement pertaining to at least one limb of the gloveless user. The system further includes a location determiner component that determines location of the bodily gesture with respect to a touch-sensitive display apparatus. The system also includes a display component that causes the touch-sensitive display apparatus to display an image based at least in part upon the received gesture data and the determined location of the bodily gesture with respect to the touch-sensitive display apparatus. | 12-02-2010 |
20100302138 | METHODS AND SYSTEMS FOR DEFINING OR MODIFYING A VISUAL REPRESENTATION - A system may track a user's motions or gestures performed in a physical space and map them to a visual representation of the user. The user's gestures may be translated to a control in a system or application space, such as to open a file or to execute a punch in a punching game. Similarly, the user's gestures may be translated to a control in the system or application space for making modifications to a visual representation. A visual representation may be a display of a virtual object or a display that maps to a target in the physical space. In another example embodiment, the system may track the target in the physical space over time and apply modifications or updates to the visual representation based on the history data. | 12-02-2010 |
20100302139 | METHOD FOR USING ACCELEROMETER DETECTED IMAGINED KEY PRESS - A portable apparatus comprising input means arranged to receive user input comprising one or more taps on the portable apparatus. The apparatus is able to detect the taps on the portable apparatus and to determine at least one location of the taps on the portable apparatus. The apparatus is further able to define an imaginary key on the apparatus at the determined location. | 12-02-2010 |
20100302140 | Operation Device - Provided is an operation device to be held by a user with one hand when used. The operation device includes: a recessed portion formed at a position at which at least one of a thumb and fingers is placed when the user holds the operation device; and a main button which is disposed at a bottom of the recessed portion and has a top surface adjacent to a rim portion forming a side surface of the recessed portion. | 12-02-2010 |
20100302141 | Display and Interaction Environment for Mobile Devices - A computing device attached to a full-sized display and one or more user input devices supports a mobile device mating environment. Software modules running on a mobile device may interface with custom firmware or software modules on the computing device to support using the display, keyboard, mouse, and other user input devices of the computing device. The display of the computing device may also be leveraged to display screens and notifications generated by the mobile device. The user input devices of the computing device may be utilized to simplify interaction between the user and the mobile device. This operation may be selected instead of, or in addition to, operating the computing device according to its traditional functions associated with a primary operating system and associated applications of the computing device. | 12-02-2010 |
20100302142 | SYSTEM AND METHOD FOR TRACKING AND ASSESSING MOVEMENT SKILLS IN MULTIDIMENSIONAL SPACE - Accurate simulation of sport to quantify and train performance constructs by employing sensing electronics for determining, in essentially real time, the player's three dimensional positional changes in three or more degrees of freedom (three dimensions); and computer controlled sport specific cuing that evokes or prompts sport specific responses from the player that are measured to provide meaningful indicia of performance. The sport specific cuing is characterized as a virtual opponent that is responsive to, and interactive with, the player in real time. The virtual opponent continually delivers and/or responds to stimuli to create realistic movement challenges for the player. | 12-02-2010 |
20100309113 | MOBILE VIRTUAL DESKTOP - A device includes a display screen and a processor to render a display of a portion of a logical image on the screen. The logical image is larger than that which can be displayed on the display screen of the device. As the device is maneuvered through space, different portions of the logical image are displayed, and the user may interact with these portions as desired. | 12-09-2010 |
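Selecting which portion of the larger logical image to show reduces to clamping the device's spatial position to a viewport inside the logical image. A sketch under assumed names and pixel-space coordinates (the patent does not specify how spatial position maps to pixels):

```python
def visible_portion(device_pos, screen_size, logical_size):
    """Clamp the device's (x, y) position to a viewport inside the
    larger logical image, returning the (left, top) corner of the
    portion to render on the physical screen."""
    max_x = logical_size[0] - screen_size[0]
    max_y = logical_size[1] - screen_size[1]
    left = min(max(device_pos[0], 0), max_x)
    top = min(max(device_pos[1], 0), max_y)
    return left, top
```

Clamping keeps the viewport fully inside the logical image, so maneuvering the device past an edge simply pins the view at that edge.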
20100309114 | Data input device - A finger worn device is provided. The device includes individual ergonomic box elements, shaped to fit the individual fingers of the hand, and interactive surfaces on the sides of the box elements for tactile data input by the thumb. The unique locations of the interactive surfaces upon the box elements place these surfaces within the natural placement of the fingers and thumb. Thumb contacts made upon these surfaces are easy and require no repeated visual confirmation of finger and thumb placement. The box elements move with their respective fingers and follow natural finger articulation. The device is well suited to the operation of hand held devices and can be used as a remote control, cell phone, calculator or personal digital assistant. | 12-09-2010 |
20100309115 | DRAWING ASSIST DEVICE, DRAWING ASSIST PROGRAM, AND DRAWING ASSIST METHOD - Provided is a drawing assist device and the like which assists a drawing operation while improving its efficiency. According to the drawing assist device of the present invention, it is determined whether or not a first factor of an element, defined according to a positional trajectory of a pointer moved by an agent, is adequate in design for the subject represented by the element. The agent is notified if the determination result is negative. Accordingly, the agent can proceed with the drawing operation while confirming whether or not the subject being drawn is adequate in design. | 12-09-2010 |
20100309116 | CHARACTER INPUT DEVICE - The present invention relates to a character input device. The character input device includes a base. An input unit is provided on the base and independently performs first directional input in which the input unit moves from a reference location to one of first direction indication locations arranged radially around the reference location and spaced apart from one another, and second directional input in which one of second direction indication locations arranged radially on the input unit and spaced apart from one another is selected. A first detection unit detects movement of the input unit. A second detection unit detects second directional input. A control unit extracts a first character assigned to a first direction indication location at which movement of the input unit is detected, or a second character assigned to a second direction indication location at which second directional input is detected. | 12-09-2010 |
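The first-directional-input stage of the character input device above amounts to classifying the input unit's displacement from the reference location into one of several radial direction indication locations, then extracting the character assigned to that direction. A sketch follows; the eight-sector layout and the character assignment are illustrative assumptions.

```python
import math

def direction_index(dx, dy, sectors=8):
    """Classify a displacement (dx, dy) of the input unit from the
    reference location into one of `sectors` radially arranged
    direction indication locations (0 = +x axis, counter-clockwise)."""
    angle = math.atan2(dy, dx) % (2 * math.pi)
    width = 2 * math.pi / sectors
    # Offset by half a sector so each index is centered on its axis.
    return int((angle + width / 2) // width) % sectors

def extract_character(dx, dy, charmap):
    """Return the character assigned to the detected direction
    indication location (one character per sector)."""
    return charmap[direction_index(dx, dy, len(charmap))]
```

The same classifier could serve the second directional input by running it on the second detection unit's readings with its own character map.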
20100309117 | INCLINATION CALCULATION APPARATUS AND INCLINATION CALCULATION PROGRAM, AND GAME APPARATUS AND GAME PROGRAM - An inclination calculation apparatus sequentially calculates an inclination of an input device operable in terms of a posture thereof. The input device includes acceleration detection means and imaging means. The inclination calculation apparatus sequentially calculates first inclination information representing an inclination of the input device from positions of two imaging targets in a taken image obtained by the imaging means. The inclination calculation apparatus also sequentially calculates second inclination information representing an inclination of the input device from an acceleration detected by the acceleration detection means. The inclination calculation apparatus calculates an inclination of the input device using the first inclination information and the second inclination information. | 12-09-2010 |
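The two inclination estimates described above (one from the two imaging targets in the taken image, one from the detected acceleration) can be combined in many ways; the weighted blend below is only one plausible sketch. The sign conventions, the trust weighting by acceleration magnitude, and all function names are assumptions, not the patent's method.

```python
import math

def inclination_from_markers(p1, p2):
    """First inclination information: device roll inferred from the
    positions of the two imaging targets in the taken image."""
    return math.atan2(p2[1] - p1[1], p2[0] - p1[0])

def inclination_from_accel(ax, ay):
    """Second inclination information: device roll inferred from the
    detected acceleration (axis convention is an assumption)."""
    return math.atan2(ay, ax)

def combined_inclination(theta_img, theta_acc, accel_mag, g=9.8):
    """Blend the two estimates, trusting the accelerometer less as the
    measured magnitude departs from gravity (device being waved)."""
    w = max(0.0, 1.0 - abs(accel_mag - g) / g)
    return w * theta_acc + (1.0 - w) * theta_img
```

The rationale: the accelerometer measures pure gravity (and hence true inclination) only when the device is not otherwise accelerating, while the image-based estimate is immune to shaking but fails when the imaging targets leave the frame.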
20100315327 | Apparatus, methods and computer readable storage mediums for providing a user interface - Apparatus including a support configured to support a portable device; and a display coupled to the support and configured to receive and display a projected image, the projected image being generated by the portable device. | 12-16-2010 |
20100315328 | INTEGRATED CONTROL SYSTEM WITH MULTIPLE MEDIA SOURCES AND CORRESPONDING DISPLAYS - Mechanisms are provided for efficiently manipulating devices such as computer systems, cameras, recorders, sensors, etc., referred to herein as media sources. The media sources are connected to a control computer over a network. Output such as video and other data output from the media sources are provided on a display system having multiple displays where each display corresponds to a particular media source. Input devices such as keyboards, mice, and touchpads may be used to operate the control computer and control media sources based on cursor position. | 12-16-2010 |
20100315329 | WEARABLE WORKSPACE - The present disclosure relates to a wearable workspace system which may include an input device and a head worn display coupled to a wearable computer. The input device may be configured to detect user input data wherein the user input data is provided by a user hands-free. The computer may be configured to: store an electronic technical manual, receive the detected user input data and generate an output based on the recognized user input data. The head worn display may be configured to display at least a portion of the electronic technical manual to the user while allowing the user to simultaneously maintain a view of a work piece. The display may be further configured to receive the output from the computer and to adjust the at least a portion of the electronic technical manual displayed to the user based on the output. | 12-16-2010 |
20100315330 | CONTROL APPARATUS FOR COMPLEX INPUT INTERFACE - The present disclosure relates to a user input apparatus for integrated processing of user inputs via various types of switches. In one aspect, the apparatus includes: an input port connected to either an electrical switch or a mechanical switch to receive a user input signal via the electrical switch or the mechanical switch; a switch selection unit configured to store an electrical input unit activating information in case the input port is connected to the electrical switch, and to store a mechanical input unit activating information in case the input port is connected to the mechanical switch; an electrical input unit coupled to receive an electrical input signal inputted via the electrical switch, in case the electrical input unit activating information is stored in the switch selection unit; and a mechanical input unit coupled to receive a mechanical input signal inputted via the mechanical switch, in case the mechanical input unit activating information is stored in the switch selection unit. | 12-16-2010 |
20100315331 | DISPLAY DEVICE AND DISPLAY CONTROL METHOD THEREOF - The present disclosure provides a display device and a display control method thereof. The display control method is used in an interactive display system made up of several display devices and includes: establishing the interactive display system; performing interactive display in the interactive display system; and deleting the interactive display system when the interactive display is ended. | 12-16-2010 |
20100321286 | MOTION SENSITIVE INPUT CONTROL - A motion sensitive input control configured to prevent unintended input caused by inadvertent movement of a computing device. In one embodiment, unintended input can be prevented by disregarding an input event if a change in motion of the computing device is detected simultaneously with or immediately prior to the detected input event. In another embodiment, unintended input can be prevented by reducing the sensitivity of an input device during a motion-based state associated with the computing device. In this manner, the likelihood of inadvertent motion of a computing device causing an unintended input event can be reduced. | 12-23-2010 |
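The first embodiment above (disregard an input event if a change in motion is detected simultaneously with, or immediately prior to, the event) can be sketched as a small gate between the accelerometer and the input handler. The threshold and window values, and the class name, are assumptions for illustration.

```python
class MotionGatedInput:
    """Suppress input events that arrive during, or just after, a
    detected change in device motion (a sketch, not the patented
    implementation; threshold/window values are assumed)."""

    def __init__(self, threshold=2.0, window=0.2):
        self.threshold = threshold   # m/s^2 change treated as "device moved"
        self.window = window         # seconds after motion to keep suppressing
        self.last_motion_time = None

    def on_accel_change(self, delta, t):
        """Record the time of any sufficiently large motion change."""
        if abs(delta) >= self.threshold:
            self.last_motion_time = t

    def accept_input(self, t):
        """Return False (disregard the event) if motion was detected
        within `window` seconds before this input event."""
        if self.last_motion_time is not None and t - self.last_motion_time <= self.window:
            return False
        return True
```

The second embodiment in the abstract (lowering input-device sensitivity during a motion-based state) could reuse the same gate, raising a touch threshold instead of rejecting events outright.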
20100321287 | INTERFACE UNIT FOR GAME MACHINE AND GAME MACHINE - An interface unit | 12-23-2010 |
20100321288 | POSITION INPUT DEVICE AND COMPUTER SYSTEM - A position input device is provided in which signals are transmitted from a position indicator, and signals transmitted from the position indicator are received by a position detector device. According to certain embodiments, an electrical double-layer capacitor, a charging circuit which charges the electrical double-layer capacitor, and a power transmission unit which relays and supplies to the charging circuit power supplied from a power supply unit external to the position indicator, are provided in the position indicator. In other embodiments the position input device has a built-in power supply unit, transmitting units, and a control unit for switching the transmitting units between energized and de-energized states. Also provided are position input systems and computer systems including the position input device, and methods of operating the position input device and the systems. | 12-23-2010 |
20100321289 | MOBILE DEVICE HAVING PROXIMITY SENSOR AND GESTURE BASED USER INTERFACE METHOD THEREOF - A mobile device has a proximity sensor and a user interface based on a user's gesture detected using the proximity sensor. The gesture-based user interface method includes enabling proximity sensing through the proximity sensor, detecting a specific gesture through the proximity sensing, analyzing a pattern of the specific gesture, and executing a particular function assigned to the pattern. | 12-23-2010 |
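The pipeline in the abstract above (sense proximity, analyze the gesture's pattern, execute the function assigned to that pattern) can be sketched with a coarse near/far classifier and a pattern table. The distance threshold, the pattern vocabulary, and the assigned functions are all illustrative assumptions.

```python
def classify_pattern(samples, near=10.0):
    """Reduce a series of proximity-sensor distance readings to a
    coarse pattern string by recording near/far transitions."""
    states = []
    for d in samples:
        s = "near" if d <= near else "far"
        if not states or states[-1] != s:
            states.append(s)
    return "-".join(states)

# Pattern -> assigned function (illustrative examples, not the patent's).
GESTURE_ACTIONS = {
    "far-near-far": "answer_call",
    "far-near-far-near-far": "mute",
}

def execute_gesture(samples):
    """Analyze the detected gesture and return the assigned function."""
    return GESTURE_ACTIONS.get(classify_pattern(samples), "none")
```

Collapsing consecutive identical states makes the pattern robust to sampling rate: a hand hovering for 5 samples or 50 produces the same "near" segment.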
20100328200 | Device and related method for converting display screen into touch panel screen - A device for converting display screen into touch panel screen includes a projection screen; an image projection unit for projecting an image on the projection screen; a light pointer for emitting a light spot signal of a specific wavelength on the projection screen; an image detecting unit, which includes a fisheye lens for receiving the image on the projection screen to generate a fisheye distorted image; and an optical filter for filtering out the optical energy except the light spot signal of the specific wavelength; an image processing unit, coupled to the image detecting unit, for calculating a first position of the light spot on the projection screen according to the fisheye distorted image; and a message transmitting unit, coupled to the image processing unit, for outputting a touch panel signal according to the calculation result of the image processing unit. | 12-30-2010 |
20100328201 | Gesture Based User Interface Supporting Preexisting Symbols - A motion controlled handheld device includes a display having a viewable surface and operable to generate an image and a gesture database maintaining a plurality of gestures. Each gesture is defined by a motion of the device with respect to a first position of the device. The gestures comprise symbol gestures each corresponding to a character from a preexisting character set. The device includes an application database maintaining at least one application and a gesture mapping database comprising a gesture input map for the application. The gesture input map comprises mappings of the symbol gestures to corresponding inputs for the application. The device includes a motion detection module operable to detect motion of the handheld device within three dimensions and to identify components of the motion in relation to the viewable surface. The device also includes a control module operable to load the application, to track movement of the handheld device using the motion detection module, to compare the tracked movement against the symbol gestures to identify a matching symbol gesture, to identify, using the gesture input map, the corresponding input mapped to the matching symbol gesture, and to provide the corresponding input to the application. | 12-30-2010 |
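The matching step described above (compare tracked movement against stored symbol gestures, then look up the application input mapped to the match) can be sketched as a nearest-template search. The delta representation, the error metric, and the tolerance are assumptions; a real gesture database would also normalize for scale and speed.

```python
def match_gesture(track, gesture_db, tol=0.25):
    """Compare a tracked motion (a sequence of (x, y) deltas) against
    stored symbol-gesture templates; return the character of the best
    match within tolerance, else None."""
    best, best_err = None, tol
    for char, template in gesture_db.items():
        if len(template) != len(track):
            continue   # a real matcher would resample instead
        err = sum(abs(tx - gx) + abs(ty - gy)
                  for (tx, ty), (gx, gy) in zip(track, template)) / len(track)
        if err < best_err:
            best, best_err = char, err
    return best

def gesture_to_input(track, gesture_db, input_map):
    """Identify the symbol gesture, then use the gesture input map to
    find the corresponding input for the loaded application."""
    char = match_gesture(track, gesture_db)
    return input_map.get(char) if char else None
```

Keeping the gesture database and the per-application input map separate, as the abstract does, lets one symbol (say, "O") trigger different inputs in different applications.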
20100328202 | INFORMATION DISPLAY DEVICE, INFORMATION DISPLAY METHOD, AND PROGRAM - An information display device, an information display method, and a program display, when displaying definition information of an instance method selected by a user, definition information that corresponds to the contents of processing of the instance method during runtime. The information display device displays definition information of an instance method selected by a user, wherein the instance method is included in a type that can be referenced during runtime by a variable that references a receiver object of the instance method. | 12-30-2010 |
20110001694 | OPERATION CONTROL APPARATUS, OPERATION CONTROL METHOD, AND COMPUTER PROGRAM - An operation control apparatus is provided which includes a detection unit for detecting contact of an operation tool with a display surface of a display unit, a contact determination unit for determining a contact state of the operation tool with the display surface based on the detection result by the detection unit, a contact area recognition unit for recognizing, in the case where it is determined by the contact determination unit that the operation tool is in contact with the display surface, a contact area where the operation tool is in contact with the display surface, and an operation determination unit for determining, from a plurality of operation processing associated with an act of the operation tool in contact with the display surface, an operation processing to be executed, based on a size of the contact area recognized by the contact area recognition unit. | 01-06-2011 |
20110001695 | WEARABLE TERMINAL DEVICE AND METHOD OF CONTROLLING THE WEARABLE TERMINAL DEVICE - A wearable terminal device includes: a head mounted display including a monitor display unit; a line-of-sight detecting unit for detecting a line-of-sight position of a wearer; a sound collecting unit configured to collect sound uttered by the wearer; a sound recognizing unit configured to recognize a sound command from the wearer on the basis of the collected sound; an operation unit configured to receive operation corresponding to the detected line-of-sight position or operation instructed by the recognized sound command; a setting unit configured to set an operation mode corresponding to work of the wearer out of plural operation modes; and a control unit configured to control, in the set operation mode, display of the monitor display unit corresponding to operation by the wearer. | 01-06-2011 |
20110001696 | MANIPULATING OBJECTS DISPLAYED ON A DISPLAY SCREEN - Embodiments of the present invention are directed toward determining a location where a pointing device is directed. In one embodiment, the method includes receiving a message at the computing device from the pointing device. Sensor data is extracted from the message, the sensor data comprising accelerometer data, gyroscope data, or a combination thereof. A position of the pointing device in three-dimensional space is identified. An orientation of the pointing device in three-dimensional space is identified using the sensor data. A location to which the pointing device is directed is determined by utilizing the identified position of the pointing device and the identified orientation of the pointing device, and an object on a display screen at the location where the pointing device is directed is altered. | 01-06-2011 |
20110006977 | SYSTEM AND METHOD FOR CONVERTING GESTURES INTO DIGITAL GRAFFITI - The subject disclosure provides a device, computer readable storage medium, and method for converting gestures undergone by a device into digital graffiti. The disclosure includes ascertaining an orientation of the device and a path traversed by the device. Gestures undergone by the device are identified as a function of the orientation and the path. Digital graffiti corresponding to the gestures are then superimposed onto a digital canvas. | 01-13-2011 |
20110006978 | IMAGE MANIPULATION BASED ON TRACKED EYE MOVEMENT - The disclosure relates to controlling and manipulating an image of an object on a display device based on tracked eye movements of an observer. When the object is displayed according to an initial view, the observer's eye movement is tracked and processed in order to determine the focus of the observer's attention or gaze on the image. Thereafter, the displayed image is modified to provide a better view of the part of the object in which the observer is most interested. This is accomplished by modifying at least one of the spatial positioning of the object within the viewing area, the angle of view of the object, and the viewing direction of the object. | 01-13-2011 |
20110006979 | SYSTEM FOR CONTROLLING BRIGHTNESS FLICKER OF PARALLAX BARRIER LCD THAT HAS WIDE VIEWING ANGLE AND METHOD THEREOF - A system for controlling brightness flicker of a parallax barrier LCD having a wide viewing angle, capable of minimizing brightness flicker by adjusting a permittivity curve over time into a predetermined waveform when split barriers are switched on/off by movement of a viewer's viewing angle, and a method thereof. A method of controlling brightness flicker of a parallax barrier LCD having a wide viewing angle controls brightness of a display providing a stereoscopic image by acquiring a real-time image of a viewer, recognizing the image of the viewer and extracting locations and coordinates of the viewer's eyes, and controlling turn-on/off of split barrier electrodes. | 01-13-2011 |
20110006980 | DATA INPUT DEVICE, DATA INPUT METHOD, DATA INPUT PROGRAM, AND RECORDING MEDIUM CONTAINING THE PROGRAM - A data input device that enables the user to execute a scrolling operation rapidly and infallibly without enduring any operational burden is provided. | 01-13-2011 |
20110012827 | Motion Mapping System - A motion mapping system includes a motion sensing device and a receiving device. The motion sensing device may include an accelerometer, a rotational sensor, a microcontroller, and an RF transmitter. The microcontroller may output processed motion data to the receiving device. The receiving device may include an RF receiver, a microprocessor, and a Universal Serial Bus interface for connection to a computer. The receiving device's microprocessor may output the processed motion data to motion mapping software. The motion mapping software may map the motion data to a corresponding predetermined input event defined by the motion mapping software and transmit a control signal back to the receiving device's microprocessor indicating the corresponding predetermined input event. Upon reception of the control signal from the mapping software, the receiving device's microprocessor may generate a hardware input event according to the control signal and transmit the generated hardware input event back to the computer. | 01-20-2011 |
20110012828 | PROTRUSION PATTERN FORMING DEVICE WITH DISPLAY FUNCTION - A protrusion pattern forming device with a display function includes a transparent elastic sheet having an internal layer including colored liquid, and an actuator including a plurality of actuator elements disposed along one surface of the elastic sheet, each of the actuator elements changing its own shape in response to an application of a voltage, thereby allowing the surface of the elastic sheet to protrude. A protrusion pattern is formed on the elastic sheet, and a dot pattern corresponding to the protrusion pattern is displayed on the elastic sheet by selectively driving the plurality of actuator elements. | 01-20-2011 |
20110018793 | Mobile Device Customizer - A method and system for customizing a mobile host device is disclosed. An accessory device for interfacing with and customizing a mobile host device includes a communication channel designed to establish a bi-directional communication link between the accessory device and the host device. The accessory device also includes a processor communicatively coupled to the communication channel. The processor is designed to execute a plurality of applications. In addition, the accessory device includes an input assembly communicatively coupled to the processor. The input assembly is designed to minimize a total number of input elements included in the input assembly. Further, at least a first input element is selectively mapped to one or more input functions of the host device based on a user selection. | 01-27-2011 |
20110018794 | METHOD AND APPARATUS FOR CONTROLLING MOBILE AND CONSUMER ELECTRONIC DEVICES - Various methods for controlling a device are disclosed, including dynamically selecting a set of mappings defining how a gesture made by a movement of at least one wearable item will be interpreted as one or more commands; determining whether the gesture has a mapping in the set of mappings; and translating the gesture into a command for the device based on the determination. Interpreting movements of a wearable item as gestures associated with a command to control a controlled device is also disclosed, which includes sensing a movement of the wearable item in a first context as being indicative of a gesture relating to the command based on the first context. A method for communicating control information by a wearable device is further disclosed, including determining an agreed upon set of control gestures between first and second devices, wherein the control gestures are performable using the first device and are supportable by the second device; and participating in a control sequence to control the second device via a wireless transmission corresponding to at least one of the control gestures to be performed using the first device. | 01-27-2011 |
20110018795 | METHOD AND APPARATUS FOR CONTROLLING ELECTRONIC DEVICE USING USER INTERACTION - A method and an apparatus for controlling an electronic device according to a user interaction occurring in a space neighboring the electronic device. The method for controlling an electronic device using an input interaction includes: recognizing at least one interaction occurring in a space neighboring the electronic device; and controlling the electronic device corresponding to the at least one interaction. | 01-27-2011 |
20110018796 | ELECTRONIC DEVICE - In a car stereo system, when an object such as a hand comes close to a sensing range of a proximity sensor provided near an illumination button while a display section and lights of an operation section that are provided on a front face of the car stereo system are in a state of complete non-lighting, only the light of the illumination button is turned on; if the illumination button is operated within a predetermined period of time, then the display section and the lights of the operation section are turned on; and if the illumination button is not operated within the predetermined period of time, then the light of the illumination button is turned off so that the state of complete non-lighting is entered again. | 01-27-2011 |
20110025596 | STORAGE MEDIUM HAVING GAME PROGRAM STORED THEREIN, GAME APPARATUS, AND TILT ANGLE CORRECTION METHOD - An information processing device performs a game process based on a tilt angle of an input device that can be rotated to any tilt about a predetermined axis. First, a game apparatus calculates a tilt angle representing the tilt of the input device. Then, the game apparatus determines whether the calculated tilt angle has transitioned across the boundary between the upper limit value and the lower limit value of the tilt angle. If the tilt angle has transitioned across the boundary, the tilt angle to be used in a predetermined information process is corrected to a predetermined value that is on one side of the boundary on which the tilt angle was before crossing the boundary. | 02-03-2011 |
20110025597 | Digital display device and image arrangement method using the same - A digital display device that implements a digital photo frame function. The digital display device includes a user input unit to receive control signals corresponding to a user operation, the control signals to control a plurality of images to be continuously displayed, a display controller including an image arrangement unit, the display controller to output display signals by storing image files or processing the stored image files according to the control signals, a storage unit having image files stored therein and a display unit to display the plurality of images on a screen according to the display signals, the image arrangement unit to arrange the plurality of images in an arrangement order for display based on comparison results that compare data for pairs of images selected from the plurality of images upon the control signals being input. | 02-03-2011 |
20110025598 | Spatial, Multi-Modal Control Device For Use With Spatial Operating System - A system comprising an input device includes a detector coupled to a processor. The detector detects an orientation of the input device. The input device has multiple modal orientations corresponding to the orientation. The modal orientations correspond to multiple input modes of a gestural control system. The detector is coupled to the gestural control system and automatically controls selection of an input mode in response to the orientation. | 02-03-2011 |
20110025599 | DISPLAY APPARATUS AND METHOD FOR CONTROLLING DISPLAY APPARATUS - A display apparatus and a method for controlling the display apparatus are provided. If a display apparatus is in a power off state and an interface is connected to a cable, an output of a cable connection sensing signal is blocked. Accordingly, the display apparatus may be able to report its power off state to an external device. | 02-03-2011 |
20110025600 | METHOD OF INDICATING ADDITIONAL CHARACTER COMBINATION CHOICES ON A HANDHELD ELECTRONIC DEVICE AND ASSOCIATED APPARATUS - A method and associated apparatus for indicating additional character combination choices from a disambiguation function on a handheld electronic device. | 02-03-2011 |
20110025601 | Virtual Controller For Visual Displays - Virtual controllers for visual displays are described. In one implementation, a camera captures an image of hands against a background. The image is segmented into hand areas and background areas. Various hand and finger gestures isolate parts of the background into independent areas, which are then assigned control parameters for manipulating the visual display. Multiple control parameters can be associated with attributes of multiple independent areas formed by two hands, for advanced control including simultaneous functions of clicking, selecting, executing, horizontal movement, vertical movement, scrolling, dragging, rotational movement, zooming, maximizing, minimizing, executing file functions, and executing menu choices. | 02-03-2011 |
20110025602 | METHOD AND DEVICE FOR TACTILE PRESENTATION - A method for a tactile presentation of perceivable content. The method comprises receiving data representing perceivable content, selecting a plurality of electric currents according to the data, each electric current being associated with at least one of a plurality of regions of a solution having a plurality of macromolecules, and changing a level of acidity (pH) of at least one of the plurality of regions by applying a respective electric current thereto. A proton concentration in the plurality of regions tactilely presents the perceivable content. | 02-03-2011 |
20110032181 | DISPLAY DEVICE AND CONTROL METHOD UTILIZING THE SAME - A display device controlled by an indicator device having a first light-spot is disclosed. The display device includes a first camera, a panel, and a processing unit. The first camera detects the first light-spot for generating a first detection signal. The panel displays a cursor. The processing unit controls the cursor according to the first detection signal. When the distance between the first camera and the first light-spot is a first length and the moving distance of the first light-spot is a first distance, the moving distance of the cursor is a second distance. When the distance between the first camera and the first light-spot is a second length and the moving distance of the first light-spot is the first distance, the moving distance of the cursor is the second distance. | 02-10-2011 |
20110032182 | PORTABLE TERMINAL HAVING PLURAL INPUT DEVICES AND METHOD FOR PROVIDING INTERACTION THEREOF - A portable terminal having plural input devices and a method for providing interaction thereof are provided. A method for providing interaction of a portable terminal having plural directional input devices includes: receiving an interaction signal input from at least one of the plural directional input devices; checking drive modes of the portable terminal; and executing a preset function by the drive modes corresponding to the interaction signal. The method may easily execute various functions through intuitive interaction using plural input devices, thereby improving convenience for a user. | 02-10-2011 |
20110032183 | METHOD, SYSTEM, AND STORAGE MEDIUM FOR A COMIC BOOK READER PLATFORM - The invention pertains to a method, system, and storage medium for a comic book reader platform that retains the unique and highly sought after look and flow of traditional printed comic books in a mobile hand-held device. The invention is capable of displaying the comic book in a full page or cell-by-cell configuration depending on the orientation and/or desire of the particular user and leverages advanced features provided by today's mobile hand-held devices such as motion sensitivity, orientation changing, touch screens, etc. The invention also provides the ability to store multiple comic books and allows the user to switch between them with ease. | 02-10-2011 |
20110032184 | ORTHOPEDIC METHOD AND SYSTEM FOR MAPPING AN ANATOMICAL PIVOT POINT - A system and method of touchless interaction is provided for resolving a pivot point of an object where direct placement of a sensor at the pivot point is not practical. It applies to situations where the pivot point of a rigid object is inaccessible but remains stationary, while the other end is free to move and is accessible. The system maps the object's pivot point by way of an external sensor that detects constrained motion of the rigid object within a hemispherical banded boundary. It can also detect a geometric pattern and acceleration during the constrained motion to compensate for higher order rotations about the pivot point. Other embodiments are disclosed. | 02-10-2011 |
20110037690 | ELECTRONIC ALBUM - An electronic album for presenting media content comprises a cover member and a frame member. The cover member and the frame member are hingedly coupled to each other to facilitate opening and closing of the electronic album. The electronic album further comprises a data input port, a user interface unit, a processor, and a display screen. The data input port, the user interface unit and the display screen are integrated in the frame member and are electrically connected to the processor. The data input port is capable of receiving data associated with the media content. Further, the user interface unit is capable of receiving at least one user input thereon and the processor is configured to process the data based on the at least one user input. The display screen is configured to present the media content based on the processed data. | 02-17-2011 |
20110037691 | THREE-DIMENSIONAL OBJECT DISPLAY CONTROL SYSTEM AND METHOD THEREOF - Provided is a three-dimensional object display control system that enables a three-dimensional object displayed on a screen to be freely rotated by means of instinctive operation. A computer system ( | 02-17-2011 |
20110037692 | APPARATUS FOR DISPLAYING AN IMAGE AND SENSING AN OBJECT IMAGE, METHOD FOR CONTROLLING THE SAME, PROGRAM FOR CONTROLLING THE SAME, AND COMPUTER-READABLE STORAGE MEDIUM STORING THE PROGRAM - The present invention includes: a display/optical sensor section ( | 02-17-2011 |
20110043442 | Method and System for Displaying Images on Moveable Display Devices - A display system includes a display device having multiple possible poses, including a neutral pose. A physical constraint maintains the display device in the neutral pose absent an application of an external force. A sensor measures a magnitude and direction of a displacement of the display device to a displaced pose due to the application of the external force. A rendering engine then renders an image on the display device according to the magnitude and direction of the displacement even while the display device remains in the displaced pose. | 02-24-2011 |
20110043443 | Systems and methods for utilizing personalized motion control in virtual environment - Techniques for controlling motions using motion recognizers generated in advance by users are described. According to one embodiment, the motion recognizers created by end users are utilized to control virtual objects displayed in a virtual environment. By manipulating one or more motion sensitive devices, end users can command what the objects do in the virtual environment. Motion signals from each of the motion sensitive devices are recognized in accordance with the motion recognizers created in advance by the users. One or more of the motion signals are at the same time utilized to tune the motion recognizers or create additional motion recognizers. As a result, the motion recognizers are constantly updated to be more accommodating to the user(s). | 02-24-2011 |
20110043444 | PORTABLE ELECTRONIC DEVICE - An exemplary portable electronic device includes a main body, a cover movably connected with the main body, a piezoelectric sensing unit, and a signal processing unit. The piezoelectric sensing unit is partially positioned on the main body and the cover. The piezoelectric sensing unit generates elastic deformation due to the relative displacement of the cover and the main body to generate a corresponding command signal. The signal processing unit is electrically connected to the piezoelectric sensing unit, and capable of actuating different modes of the portable electronic device according to the command signal. | 02-24-2011 |
20110043445 | Handheld electronic device and method of controlling the handheld electronic device according to state thereof in a three-dimensional space - A method of controlling a handheld electronic device according to state thereof in a three-dimensional space, which is applicable to the handheld electronic device including a CPU, a displacement sensor, and a quadrant section lookup table. The displacement sensor detects variation of the state of the handheld electronic device in a three-dimensional space and generates a current state signal accordingly. The lookup table lists a plurality of quadrant sections of the three-dimensional space, wherein each quadrant section corresponds to a function program or control command. The CPU receives the current state signal and calculates a current space signal accordingly. After determining the quadrant section in the lookup table that corresponds to the current space signal, the CPU activates the function program or control command that corresponds to the quadrant section. Thus, a user can execute the desired function programs or control commands conveniently by moving the handheld electronic device single-handedly. | 02-24-2011 |
20110043446 | COMPUTER INPUT DEVICE - Computer input apparatus comprising: an image capture device; and a marker member comprising at least two reference indicia, at least a first reference indicium being arranged to emit or reflect light having a first spectral characteristic, and at least a second reference indicium being arranged to emit or reflect light having a second spectral characteristic different from the first spectral characteristic, the image capture device being arranged to distinguish light of said first spectral characteristic from light of said second spectral characteristic thereby to distinguish the at least a first reference indicium from the at least a second reference indicium, the apparatus being configured to capture an image of the at least two reference indicia and to determine by means of said image a position and orientation of the marker member with respect to a reference frame. | 02-24-2011 |
20110050562 | VISUALIZATION CONTROLS - Implementations of visualization controls are described. Some techniques described herein enable a user to interact with a geoscience object on display. In one possible embodiment, simultaneous movements of two or more user-controlled points are tracked using a motion monitoring system, such as cameras. The movement of the points is then interpreted as an interaction with the geoscience object and results in a corresponding alteration of the display of the geoscience object. One example interaction includes a compound manipulation (i.e. translation, rotation, scaling and/or skewing) of the geoscience object. | 03-03-2011 |
20110050563 | METHOD AND SYSTEM FOR A MOTION COMPENSATED INPUT DEVICE - A method and system for a motion compensated input device are provided. The motion compensated input device includes an input device configured to receive a physical input from a user and convert the physical input into a physical input signal representative of the physical input, a motion sensing device configured to sense acceleration forces of at least one of the input device and the user, the acceleration forces introducing an error into the physical input, and an input compensator configured to adjust the physical input signal using the acceleration forces to generate a compensated input signal representative of the physical input. | 03-03-2011 |
20110050564 | Dynamic Picture Frame in Electronic Handset - A portable electronic device configured to operate in an image presentation mode that presents a sequence of images on a display component. A controller is configured to determine a context of the portable electronic device and to vary a presentation time period of images in a subset of images in the sequence relative to the presentation time period of images not in the subset, wherein the images in the subset are associated with the context of the portable electronic device. | 03-03-2011 |
20110050565 | COMPUTER SYSTEM AND CONTROL METHOD THEREOF - A computer system and a method for controlling the same that enable the device to use various functions through a simple and convenient input method, even while the case of the electronic device is closed. The computer system comprises at least one device unit; a housing in which the device unit(s) is provided; a motion sensing unit configured to sense a user motion taken at an outer side of the housing; and a controller configured to control the device unit(s) to perform an operation corresponding to the user motion sensed by the motion sensing unit. | 03-03-2011 |
20110057872 | PORTABLE ELECTRONIC DEVICE - A portable electronic device has a display part ( | 03-10-2011 |
20110057873 | FLEXIBLE ELECTRONIC DEVICE AND METHOD FOR THE CONTROL THEREOF - The present invention relates to a flexible electronic device ( | 03-10-2011 |
20110057874 | Methods and Systems for Lingual Movement to Manipulate an Object - An intra-oral system is disclosed for assisting an individual in developing intra-oral muscle control and strength, and for facilitating typing of alphanumeric characters on a virtual keyboard. The system may also be used to enable an individual having limited use of the upper extremities to control an electrical apparatus such as a wheelchair, a bed or a light fixture. The intra-oral system includes a mouthpiece having a plurality of cells embedded therein. The cells are configured to receive pressure applied by the tongue of an individual. Movement of the tongue over and against the cells causes an object to be moved over a display. In one embodiment, the object is moved through an obstacle course or over a simulated track as part of a therapeutic regimen. In another embodiment, the object is moved over alphanumeric characters on a digital keyboard, and selected characters are typed by operation of the mouthpiece. In this manner, textual matter may be produced and stored by the user, and then sent via electronic means using a wired or wireless communication network. In yet another embodiment, a character or icon on the display is selected and activated to manipulate an electrical apparatus. A method for moving an electrical apparatus using a mouthpiece controlled through lingual movement is also provided. | 03-10-2011 |
20110057875 | Display control apparatus, display control method, and display control program - A display control apparatus includes a recognizing unit configured to recognize a position of an operator and a position of a hand or the like of the operator, a calculating unit configured to regard a position of the operator in a screen coordinate system set on a screen as an origin of an operator coordinate system and multiply a position of the hand or the like with respect to the origin of the operator coordinate system by a predetermined function, thereby calculating a position of display information corresponding to the hand or the like in the screen coordinate system, and a control unit configured to cause the display information to be displayed at the position in the screen coordinate system calculated by the calculating unit. | 03-10-2011 |
20110057876 | POINTING DEVICE - A pointing device includes a first ground potential electrode; a second electrode for applying a voltage; a third electrode for measuring an electrical potential; a printed circuit board on which the first through the third electrodes are provided; a location pointing driving body that is provided on the printed circuit board and that is configured with a conductive part and that contacts the first and second electrodes, and a spherical part; a slide member that is located to cover a top part of the location pointing driving body and that is configured to drive the location pointing driving body by being slidable within a plane parallel to the printed circuit board; and a pressing force restriction member that is configured to restrict pressing force from the spherical part to the printed circuit board by receiving force from the slide member in the pressing direction. | 03-10-2011 |
20110063205 | DISPLAY MAGNIFIER - Disclosed is a display element that includes a display screen and a transparent cover overlying the display screen. The output from the display screen is magnified to occupy at least some of the transparent cover. The display element includes an arrangement of diffractive and refractive elements disposed between the display screen and the transparent cover that provide significant magnification without adding significant thickness to the display element. In some embodiments, the diffractive and refractive elements are embodied in a number of optical microelements, in some cases at least one optical microelement per pixel of the display screen. Some embodiments include microlenses that collimate the light emitted by the display screen. The display element can be “anamorphotic,” that is, the magnification in one direction differs from that in another. Some embodiments provide at least two levels of magnification. Liquid matching switches can be used to control the level of magnification. | 03-17-2011 |
20110063206 | SYSTEM AND METHOD FOR GENERATING SCREEN POINTING INFORMATION IN A TELEVISION CONTROL DEVICE - A system and method, in a television control device, for generating screen pointing information, substantially as shown in and/or described in connection with at least one of the figures, as set forth more completely in the claims. | 03-17-2011 |
20110063207 | DISPLAY APPARATUS AND CONTROL METHOD THEREOF AND PROJECTION APPARATUS AND CONTROL METHOD THEREOF - A display apparatus including at least one sensor, a processing unit and a control unit is provided. The sensor senses a human body. The processing unit is coupled to the sensor, and captures and processes signals from the sensor to obtain a sensing data. The control unit is coupled to the processing unit, and analyzes and determines whether the human body is to enter into or to be distant from a sensing range of the sensor according to the sensing data. When the control unit determines that the human body is to be distant from the sensing range for a first predetermined time, the control unit turns off the display apparatus or makes the display apparatus get into a power-saving/sleeping mode. When the control unit determines that the human body is to enter into the sensing range for a second predetermined time, the control unit turns on the display apparatus. | 03-17-2011 |
20110063208 | METHOD AND SYSTEM FOR CONVEYING AN EMOTION - The present invention relates to a method for conveying an emotion to a person being exposed to multimedia information, such as a media clip, by way of tactile stimulation using a plurality of actuators arranged in close vicinity of the person's body. The method comprises the step of providing tactile stimulation information for controlling the plurality of actuators, wherein the plurality of actuators are adapted to stimulate multiple body sites in a body region, the tactile stimulation information comprises a sequence of tactile stimulation patterns, each tactile stimulation pattern controls the plurality of actuators in time and space to enable the tactile stimulation of the body region, and the tactile stimulation information is synchronized with the media clip. An advantage of the present invention is thus that emotions can be induced, or strengthened, at the right time (e.g. synchronized with a specific situation in the media clip). | 03-17-2011 |
20110069002 | OPTO-ELECTRONIC SYSTEM FOR CONTROLLING PRESENTATION PROGRAMS - An input device for controlling a presentation program that is being run on a remote computing device is provided. The input device includes a first optical sensor configured to be activated by exposure to a focused beam of light and a second optical sensor configured to be activated by exposure to a focused beam of light. An RF communication device wirelessly delivers the instructional signals to the remote computing device to advance or reverse the presentation program. | 03-24-2011 |
20110069003 | STORAGE MEDIUM STORING INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD - A game apparatus includes a CPU, and the CPU judges a motion of a player on the basis of a cycle of a load value input from a load controller, and selectively displays an animation of a player object according to the motion. In a case that the motion by the player is a large-flapping motion, the CPU changes an updating velocity of an animation frame according to the arm-raising motion and arm-lowering motion of the large-flapping motion. On the other hand, in a case that the motion by the player is a small-flapping motion, the CPU changes the updating velocity of the animation frame according to only the arm-lowering motion of the small-flapping motion. Thus, the motion of the player and the animation are synchronized. | 03-24-2011 |
20110069004 | REMOTE CONTROLLER SUPPORTING SYSTEM AND METHOD FOR PROVIDING WEB SERVICE OPTIMIZED FOR REMOTE CONTROLLER - Provided is a remote controller supporting system and method that may provide a web service optimized for a remote controller of a user by displaying a webpage with different functions according to functions of each remote controller. The remote controller supporting system may include: a webpage storage unit to store a different webpage for each remote controller according to functions supportable by each remote controller; and a function controller to transmit remote controller group information containing remote controllers and webpage information corresponding to each of the remote controllers, and to transmit a webpage corresponding to a remote controller used in a system receiving the remote controller group information according to a request of the system. | 03-24-2011 |
20110069005 | DISPLAY APPARATUS - A display apparatus is provided. A frame is fixed and supported at the rear of a front panel forming the front portion of the display apparatus, and a separate bracket member is not mounted on the edges of the front panel. Thus, the front exterior of the display apparatus is neatly finished, and the display screen looks bigger than it actually is. | 03-24-2011 |
20110074665 | INFORMATION PROCESSING PROGRAM HAVING COMPUTER-READABLE STORAGE MEDIUM THEREIN AND INFORMATION PROCESSING APPARATUS - A game apparatus calculates a first evaluation value based on the difference between the time when the load value detected by a load controller becomes the maximum and the time when the velocity of the center of gravity, which represents the velocity of movement of the position of the center of gravity, becomes the maximum. The game apparatus calculates a second evaluation value based on the velocity of load, which represents the degree of increase in the load in a predetermined time period, and the velocity of the center of gravity. The game apparatus calculates a third evaluation value based on the path of the position of the center of gravity. The game apparatus calculates the amount of slice based on the first through third evaluation values. | 03-31-2011 |
20110074666 | REMOTE CONTROL SYSTEM FOR MULTI-SCREEN DISPLAY - A remote control system for a multi-screen display of an array of sub-displays tiled together contiguously includes a remote controller and a number of control boxes each of which is installed on a corresponding sub-display. The remote controller is configured for receiving user input governing selection and control of the sub-displays, generating a control signal based upon the selection and the control of the sub-displays, and transmitting the control signal. Each control box is configured for receiving the control signal and identifying whether the control signal is to be passed and transferred to a controller of the corresponding sub-display according to the selection of the sub-displays. | 03-31-2011 |
20110074667 | SPECIFIC USER FIELD ENTRY - There is disclosed a method for controlling a user input in a computer system including a display, the computer system being adapted to receive inputs from a plurality of user input devices under the control of a system input device, the method comprising: a. detecting selection, by the system input device, of a modifiable displayed item; b. receiving modification data from at least one of the plurality of user input devices; and c. modifying the modifiable displayed item with the received modification data. | 03-31-2011 |
20110074668 | CONTROL DEVICE - The present system relates to a device for imparting control to an application program. The device comprises a first sensor for carrying out a brainwave measurement when said first sensor is in contact with a user's head. The device further comprises a second sensor adapted to generate an output signal obtained from a measurement by the second sensor, which can for instance be a gyro sensor or a camera. The device is operable to use the output signal for imparting control to the application program in case the brainwave measurement falls outside a given interval of brainwave measurement values. The present system thus discloses a device that has increased accuracy compared to prior art devices. | 03-31-2011 |
20110074669 | Illuminating Controller having an Inertial Sensor for Communicating with a Gaming System - A controller for use in interfacing with a computer game. The controller includes a handle having at least one button, and a spherically shaped object connected to only one end of the handle. The spherically shaped object is defined from a translucent plastic material. Further included as part of the controller is an inertial sensor or accelerometer. The accelerometer may, as defined herein, be part of an inertial sensor. The controller also has an illuminating means defined within the spherically shaped object. A circuit is provided for interpreting input data from the at least one button and the inertial sensor, and is configured for communicating data wirelessly. The circuit is further configured to interface with the illuminating means to trigger illumination of the spherically shaped object to switch from an un-illuminated color to an illuminated color. | 03-31-2011 |
20110074670 | Providing Input and Output for a Mobile Device - Input and output for a mobile device may be provided. At a mobile device, input may be received from at least one of a plurality of remote input devices. The plurality of remote input devices may be remote from the mobile device. The mobile device may have at least one local input device. The at least one of the plurality of remote input devices may have a greater form factor than the local input device. Next, the received input may be processed to produce output. The mobile device may transmit the output to at least one of a plurality of remote output devices. The plurality of remote output devices may be remote from the mobile device. The mobile device may have at least one local output device. The at least one of the plurality of remote output devices may have a greater form factor than the local output device. | 03-31-2011 |
20110074671 | IMAGE DISPLAY APPARATUS AND CONTROL METHOD THEREOF, AND COMPUTER PROGRAM - This invention provides a display apparatus which mounts a tilt sensor, and can control whether or not the user makes an image feed operation based on the tilt. An image display apparatus includes a display unit which displays image data recorded in a recording medium, an instruction accepting unit which accepts an instruction to make the image feed operation according to the tilt of the image display apparatus from the user, a tilt detection unit which detects the tilt of the image display apparatus with respect to a predetermined direction, and a display control unit which controls the display unit to display and switch the image data in accordance with a change in tilt detected by the tilt detection unit, when the instruction accepting unit accepts the instruction and the tilt detection unit detects the change in tilt. | 03-31-2011 |
20110074672 | USER INTERFACE DEVICE AND METHOD FOR CONTROLLING A CONNECTED CONSUMER LOAD, AND LIGHT SYSTEM USING SUCH USER INTERFACE DEVICE - The invention relates to a user interface device for controlling an electrical consumer, in particular, a light system. Further, it relates to a light system using such a user interface device. Moreover, it relates to a method for controlling such a light system using a user interface device. To provide a user interface device, a light system, and a method for controlling a consumer load providing feed-forward or feedback information facilitating an easy and intuitive use of the user interface device when controlling a light system, a user interface device for controlling a connected light system is disclosed. | 03-31-2011 |
20110074673 | METHOD, APPARATUS FOR SENSING MOVED TOUCH AND COMPUTER READABLE RECORD-MEDIUM ON WHICH PROGRAM FOR EXECUTING METHOD THEREOF - Provided is a moved touch sensing method. The moved touch sensing method according to an exemplary embodiment of the present invention includes: acquiring channel information corresponding to a touch inputted into a contact region; converting each of first channel information and second channel information acquired as the touch moves into order information according to a predetermined order; and outputting a movement distance clock signal representing a movement distance of the touch on the basis of the converted order information. According to exemplary embodiments of the present invention, it is possible to reduce a burden on a control processor by outputting information regarding a movement distance and a movement direction of a moved touch by using two clock signals. | 03-31-2011 |
20110080335 | BEDROOM PROJECTOR WITH CLOCK AND AMBIENT LIGHT CONTROL FEATURES - The brightness of a bed-mounted projector that projects images on a wall or ceiling is established in proportion to the ambient light level in the room. A motion detector can be used to operate room lights in the room only at user-defined times. Keystone correction of the image can be made based on the projection angle as sensed by an orientation sensor. An input strip on the headboard centerline can be used as a control input device. | 04-07-2011 |
20110080336 | Human Tracking System - An image such as a depth image of a scene may be received, observed, or captured by a device. A grid of voxels may then be generated based on the depth image such that the depth image may be downsampled. A background included in the grid of voxels may also be removed to isolate one or more voxels associated with a foreground object such as a human target. A location or position of one or more extremities of the isolated human target may be determined and a model may be adjusted based on the location or position of the one or more extremities. | 04-07-2011 |
20110080337 | IMAGE DISPLAY DEVICE AND DISPLAY CONTROL METHOD THEREOF - Disclosed is an image display device capable of recognizing a hand of a user and predefining an operating region. Image recognition means recognizes the position of a hand of the user. Operating region setup means predefines the operating region, on an imaging region plane of imaging means and around a position on which the user's hand is projected, for the purpose of enabling the user to issue instructions to the image display device. When the position on which the user's hand is projected moves and comes close to the periphery of the operating region, the operating region setup means moves the operating region in the direction of the movement of the position on which the user's hand is projected. Further, the image recognition means recognizes a hand-waving motion of the user, whereas the operating region setup means sets the size of the operating region in accordance with the magnitude of the user's hand-waving motion. Consequently, the image display device provides increased ease of operation and makes it possible to define the operating region in accordance with a user's intention without imposing a significant processing load on itself. | 04-07-2011 |
20110080338 | A Display Device - The present invention relates to a display device comprising a battery. | 04-07-2011 |
20110084897 | ELECTRONIC DEVICE - An electronic device is disclosed. The electronic device comprises a distance sensor for sensing a distance between the electronic device and a face of a user of the electronic device, an image sensor for providing an image of the face of the user, and a display for displaying text and/or graphical objects. The electronic device further comprises a control unit operatively connected to the display for controlling the displaying of text and/or a graphical object thereon, to the distance sensor for receiving distance data indicative of said distance, and to the image sensor for receiving image data representing said image. The control unit is adapted to control a font size of said text and/or a size of said graphical object based on the distance data and/or the image data. | 04-14-2011 |
20110084898 | MEMORY SHAPE ELEMENT FOR FLEXIBLE OLED DISPLAY SCREEN - An OLED display has one or more shape memory wires disposed along respective edges of the display and energizable under control of a processor to flatten the display from a rolled or folded configuration. | 04-14-2011 |
20110084899 | IMAGE DISPLAY APPARATUS AND METHOD FOR OPERATING THE SAME - An image display apparatus and a method for operating the same are disclosed. The image display apparatus includes a display, a user input interface for receiving a control signal from a remote controller and processing the received control signal, a network interface for transmitting or receiving data over a network, a controller for controlling a pointer on the display according to the control signal received from the remote controller, and a platform for controlling data transmission or reception over the network according to the control signal received from the remote controller. The platform includes an Operating System (OS) kernel and an application layer that runs on the OS kernel, and the application layer including an installable or deletable application downloaded over the network. | 04-14-2011 |
20110084900 | HANDHELD WIRELESS DISPLAY DEVICE HAVING HIGH-RESOLUTION DISPLAY SUITABLE FOR USE AS A MOBILE INTERNET DEVICE - A handheld wireless display device, having at least SVGA-type resolution, includes a wireless interface, such as Bluetooth™, WiFi™, WiMAX™, cellular or satellite, to allow the device to utilize a number of different hosts, such as a cell phone, personal computer, or media player. The display may be monocular or binocular. Input mechanisms, such as switches, scroll wheels, and touch pads, allow selection and navigation of menus, playing media files, setting volume and screen brightness/contrast, activating host remote controls or performing other commands. The device may include MIM diodes, Hall effect sensors, or other position transducers and/or accelerometers to detect lateral movements along and rotational gestures around the X, Y and Z axes as gesture inputs and movement cues. These commands may change pages, scroll up, down or across an enlarged screen image, such as for web browsing. An embedded software driver permits replicating a high-resolution screen display from a host PC. | 04-14-2011 |
20110084901 | USER INTERFACE DEVICE FOR CONTROLLING A CONSUMER LOAD AND LIGHT SYSTEM USING SUCH USER INTERFACE DEVICE - The invention relates to a user interface device for controlling an electrical consumer, in particular, a light system. | 04-14-2011 |
20110090144 | SYSTEM DELAY MITIGATION IN INTERACTIVE SYSTEMS - A method sends a signal to render visual information on a display, and receives a user response to the rendered visual information. The user response includes a first delay. The method also queries an electronic system for data indicating a second delay. The second delay is a portion of the first delay and attributable to the electronic system. The method further uses the data indicating the second delay to compensate for electronic system delay during interactions with a user. | 04-21-2011 |
20110090145 | METHOD AND ELECTRONIC SYSTEM FOR MULTI-DIRECTIONAL INPUT - A multi-directional input method is applicable to an authentication process of electronic software run on an electronic system. According to the method, when the authentication process is initialized, an authentication module having a multi-directional input object and entry fields is loaded. The multi-directional input object has multiple input sides, which together constitute a polyhedron and each have multiple input units. Each input unit corresponds to at least one input option that can be selected via an input device of the electronic system to constitute the required authentication data for input into the entry field. The authentication module verifies the validity of the input authentication data, and a next function of the electronic system is provided for use when the input authentication data is verified as valid. The multi-directional input method prevents authentication data from being illegally recorded to thereby provide enhanced security protection. | 04-21-2011 |
20110090146 | POSITION DETECTOR AND POSITION INDICATOR - A position detector includes a position indicator including a power source and configured to intermittently transmit a position indication signal to a tablet at a predetermined timing; and the tablet configured to detect a position on its surface pointed to by the position indicator by receiving the position indication signal. The position indicator further includes an information storing section configured to store multiple types of information; a control signal receiving circuitry configured to receive a control signal transmitted from the tablet; an information selecting circuitry configured to select one type of information from among the multiple types of information stored in the information storing section in accordance with a content of the control signal; and an information transmitting circuitry configured to transmit the one type of information selected by the information selecting circuitry to the tablet. | 04-21-2011 |
20110095974 | DISPLAY DEVICE AND METHOD OF CONTROLLING DISPLAY DEVICE - A display device includes a flexible substrate, a display unit including a plurality of light-emitting elements arranged at the substrate and configured to display an image according to an image signal, a displacement sensor provided to a front surface or a back surface of the substrate and configured to detect a curved state of the substrate, and a pixel shift control unit configured to control pixel shifting of the image displayed in the display unit when a curve of the substrate is detected by the displacement sensor. | 04-28-2011 |
20110095975 | MOBILE TERMINAL - A mobile terminal includes a body including a flexible portion, a display unit provided to the body, a sensing unit provided to the body and generating an electric signal in response to bending of the body, and a controller recognizing the electric signal and controlling the display unit according to the electric signal generated by the bending of the body. | 04-28-2011 |
20110095976 | MOBILE TERMINAL - A mobile terminal includes a connector to moveably connect a first body to a second body and a controller to change information displayed on a screen based on movement of the second body relative to the first body. The connector allows the second body to move along a first axis relative to the first body and along a second axis that crosses the first axis relative to the first body. | 04-28-2011 |
20110102314 | Dual-screen electronic reader with tilt detection for page navigation - A dual screen electronic reader and method for navigating a document are disclosed. The electronic reader includes two display screens which can be angled to each other, like an open book, for viewing by a person reading the document. The electronic reader includes a tilt detection system for detecting tilting of the electronic reader, indicating that the user has finished reading the page on the first screen and has pivoted the electronic reader to view the opposite screen. This causes the electronic reader to load a fresh page on the first screen, optionally after a short time delay, which allows for counter-rotational tilting to be taken into consideration. | 05-05-2011 |
20110102315 | REMOTE CONTROL POINTING TECHNOLOGY - A pointing device is disclosed. | 05-05-2011 |
20110102316 | Extensible User Interface For Digital Display Devices - In one embodiment, a system to manage video content comprises an index file management module comprising logic to generate an index file to describe content in an associated video file, store the index file for a video file in a first memory location, separate from a second memory location in which the video file is stored, receive, from a requesting entity, a request for access to the index file, in response to the request, download the index file to the requesting entity, and download the video file to the requesting entity. | 05-05-2011 |
20110102317 | SEMICONDUCTOR INTEGRATED CIRCUIT DEVICE, FACILITY APPLIANCE CONTROL DEVICE, AND APPLIANCE STATE DISPLAY APPARATUS - An application program changes a property value of a graphic object arranged in an object database. An object manager reads out the property value from the object database and then issues a drawing command. A graphics engine executes the drawing command to configure a memory image of the graphic object on a VRAM to display the image on a liquid crystal display via an LCDC. | 05-05-2011 |
20110109538 | ENVIRONMENT SENSITIVE DISPLAY TAGS - This is directed to dynamic tags or screen savers for display on an electronic device. The tags can include several dynamic elements that move across the display. The particular characteristics of the elements can be controlled in part by the output of one or more sensors detecting the environment of the device. For example, the color scheme used for a tag can be selected based on the colors of an image captured by a camera, and the orientation of the movement can be selected from the output of a motion sensing component. The tag can adjust automatically based on the sensor outputs to provide an aesthetically pleasing display that a user can use as a fashion accessory. | 05-12-2011 |
20110109539 | BEHAVIOR RECOGNITION SYSTEM AND METHOD BY COMBINING IMAGE AND SPEECH - A behavior recognition system and method by combining an image and a speech are provided. The system includes a data analyzing module, a database, and a calculating module. A plurality of image-and-speech relation modules is stored in the database. Each image-and-speech relation module includes a feature extraction parameter and an image-and-speech relation parameter. The data analyzing module obtains a gesture image and a speech data corresponding to each other, and substitutes the gesture image and the speech data into each feature extraction parameter to generate image feature sequences and speech feature sequences. The data analyzing module uses each image-and-speech relation parameter to calculate image-and-speech status parameters. The calculating module uses the image-and-speech status parameters, the image feature sequences, and the speech feature sequences to calculate a recognition probability corresponding to each image-and-speech relation parameter, so as to take a maximum value among the recognition probabilities as a target parameter. | 05-12-2011 |
20110109540 | ACCELEROMETER-BASED TAPPING USER INTERFACE - A CE device for, e.g., displaying the time can incorporate an accelerometer to provide various features and enhancements. For example, tapping of the housing as sensed by the accelerometer may be used for controlling various application modes of the device. | 05-12-2011 |
20110109541 | Display control device for remote control device - A display control device for a remote control device is disclosed. The display control device is configured to have connection with a display device and a remote control device spaced apart from each other. The display device is configured to display an operation screen having multiple icons for accepting an operation directed to a control target apparatus. When display is switched from a first operation screen to a second operation screen having a different icon arrangement, the display control device shifts an icon selection state from one icon selected on the first operation screen to an initial selection icon pre-set on the second operation screen and causes the display device to display a visual effect indicative of a direction from the position of the selected one icon to the position of the initial selection icon. | 05-12-2011 |
20110109542 | IMAGE DISPLAY DEVICE AND METHOD OF CONTROLLING THE SAME - An image display device includes: a first audio signal input terminal to which a first audio signal is input from a microphone; and a control section adapted to control power supply from a power supply circuit to the microphone based on type information indicative of whether a type of the microphone to be connected to the first audio signal input terminal is a first type, which requires power supply, or a second type, which does not require power supply. | 05-12-2011 |
20110109543 | METHOD AND APPARATUS FOR DISPLAYING NAVIGATIONAL VIEWS ON A PORTABLE DEVICE - A portable device is disclosed. | 05-12-2011 |
20110115697 | USER INFORMATION PROVISION DEVICE, USER INFORMATION PRESENTATION SYSTEM, AND USER INFORMATION PRESENTATION METHOD - The system includes: an information provision apparatus to which a user's position information is input; and a display device that displays information on the user, the information being output from the information provision apparatus. The information provision apparatus includes: a user information storing unit that stores the information on the user; a display device information storing unit that stores information on an intended display range of the display device; and a display control unit that transmits the information on the user stored in the user information storing unit to the display device when a position of the user falls within the intended display range of the display device. | 05-19-2011 |
20110115698 | DISPLAY APPARATUS, TERMINAL, AND IMAGE DISPLAY METHOD - A display apparatus includes a display unit; a communication unit which communicates with a terminal which displays a personal image provided to the terminal, the terminal being one of a plurality of terminals; and a controller which controls the display unit to display a sharing image shared among users of the plurality of terminals and changes the sharing image displayed on the display unit in accordance with an input received from the terminal. Accordingly, there is provided a display apparatus which provides a video interface including a sharing image and a personal image, a terminal, and an image display method. | 05-19-2011 |
20110115699 | Display control system, display control device, and display control method - Provided is a display control device including a display control unit that, when display information of one content among a plurality of contents is selected on a display screen, creates a next display screen containing display information of at least any of a plurality of contents relevant to the one content, wherein the display information contained in the next display screen is display information of contents according to a selection sequence of a plurality of display information having been selected before among the plurality of contents relevant to the one content. | 05-19-2011 |
20110115700 | System for Displaying Images - A system for displaying images includes a transflective display panel and a light source module oppositely disposed thereto. The light source module includes a light guide plate, a plurality of first light-emitting diodes (LEDs), a plurality of second LEDs, and a lighting control unit electrically connected to the pluralities of first and second LEDs. The light guide plate includes a first portion and a second portion corresponding to a first display region and a second display region of the transflective display panel, respectively. Each first LED is a white light-emitting diode and transmits an emitted light therefrom to the first display region by the first portion of the light guide plate. The plurality of second LEDs includes red, green, and blue LEDs and transmits an emitted light therefrom to the second display region by the second portion of the light guide plate. | 05-19-2011 |
20110115701 | COMMUNICATION TERMINAL, CONTROL METHOD, AND CONTROL PROGRAM - A communication terminal includes: an output unit including a display; an input unit for inputting first operation information; a communication device for transmitting the first operation information to the other communication terminal via the network and receiving second operation information from the other communication terminal via the network; a first output control unit for causing the display to output first information, based on the first operation information, in a first area of the display when the first operation information is input, and for causing the display to output second information, based on the second operation information, in the first area of the display when the second operation information is received; and a comparison unit for comparing a time at which the first operation information is input plus a predetermined period of time with a time at which the second operation information is received, and causing the output unit to output third information based on a result of the comparing. | 05-19-2011 |
20110115702 | Process for Providing and Editing Instructions, Data, Data Structures, and Algorithms in a Computer System - A method and system for computer programming using speech and one or two hand gesture input is described. The system generally uses a plurality of microphones and cameras as input devices. A configurable event recognition system is described allowing various software objects in a system to respond to speech and hand gesture and other input. From this input program code is produced that can be compiled at any time. Various speech and hand gesture events invoke functions within programs to modify programs, move text and punctuation in a word processor, manipulate mathematical objects, perform data mining, perform natural language internet search, modify project management tasks and visualizations, perform | 05-19-2011 |
20110115703 | INFORMATION DISPLAY SYSTEM - A head-worn information display system which includes a display panel. A display mode of information displayed on the display panel is switched automatically according to an active state of a user using the information display system, so as to display information appropriate for the active state of the user. | 05-19-2011 |
20110122058 | Inline control system for therapeutic pad - A controller for use in a therapeutic system having a console disposed in a first housing and a physically separate pad. The controller includes a second housing physically separate from the console and the pad; a processor disposed within the housing and electrically coupled to the console and the pad; a storage medium accessible by the processor and mounted within the second housing; software stored on the storage medium for execution by the processor; a switch coupled to the processor; and a display coupled to the processor. In the illustrative embodiment, the invention further includes a second processor disposed within the housing and electrically coupled to the console and the pad. In a specific implementation, the controller includes software for applying stimulation current to the pad and for regulating heat current applied to the pad. The software includes code for sensing temperature from the pad and for adjusting current to the pad in response to the sensed temperature at the pad and a reference temperature data from the console. The invention enables a thermostimulation system comprising a console disposed in a first housing; a plurality of thermostimulation pads; and a plurality of the inline controllers electrically coupled between the console and a respective one of the pads. | 05-26-2011 |
20110122059 | FINGER SENSING APPARATUS WITH SELECTIVELY OPERABLE TRANSMITTING/RECEIVING PIXELS AND ASSOCIATED METHODS - A finger sensing device may include an integrated circuit (IC) substrate and an array of pixels on the IC substrate. Each pixel may be selectively operable in at least a receiving mode for receiving radiation from an adjacent finger, or a transmitting mode for transmitting radiation into the adjacent finger. The finger sensing device may also include a controller coupled to the array of pixels for selectively operating at least one first pixel in the receiving mode, and while selectively operating at least one second pixel in the transmitting mode. Each pixel may also be selectively operable in a mask mode for neither receiving nor transmitting radiation. The controller may also selectively operate at least one third pixel in the mask mode while selectively operating the at least one first and second pixels in the receiving and transmitting modes. | 05-26-2011 |
20110122060 | OPTICAL NAVIGATION DEVICE - An optical navigation device that can sense the movement of an object, such as a user's finger, so that the movement can control a feature of a consumer digital device such as a cursor on a display screen. The device includes a substrate to which an LED, reflector, and image sensor are attached. Light from the LED is directed by the elliptical reflector toward and through a window that is transparent to the light from the LED and then is reflected off of the user's finger back through the window, through a lens, and onto the image sensor. The reflector is positioned to direct light toward the window at an oblique angle, in the range of 65 to 70 degrees from an angle normal to the window. Further, the reflector is curved to gather light across a large solid angle in the vicinity of the LED. The curved shape of the reflector may be a portion of an ellipsoid and the LED may be located at one of the foci of the ellipsoid, with the window located at the other focus of the ellipsoid. | 05-26-2011 |
20110128216 | METHOD AND APPARATUS FOR A USER INTERFACE - In accordance with an example embodiment of the present invention, an apparatus comprises a first body part, a second body part, and at least one hinge coupling said first body part with said second body part, said at least one hinge enabling relative rotational movement of said first body part and said second body part with respect to each other between at least one closed configuration and at least one open configuration, said apparatus having a tablet configuration such that said at least one hinge is retractable into at least one of said first body part and said second body part while in said at least one open configuration. | 06-02-2011 |
20110128217 | DISPLAY SYSTEM CONTROLLING DISPLAY DATA USING REMOTE CONTROLLER HAVING LANGUAGE CONVERSION KEY - Disclosed herein is a display system controlling display of data on a display according to input of a user through a remote controller. The display system includes the remote controller, a language conversion key for converting the displayed fundamental alphabets into modified alphabets, and a transmitter; and the display including a receiver, an LED module, and a controller for analyzing the control signal received by the receiver, selecting one of modified alphabets belonging to a modified alphabet group, which corresponds to a fundamental alphabet being expressed by the LED module, according to input through the language conversion key and controlling display of the selected modified alphabet through the LED module. Accordingly, a user can easily change a displayed language through the language conversion key. | 06-02-2011 |
20110128218 | INTERACTIVE INPUT SYSTEM AND BEZEL THEREFOR - An interactive input system comprises at least one imaging device having a field of view looking into a region of interest. At least one radiation source emits radiation into the region of interest. A pliable bezel at least partially surrounds the region of interest. The pliable bezel has a reflective surface in the field of view of said at least one imaging device. | 06-02-2011 |
20110128219 | INTERACTIVE INPUT SYSTEM AND BEZEL THEREFOR - An interactive input system comprises at least one imaging device having a field of view looking into a region of interest. At least one radiation source emits radiation into the region of interest. A pliable bezel at least partially surrounds the region of interest. The pliable bezel has a reflective surface in the field of view of said at least one imaging device. | 06-02-2011 |
20110134024 | DISPLAY APPARATUS AND CONTROL METHOD THEREOF - A display apparatus and a control method thereof are disclosed. The display apparatus includes: a connection unit which is to be connected with at least one external device; a signal processing unit which processes an image signal; a display unit which displays an image on the basis of the image signal processed by the signal processing unit; a storage unit which stores therein a setting menu regarding a plurality of functions of the display apparatus; and a controller which controls the signal processing unit so that the setting menu regarding at least one of the plurality of functions of the display apparatus corresponding to the external device which is connected to the connection unit, is displayed through the display unit, and so that the image is displayed through the display unit according to user's setting through the setting menu. | 06-09-2011 |
20110134025 | INPUT DEVICE - A plurality of cylindrical-shaped or spherical-shaped magnets each including the N-pole and the S-pole formed at a predetermined angle interval are rotatably placed between an upper case and a lower case, and a plurality of magnetic detection elements are disposed opposite to the magnets at a predetermined gap. When the small magnets, having a diameter of about 2 to 3 mm, are rotated by the finger, the operation direction and operation amount of the finger can be detected from the rotation direction and rotation angle of the magnets. Therefore, it is possible to obtain a thin input device with a low overall height that is capable of reliable operation. | 06-09-2011 |
20110134026 | IMAGE DISPLAY APPARATUS AND METHOD FOR OPERATING THE SAME - In accordance with an aspect of the present invention, a method for controlling an image display apparatus includes obtaining, by the image display apparatus, emotion information associated with a user of the image display apparatus, determining content to be recommended for the user from among a plurality of different contents, based on the obtained emotion information associated with the user, and selectively or automatically reproducing one of the recommended contents at the image display apparatus. | 06-09-2011 |
20110134027 | INPUT DEVICE - Used in a multimedia operating system for controlling multiple computers, an input device includes a memory unit adapted for storing the ID codes of the computers, a control unit adapted for receiving an external switching signal and fetching the ID code from the memory unit subject to the content of the received external switching signal and then producing a switching packet containing the fetched ID code, a wireless transmitter and receiver unit adapted for transmitting the switching packet produced by the control unit to the computers, and a power supply unit adapted for providing the input device with the necessary working power supply. | 06-09-2011 |
20110134028 | COMMUNICATION TERMINAL DEVICE, COMMUNICATION METHOD, AND COMMUNICATION PROGRAM - A communication terminal device includes: a display for displaying image information; a communication device for transmitting and receiving information to and from another terminal via a network; an input device for entering command information and image information, and a processor configured to perform a first control for causing the display to show, based on input of first command information from the input device during display of a first image, a second image, and transmitting first information to the other terminal via the communication device and perform a second control for causing the display to show the first image based on input of second command information from the input device during display of the second image, and transmitting second information to the other terminal via the communication device. | 06-09-2011 |
20110141006 | DETECTING DOCKING STATUS OF A PORTABLE DEVICE USING MOTION SENSOR DATA - Methods for operating a portable media device are provided. The method includes determining an orientation angle of the portable media device and a frequency spectrum associated with a motion of the portable media player. Based on the orientation angle and the frequency spectrum, the portable media player can determine a motion status and select a mode of operation based on the motion status. In addition, the method also includes determining whether the portable media player is in a dock, resting on a surface, or being handled by a person. The method further includes determining whether the portable media player is located in a moving vehicle, in a stationary vehicle, being held by a moving person, or being held by a stationary person. | 06-16-2011 |
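The abstract above describes classifying a portable player's motion status from its orientation angle and the frequency spectrum of its motion. A minimal sketch of such a classifier, with entirely hypothetical thresholds (upright tolerance, walking-step frequency band, stillness variance) standing in for whatever the actual implementation uses:

```python
def classify_motion(orientation_deg, dominant_freq_hz, accel_variance,
                    upright_tolerance=15.0, walk_band=(1.0, 3.0),
                    still_variance=0.05):
    """Classify a device's motion status from sensor-derived features.

    Illustrative heuristics: a docked or resting player sits near upright
    with almost no vibration; a walking user produces a dominant step
    frequency of roughly 1-3 Hz; otherwise the device is being handled.
    """
    is_upright = abs(orientation_deg) <= upright_tolerance
    is_still = accel_variance <= still_variance
    if is_upright and is_still:
        return "docked_or_resting"
    if walk_band[0] <= dominant_freq_hz <= walk_band[1]:
        return "carried_by_moving_person"
    if is_still:
        return "stationary_handheld"
    return "handled"
```

A mode of operation could then be selected by dispatching on the returned status string.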
20110141007 | DATA PROCESSING APPARATUS, DATA PROCESSING SYSTEM, AND DISPLAY CONTROL METHOD FOR CONTROLLING DISPLAY IN DATA PROCESSING APPARATUS - A data processing apparatus includes a control apparatus configured to control the data processing apparatus and a display apparatus configured to display information, wherein the control apparatus includes a first display control unit configured to control display in the display apparatus and a transfer unit configured to transfer data to be displayed by the display apparatus to the display apparatus, wherein the display apparatus includes a second display control unit configured to control display in the display apparatus and a switching unit configured to switch between the first display control unit and the second display control unit, and wherein, after the switching unit switches from the first display control unit to the second display control unit, the second display control unit controls display in the display apparatus based on the data transferred by the transfer unit. | 06-16-2011 |
20110141008 | MULTIMEDIA PLAYING DEVICE AND OPERATION METHOD THEREOF - A multimedia playing device and an operation method thereof are provided. The multimedia playing device stores an image having a plurality of multimedia options. The operation method sequentially includes the following steps. A portion of the image is defined as a projecting area. The portion of the image corresponding to the projecting area is projected. A movement of the multimedia playing device is sensed. According to the movement of the multimedia playing device, the projecting area is moved in the image. Thereby, the multimedia options are presented in a projecting mode, and a user can browse the multimedia options in a dynamic sensing mode thus to facilitate search and selection. | 06-16-2011 |
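The method above pans a projecting area across a stored options image in response to sensed device movement. A toy sketch of the viewport update, assuming the sensed motion has already been converted to a pixel offset (all names illustrative):

```python
def move_viewport(viewport, motion, image_size):
    """Pan a projecting-area viewport across a larger options image.

    viewport: (x, y, w, h) region of the image currently projected.
    motion: (dx, dy) pixel offset derived from the sensed device movement.
    The viewport is clamped so it never leaves the image bounds.
    """
    x, y, w, h = viewport
    dx, dy = motion
    img_w, img_h = image_size
    x = max(0, min(img_w - w, x + dx))
    y = max(0, min(img_h - h, y + dy))
    return (x, y, w, h)
```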
20110141009 | IMAGE RECOGNITION APPARATUS, AND OPERATION DETERMINATION METHOD AND PROGRAM THEREFOR - An image for an operator is extracted, and an operation determination unit employs a relationship, relative to a marker of an operation input system, for the operator, who is standing behind the marker when viewed by a video camera. When a body part of the operator comes to the front of an operation plane, as viewed by the video camera, the operation determination unit determines that an action for an input operation has been started, and examines the shape and the movement of each individual body part (an open hand, two fingers that are being held up, etc.), and determines whether the shape or the movement is correlated with one of operations that are assumed in advance. | 06-16-2011 |
20110141010 | GAZE TARGET DETERMINATION DEVICE AND GAZE TARGET DETERMINATION METHOD - The object at which a user is gazing is accurately determined from among objects displayed on a screen. A gaze target determination device ( | 06-16-2011 |
20110141011 | METHOD OF PERFORMING A GAZE-BASED INTERACTION BETWEEN A USER AND AN INTERACTIVE DISPLAY SYSTEM - The invention describes a method of performing a gaze-based interaction between a user ( | 06-16-2011 |
20110148752 | Mobile Device with User Interaction Capability and Method of Operating Same - In one embodiment a method of operating a mobile device includes sensing either an orientation or a movement of the mobile device, determining a command based on the sensed orientation or sensed movement, sensing a proximity of an object in relation to at least a portion of the mobile device, and executing the command upon the proximity of the object being sensed. In another embodiment, a method of operating a mobile device governs a manner of interaction of the mobile device relative to one or more other mobile devices. In at least some embodiments, at least one of the mobile devices includes an accelerometer and an infrared proximity sensor, and operation of the mobile device is determined based upon signals from those components. | 06-23-2011 |
20110148753 | Identifying a characteristic of an individual utilizing facial recognition and providing a display for the individual - A method may include automatically remotely identifying at least one characteristic of an individual via facial recognition; and providing a display for the individual, the display having a content at least partially based on the identified at least one characteristic of the individual. A system may include a facial recognition module configured for automatically remotely identifying at least one characteristic of an individual via facial recognition; and a display module coupled with the facial recognition module, the display module configured for providing a display for the individual, the display having a content at least partially based on the identified at least one characteristic of the individual. | 06-23-2011 |
20110148754 | PROJECTION APPARATUS, DISPLAY APPARATUS, INFORMATION PROCESSING APPARATUS, PROJECTION SYSTEM AND DISPLAY SYSTEM - A communication terminal is allowed to make data communication by simple processing. A communication unit is allowed to make data communication with a communication terminal when the detected position of the communication terminal remains unchanged for a predetermined time period or longer, within a range in which an image is projected. | 06-23-2011 |
20110148755 | USER INTERFACE APPARATUS AND USER INTERFACING METHOD BASED ON WEARABLE COMPUTING ENVIRONMENT - Provided is a user interface apparatus based on wearable computing environment, which is worn on a user, including: a sensor unit including at least one sensor worn on the user and outputting a plurality of sensing signals according to a positional change of a user's arm or a motion of a user's finger; and a signal processing unit outputting a user command corresponding to the 3D coordinates of the user's arm and the motion of the user's finger from the plurality of sensing signals output from the sensor unit, and controlling an application program running in a target apparatus using the user command. | 06-23-2011 |
20110156998 | METHOD FOR SWITCHING TO DISPLAY THREE-DIMENSIONAL IMAGES AND DIGITAL DISPLAY SYSTEM - A method for switching to display three-dimensional (3D) images and a digital display system are provided. In the method, a digital display is activated to receive image data comprising at least one image and to detect a triggering signal, which is generated by detecting, through software or hardware means, whether 3D glasses are being used. According to the triggering signal, the digital display switches between formats of two-dimensional (2D) image and 3D image, so as to display the images. | 06-30-2011 |
20110156999 | GESTURE RECOGNITION METHODS AND SYSTEMS - Gesture recognition methods and systems are provided. First, a plurality of gesture templates are provided, wherein each gesture template defines a first gesture characteristic and a corresponding specific gesture. Then, a plurality of images is obtained, and a multi-background model is generated accordingly. At least one object image is obtained according to the multi-background model, wherein the object image includes at least an object having a plurality of edges. The included angles of any two adjacent edges of the object image are gathered as statistics to obtain a second gesture characteristic corresponding to the object image. The second gesture characteristic of the object image is compared with the first gesture characteristic of each gesture template. The specific gesture corresponding to the first gesture characteristic is obtained, when the second gesture characteristic is similar to the first gesture characteristic. | 06-30-2011 |
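The recognition step above compares a gesture characteristic, built from the included angles of adjacent edges, against per-template characteristics. A rough sketch using an angle histogram over a polygonal contour and a simple intersection-over-union similarity; the binning, the scoring function, and any threshold are assumptions for illustration, not the patent's method:

```python
import math
from collections import Counter

def angle_histogram(polygon, bin_size=30):
    """Histogram the included angles between adjacent edges of a closed
    contour, bucketed into bin_size-degree bins for coarse comparison."""
    hist = Counter()
    n = len(polygon)
    for i in range(n):
        ax, ay = polygon[i - 1]          # previous vertex
        bx, by = polygon[i]              # vertex at the angle
        cx, cy = polygon[(i + 1) % n]    # next vertex
        v1 = (ax - bx, ay - by)
        v2 = (cx - bx, cy - by)
        dot = v1[0] * v2[0] + v1[1] * v2[1]
        norm = math.hypot(*v1) * math.hypot(*v2)
        if norm == 0:
            continue  # degenerate edge: skip
        angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
        hist[int(angle // bin_size) * bin_size] += 1
    return hist

def similarity(h1, h2):
    """Histogram intersection-over-union score in [0, 1]."""
    keys = set(h1) | set(h2)
    inter = sum(min(h1[k], h2[k]) for k in keys)
    total = sum(max(h1[k], h2[k]) for k in keys)
    return inter / total if total else 0.0
```

A template whose similarity to the observed histogram exceeds some threshold would then yield the corresponding specific gesture.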
20110157000 | PROJECTION SYSTEM AND METHOD - A projection system includes first and second cameras, a projector, and a processing unit. The first camera captures an object to obtain an image of the object. The projector projects the image of the object and a number of symbols to a projection region. The symbols in the projection region form controlling symbols. The second camera captures the projection region to obtain an image of the projection region. The image of the projection region is detected to determine whether one controlling symbol is selected. If a controlling symbol is selected, the processing unit controls the first camera, the second camera, or the projector correspondingly. | 06-30-2011 |
20110157001 | METHOD AND APPARATUS FOR DISPLAY FRAMEBUFFER PROCESSING - Various methods for display framebuffer processing are provided. One example method includes determining, via a processor, that update criteria associated with a display region have been satisfied, and comparing current frame data for the display region to subsequent frame data for the display region to determine frame data changes associated with the display region. In this regard, the comparing is performed in response to the update criteria being satisfied. The example method may also include facilitating presentation of the frame data changes within the display region on a display. Similar and related example methods and example apparatuses are also provided. | 06-30-2011 |
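The comparison step above can be pictured as a per-region diff of two framebuffers. A minimal sketch, assuming frames as row-major pixel arrays and a rectangular region (names and representation are illustrative):

```python
def region_changes(current, subsequent, region):
    """Diff current vs subsequent frame data inside one display region.

    current, subsequent: row-major 2D arrays of pixel values.
    region: (x, y, w, h) rectangle within the frame.
    Returns the changed pixel coordinates — the 'frame data changes'
    that would then be presented within the region.
    """
    x, y, w, h = region
    changes = []
    for row in range(y, y + h):
        for col in range(x, x + w):
            if current[row][col] != subsequent[row][col]:
                changes.append((col, row))
    return changes
```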
20110157002 | HAPTIC FEEDBACK DEVICE - A haptic feedback device includes a screen and a piezoelectric vibrator partially connected with the screen. The screen includes a display viewing area and a frame surrounding the display viewing area. The piezoelectric vibrator engaged with the screen defines a moving portion having a projecting portion extending toward the screen. The screen defines a cavity and the projecting portion of the piezoelectric vibrator is at least partially accommodated in the cavity. | 06-30-2011 |
20110157003 | HAPTIC FEEDBACK DEVICE - A haptic feedback apparatus includes a screen and a piezoelectric vibrator partially connected with the screen. The screen comprises a display viewing area and a frame surrounding the display viewing area. The piezoelectric vibrator engaged with the screen defines a moving portion having a projecting portion extending from one distal end of the moving portion toward the screen. Wherein, the screen defines a cavity and the projecting portion of the piezoelectric vibrator is at least partially accommodated in the cavity. The manufacture of the haptic feedback apparatus is simple and low-cost. | 06-30-2011 |
20110157004 | Information processing apparatus, information processing method, program, control target device, and information processing system - There is provided a remote commander including an acquisition section which acquires layout information having one or a plurality of pieces of object information from a control target device via a communication section, the object information being formed by associating command identification information for identifying a command with respect to the control target device, object identification information for identifying an object, and coordinate information indicating a position of the object on a screen with each other, and a display control section which causes the object to be displayed at the position on the screen indicated by the coordinate information, the object being identified by the object identification information with respect to each of the pieces of object information that the layout information acquired by the acquisition section has. | 06-30-2011 |
20110157005 | HEAD-MOUNTED DISPLAY - A head-mounted display includes detection means that is mounted on the body of a user for detecting coordinates of input, which are coordinates written/drawn by the user into a two-dimensional detection area, coordinates conversion means for detecting start position coordinates of the input by the user in the two-dimensional detection area based on the coordinates of input, converting the start position coordinates into initial position coordinates in a display area of an image display, and determining trajectory coordinates in the display area by using the initial position coordinates and a positional relationship between the coordinates of input and the start position coordinates, and trajectory image generation means for generating a trajectory image of the input based on the trajectory coordinates and outputting it to the image display. | 06-30-2011 |
20110157006 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM - An information processing apparatus includes a detection section configured to detect an operating body in proximity to a display screen, an identification section configured to identify whether the operating body detected by the detection section is in proximity to a selection item included in the display screen, and a display control section configured to temporarily superimpose and display a region including selection item content of the selection item over a previously displayed display region when the identification section identifies the operating body as being in proximity to the selection item and configured to end a display of the region including the selection item content of the selection item when the detection section no longer detects the operating body as being in proximity to the display screen. | 06-30-2011 |
20110157007 | OPTICAL PEN AND OPERATING METHOD OF THE SAME - An optical pen comprises a housing, a pen body, a pen tip, a switch and a control unit. The pen body, disposed inside the housing, has a hinge mechanism allowing the pivoting of the pen body. The pen tip is disposed at a front end of the pen body and protrudes from the housing. When the pen tip touches a projection surface, the pen body is rotated through the hinge mechanism to trigger the switch. The switch disposed inside the housing is electrically connected to the control unit. When the switch is triggered, the control unit outputs an electrical signal. | 06-30-2011 |
20110157008 | DEVICE-CONTROL SYSTEM, TERMINAL APPARATUS, AND DEVICE - A device-control system includes a device operated by a remote controller and a terminal apparatus displaying an image. The device-control system controls the device via a network. The terminal apparatus includes an output section, a terminal's receiver for receiving an image similar to the appearance of the remote controller for operating the device, a terminal's output processor for controlling the output section to output the image received by the terminal's receiver, and a terminal's transmitter for transmitting operational information generated by an operation performed by use of the terminal apparatus with respect to the image outputted by the output section. The device includes a device's processor for performing processing in accordance with the operational information. | 06-30-2011 |
20110157009 | DISPLAY DEVICE AND CONTROL METHOD THEREOF - A display device and a control method thereof are provided. The display device comprises a camera obtaining an image including a gesture of a user and a controller extracting the image of the gesture from the obtained image and executing a function mapped to the extracted gesture when the range of the gesture exceeds a threshold corresponding to the user. The method of controlling the display device executes functions respectively corresponding to specific gestures of users when the gestures exceed thresholds respectively corresponding to the users to operate the display device in a manner most suitable for the range of the gesture of each user. | 06-30-2011 |
20110157010 | ELECTROMECHANICAL DISPLAY DEVICES AND METHODS OF FABRICATING THE SAME - MEMS devices include materials which are used in LCD or OLED fabrication to facilitate fabrication on the same manufacturing systems. Where possible, the same or similar materials are used for multiple layers in the MEMS device, and use of transparent conductors for partially transparent electrodes can be avoided to minimize the number of materials needed and minimize fabrication costs. Certain layers comprise alloys selected to achieve desired properties. Intermediate treatment of deposited layers during the manufacturing process can be used to provide layers having desired properties. | 06-30-2011 |
20110163944 | INTUITIVE, GESTURE-BASED COMMUNICATIONS WITH PHYSICS METAPHORS - A user can make an intuitive, physical gesture with a first device, which can be detected by one or more onboard motion sensors. The detected motion triggers an animation having a “physics metaphor,” where the object appears to react to forces in a real world, physical environment. The first device detects the presence of a second device and a communication link is established allowing a transfer of data represented by the object to the second device. During the transfer, the first device can animate the object to simulate the object leaving the first device and the second device can animate the object to simulate the object entering the second device. In some implementations, in response to an intuitive gesture made on a touch sensitive surface of a first device or by physically moving the device, an object can be transferred or broadcast to other devices or a network resource based on a direction, velocity or speed of the gesture. | 07-07-2011 |
20110163945 | PORTABLE DEVICE FOR CONTROLLING INSTRUCTION EXECUTION BY MEANS OF ACTUATORS PLACED ON A REAR SURFACE - The invention relates to a device (D) for controlling the execution of instructions, that comprises: i) a receptacle (RP) that can be held by a user in at least one hand and has a rear surface (FAR) provided with rear actuators (AC); ii) a storing means (MS) capable of storing, for at least one application, a table of correspondence between at least one operation mode and a set of selected instructions associated with icons of relative positions defined according to a selected arrangement; and iii) control means (MC) for, in case an application operation mode is selected, determining the table corresponding thereto and associating the instructions contained in said table to actuation types of the rear actuators (AC) selected on the basis of the relative positions of icons respectively associated with these instructions, at least some of said icons being displayed on at least one screen (EC | 07-07-2011 |
20110163946 | SIMULATION OF THREE-DIMENSIONAL TOUCH SENSATION USING HAPTICS - An apparatus includes a processing system, a display, and a plurality of haptic actuators. The display and the haptic actuators are coupled to the processing system. The processing system is configured to control the haptic actuators to simulate movement in a particular direction corresponding to movement in the particular direction in a visual depiction in the display. | 07-07-2011 |
20110163947 | Rolling Gesture Detection Using a Multi-Dimensional Pointing Device - A system and a method for performing a rolling gesture using a multi-dimensional pointing device. An initiation of a gesture by a user of the multi-dimensional pointing device is detected. A rolling gesture metric corresponding to performance of a rolling gesture comprising rotation of the multi-dimensional pointing device about a longitudinal axis of the multi-dimensional pointing device is determined. Information corresponding to the rolling gesture metric is conveyed to a client computer system, wherein the client computer system is configured to manipulate an object in a user interface of the client computer system in accordance with the rolling gesture metric. | 07-07-2011 |
20110163948 | METHOD SYSTEM AND SOFTWARE FOR PROVIDING IMAGE SENSOR BASED HUMAN MACHINE INTERFACING - Disclosed are a method, system, and associated modules and software components for providing image sensor based human machine interfacing (“IBHMI”). According to some embodiments of the present invention, output of an IBHMI may be converted into an output string or into a digital output command based on a first mapping table. An IBHMI mapping module may receive one or more outputs from an IBHMI and may reference a first mapping table when generating a string or command for a first application running on the same or another functionally associated computing platform. The mapping module may emulate a keyboard, a mouse, a joystick, a touchpad or any other interface device compatible, suitable or congruous with the computing platform on which the first application is running. | 07-07-2011 |
20110169725 | FUNCTION MEASURING DEVICE - [PROBLEM TO BE SOLVED] To provide a new function measurement apparatus capable of measuring a function of a person while the test subject moves his or her whole body. | 07-14-2011 |
20110169726 | EVOLVING UNIVERSAL GESTURE SETS - In a gesture-based system, gestures may control aspects of a computing environment or application, where the gestures may be derived from a user's position or movement in a physical space. Gesture recognition data, used to recognize gestures from captured data representative of a user's input gestures, may be evolved based on captured data from a plurality of users. A common set or default set of gesture recognition data may be evolved by selecting a plurality of users for tracking. Captured data of the plurality of users may be processed to identify input gesture data for the plurality of users, and the gesture recognition data may be evolved based on features of the input gesture data that is common to multiple users. The evolved gesture recognition data may be implemented not only for the users tracked, but for users not tracked. An identifier may identify when the evolved gesture recognition data applies and implement the evolved gesture recognition data when the identifier is present. | 07-14-2011 |
20110169727 | INTERACTIVE INPUT SYSTEM AND ILLUMINATION SYSTEM THEREFOR - An interactive input system includes at least one illumination source emitting radiation into a region of interest; at least one imaging assembly capturing image frames of the region of interest, the at least one illumination source being in the field of view of the at least one imaging assembly; and a controller communicating with the at least one illumination source, the controller controlling the intensity of radiation emitted by the at least one illumination source during image frame capture. | 07-14-2011 |
20110169728 | ROTATABLE DISPLAY DEVICE AND IMAGE DISPLAYING METHOD THEREOF - A rotatable display device and an image displaying method thereof are provided. In the present method, a plurality of display signal formats is defined in the rotatable display device, wherein each of the display signal formats corresponds to a display direction of the rotatable display device. Then, one of the display signal formats is provided to a host according to the display direction of the rotatable display device. Finally, the rotatable display device displays an image signal which is directly outputted according to the display signal format received by the host. | 07-14-2011 |
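The method above advertises a different display signal format to the host for each display direction, so the host's output can be shown directly without an extra rotation pass. A toy version that swaps width and height for the 90- and 270-degree directions (the native resolution here is an illustrative assumption):

```python
def format_for_orientation(rotation_deg, native=(1920, 1080)):
    """Pick the display signal format to advertise to the host for a
    rotatable display. At 90 or 270 degrees the panel's width and height
    are swapped so the host renders in the current display direction."""
    w, h = native
    return (h, w) if rotation_deg % 180 == 90 else (w, h)
```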
20110169729 | Method and an apparatus for performing interaction between a mobile device and a screen - A method for performing an interaction between a mobile device and a screen having a plurality of NFC tags comprising data which can be read by said mobile device by using an NFC reader module of said mobile device to read a tag, wherein an image which is part of an application service is displayed on said screen such that one or more tags of the screen correspond to the indication of respective inputs or selections which the user may choose through his mobile phone when using said application service, wherein said image being displayed on said screen is controlled by a server on which said application service is running, said server being connected to said mobile phone of said user through a network link. | 07-14-2011 |
20110169730 | SIGHT LINE INPUT USER INTERFACE UNIT, USER INTERFACE METHOD, USER INTERFACE PROGRAM, AND RECORDING MEDIUM WITH USER INTERFACE PROGRAM RECORDED - A sight line input user interface unit, a sight line input user interface method, a sight line input user interface program, and a recording medium with the program recorded in which the user's intention can be properly recognized to prevent a false judgment are provided. The sight line input user interface unit includes information display elements ( | 07-14-2011 |
20110175801 | Directed Performance In Motion Capture System - Techniques for enhancing the use of a motion capture system are provided. A motion capture system tracks movement and audio inputs from a person in a physical space, and provides the inputs to an application, which displays a virtual space on a display. Bodily movements can be used to define traits of an avatar in the virtual space. The person can be directed to perform the movements by a coaching avatar, or visual or audio cues in the virtual space. The application can respond to the detected movements and voice commands or voice volume of the person to define avatar traits and initiate pre-scripted audio-visual events in the virtual space to provide an entertaining experience. A performance in the virtual space can be captured and played back with automatic modifications, such as alterations to the avatar's voice or appearance, or modifications made by another person. | 07-21-2011 |
20110175802 | METHOD AND SYSTEM FOR OPERATING ELECTRIC APPARATUS - A method and a system for operating an electric apparatus are provided. In the present invention, first, an image capturing unit is enabled for capturing an image. Next, a palm component on the image is detected. Afterwards, a center of mass in the palm component is calculated according to a principal component analysis (PCA) algorithm, so as to use the center of mass to simulate a cursor. Then, a width of the palm of the palm component is calculated. Finally, the width of the palm is compared with a threshold so as to execute a click action if the width of the palm is greater than the threshold. | 07-21-2011 |
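The abstract above derives a cursor from the palm's center of mass and a click from the palm width crossing a threshold. A simplified sketch that uses the plain centroid of palm-labelled pixels in place of the PCA step, with a hypothetical width threshold:

```python
def palm_cursor_and_click(mask_points, width_threshold):
    """Estimate a cursor position and click state from palm pixels.

    mask_points: (x, y) pixels labelled as palm by the detector.
    The centroid stands in for the abstract's PCA-derived center of
    mass; palm 'width' is taken as the horizontal extent. Both the
    width measure and the threshold value are illustrative.
    """
    n = len(mask_points)
    cx = sum(x for x, _ in mask_points) / n
    cy = sum(y for _, y in mask_points) / n
    width = max(x for x, _ in mask_points) - min(x for x, _ in mask_points)
    return (cx, cy), width > width_threshold
```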
20110175803 | SYSTEM AND METHOD OF SCREEN MANIPULATION USING HAPTIC ENABLE CONTROLLER - An interface system and a method for manipulating a display are disclosed. The interface system includes a display having a scroll area and a cursor presented thereon, a controller for manipulating a position of the cursor on the display, and a haptic device for generating a plurality of tactile feedbacks to a user through the controller, wherein a movement of the cursor across a peripheral edge of the scroll area of the display results in the haptic device generating a first tactile feedback of the plurality of tactile feedbacks representing a scroll mode, and wherein a movement of the cursor while the cursor is positioned within the scroll area of the display results in the haptic device generating a second tactile feedback of the plurality of tactile feedbacks representing a scroll rate of a visual feedback presented on the display. | 07-21-2011 |
20110175804 | EVENT GENERATION BASED ON PRINT PORTION IDENTIFICATION - An optical scanner is configured to scan multiple print portions of a body part such as a finger. The optical scanner identifies a first one of the print portions in an area of an optical surface. An event such as launching an application is generated based on identifying the first print portion in the area of the optical surface. In addition, various events can be generated based on different combinations of print portions in different areas of the optical surface. | 07-21-2011 |
20110175805 | MOTION CONTROLLABLE DUAL DISPLAY PORTABLE MEDIA DEVICE - Methods and apparatus of interaction with and control of a portable media device through applied motion. In the embodiments described herein, the portable media device can include at least two displays arranged such that only one can be presented at a time. The portable media device can be configured to operate as an electronic book (e-book) having at least one electrophoretic-type display whose refresh time is less than the amount of time needed to rotate the e-book to view the refreshed display. | 07-21-2011 |
20110175806 | ELECTRONIC DEVICE FOR USE IN MOTION DETECTION AND METHOD FOR OBTAINING RESULTANT DEVIATION THEREOF - An electronic device utilizing a nine-axis motion sensor module, capable of accurately outputting a resultant deviation including deviation angles in a 3D reference frame, is provided. The present invention provides a novel comparison and compensation to accurately obtain the resultant deviation in the presence of external and/or internal interferences, including those caused by undesirable electromagnetic fields and those associated with undesirable external forces and axial accelerations. The output of the nine-axis motion sensor module, which includes a rotation sensor, an accelerometer and a magnetometer, is obtained and compensated with a comparison of different states of the motion sensor module, such that an updated state associated with the output and the resultant deviation angles are obtained in an absolute manner, with the undesirable external interferences effectively excluded. | 07-21-2011 |
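The abstract above does not disclose its exact fusion and compensation method; as a minimal illustration of the general idea of compensating one sensor's output with another's, here is a one-axis complementary filter that fuses an integrated gyroscope rate with an accelerometer-derived tilt angle. All names and the 0.98 blend factor are assumptions, not values from the patent:

```python
# Illustrative (not the patent's) sensor-fusion step: the gyro is trusted
# over short time scales, while the accelerometer-derived angle slowly
# corrects the accumulated gyro integration drift.

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Return the updated tilt-angle estimate (degrees)."""
    integrated = angle + gyro_rate * dt                    # short-term: gyro
    return alpha * integrated + (1 - alpha) * accel_angle  # long-term: accel
```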
20110181503 | REPRODUCTION DEVICE, REPRODUCTION SYSTEM AND NON-TRANSITORY COMPUTER-READABLE STORAGE MEDIUM - A reproduction device comprises an operation input unit to which an operation command is input, a communication unit that connects with another reproduction device via a communication network, and a reproduction unit that reproduces data. The reproduction device further comprises a control unit that, when an operation command specifying the data to be coordinated and reproduced by each of the other reproduction device and the reproduction unit is input to the operation input unit, transmits a coordinated reproduction command for coordinating and reproducing the specified data to the other reproduction device via the communication unit, acquires the specified data to be reproduced, and causes the reproduction unit to reproduce the acquired data. | 07-28-2011 |
20110181504 | Electronic Device - Provided is an electronic device which easily selects and executes an application relating to characters inputted by a user. | 07-28-2011 |
20110187637 | Tactile Input Apparatus - A tactile input apparatus comprising ring-shaped elements configured to conform to a user's fingers, tactile capacitance sensors operably affixed to the periphery of the ring-shaped elements, and a control unit is provided. Each tactile capacitance sensor is positioned parallel to the underside of each finger and reads a change in capacitance on contacting the user's thumb or palm. The control unit, in wired or wireless electronic communication with each tactile capacitance sensor and with the user's computing device, continuously transmits capacitance readings multiple times per second to software on the computing device. The software monitors and processes the capacitance readings from the control unit and controls output to the computing device. The software evaluates its logic and then enacts one or more custom outputs. The positioning of the tactile capacitance sensors on the periphery of the ring-shaped elements prevents confinement of the user's fingers and allows mobility of the user's fingers. | 08-04-2011 |
20110187638 | Interactive module applied in 3D interactive system and method - An interactive module applied in a 3D interactive system calibrates a location of an interactive component, or calibrates a location and an interactive condition of a virtual object in a 3D image, according to a location of a user. In this way, even if the location of the user changes, so that the location of the virtual object seen by the user changes as well, the 3D interactive system can still correctly decide an interactive result according to the corrected location of the interactive component, or according to the corrected location and corrected interactive condition of the virtual object. | 08-04-2011 |
20110187639 | DUAL-MODE INPUT DEVICE OPERABLE IN A SELECTED ONE OF A RELATIVE COORDINATE MODE AND AN ABSOLUTE COORDINATE MODE - A dual-mode input device includes a relative coordinate generator disposed in a casing for detecting motion of the casing and for generating relative coordinate information based on detected motion of the casing, and a processing unit. The processing unit includes a coordinate storing module for storing absolute coordinate information, an absolute coordinate generator for generating updated absolute coordinate information based on the relative coordinate information received from the relative coordinate generator and the absolute coordinate information received from the coordinate storing module, and for storing the updated absolute coordinate information in the coordinate storing module, and an output selecting module operable in one of a relative coordinate mode, in which the output selecting module outputs the relative coordinate information, and an absolute coordinate mode, in which the output selecting module outputs the absolute coordinate information. | 08-04-2011 |
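The dual-mode behavior in the entry above can be sketched briefly: relative deltas from the motion detector are accumulated into stored absolute coordinates, and an output-selecting step emits either form. Class and method names are assumptions for illustration:

```python
# Sketch of the dual-mode input device: the coordinate-storing module
# holds the absolute position, the absolute-coordinate generator updates
# it from each relative motion, and the output selector picks a mode.

class DualModeDevice:
    def __init__(self):
        self.abs_x, self.abs_y = 0, 0   # coordinate-storing module
        self.mode = "relative"

    def report(self, dx, dy):
        # Update stored absolute position from the detected relative motion.
        self.abs_x += dx
        self.abs_y += dy
        if self.mode == "relative":
            return (dx, dy)             # relative coordinate mode
        return (self.abs_x, self.abs_y) # absolute coordinate mode
```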
20110187640 | Wireless Hands-Free Computing Headset With Detachable Accessories Controllable by Motion, Body Gesture and/or Vocal Commands - A remote control microdisplay device that uses hand movement, body gesture, head movement, head position and/or vocal commands to control the headset, a peripheral device, a remote system, network or software application, such as to control the parameters of a field of view for the microdisplay within a larger virtual display area associated with a host application, a peripheral device or host system. The movement and/or vocal commands are detected via the headset and/or detachable peripheral device connected to the headset microdisplay device via one or more peripheral ports. | 08-04-2011 |
20110193771 | ELECTRONIC DEVICE CONTROLLABLE BY PHYSICAL DEFORMATION - An electronic apparatus includes a display; at least two corners and, among them, at least two bendable corners; one or more sensors arranged to detect the state of bending of at least two corners, here referred to as actuating corners, among the at least two bendable corners; and a controller for controlling the position within the display of an element displayed on the display based on the state of bending of the actuating corners. A method and a computer program are also disclosed. | 08-11-2011 |
20110193772 | Biaxial rotary display module - A biaxial rotary display module is disclosed, which includes a first casing, a rail disposed on the first casing, a rotary piece arranged in the rail, a second casing having a display panel, and two hinges. The rotary piece has a pivoting portion pivoted to the first casing, and the rotary piece is rotated in the rail. The hinges are disposed on opposite sides of the pivoting portion for connecting the second casing and the rotary piece, thereby enabling the second casing to be pivoted relative to the first casing via the two hinges and swiveled relative to the first casing via the rotary piece. | 08-11-2011 |
20110193773 | HEADS-UP DISPLAY FOR A GAMING ENVIRONMENT - A heads-up display associated with a user manipulated object in a gaming environment, particularly a racing game. According to one aspect the heads-up display may reduce the distance a user's eye travels between a focal area and the heads-up display. | 08-11-2011 |
20110193774 | INFORMATION PROCESSING DEVICE - In an operation control device for controlling the operation of an appliance such as a television receiver, a human-perceiving sensor is used as a unit for detecting movements of the operator. Continuous movements, such as hand-waving movements performed by the operator with the intention of operating the appliance, are then extracted from the sensor output. Moreover, if the operator performs one and the same movement for a period longer than a certain constant time-period, a control determined in advance is exerted over the operation control device. Also, the situation in which the movements of the operator are detected is displayed on the appliance, so that the operator is permitted to perform a more accurate movement by the resulting feedback effect. | 08-11-2011 |
20110199289 | INPUT APPARATUS, CONTROL APPARATUS, CONTROL SYSTEM, HANDHELD APPARATUS, AND CONTROL METHOD - [Object] To provide techniques of an input apparatus, a control apparatus, and the like with which an image displayed on a screen can be prevented from being moved unintentionally in a case where a user operates an operation section provided to the input apparatus. | 08-18-2011 |
20110199290 | DIGITAL SIGNS - A method for pairing a control device with a digital sign is provided. The method includes receiving control device geometric attributes and digital sign geometric attributes, determining a digital sign identification based on the control device geometric attributes and the digital sign geometric attributes, and transmitting the digital sign identification to the control device. The control device geometric attributes may define geometric attributes of the control device. The digital sign geometric attributes may define geometric attributes of the digital sign that the control device is attempting to control. The digital sign identification may define the digital sign that the control device is attempting to control. | 08-18-2011 |
20110199291 | GESTURE DETECTION BASED ON JOINT SKIPPING - A system is disclosed for detecting or confirming gestures performed by a user by identifying a vector formed by non-adjacent joints and identifying the angle the vector forms with a reference point. Thus, the system skips one or more intermediate joints between an end joint and a proximal joint closer to the body core of a user. Skipping one or more intermediate joints results in a more reliable indication of the position or movement performed by the user, and consequently a more reliable indication of a given gesture. | 08-18-2011 |
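The joint-skipping idea in the entry above can be sketched in a few lines: form a vector directly from a proximal joint (e.g. a shoulder) to an end joint (e.g. a hand), bypassing the intermediate elbow, and measure its angle against a reference direction. The 2-D joint layout and the names are assumptions for illustration:

```python
# Sketch of gesture detection by joint skipping: the intermediate joint is
# never used, so noise in its tracked position cannot disturb the angle.
import math

def skip_joint_angle(proximal, end, reference=(1.0, 0.0)):
    """Angle in degrees between the proximal->end vector and a reference."""
    vx, vy = end[0] - proximal[0], end[1] - proximal[1]
    dot = vx * reference[0] + vy * reference[1]
    cross = vx * reference[1] - vy * reference[0]
    return math.degrees(math.atan2(abs(cross), dot))
```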
20110199292 | Wrist-Mounted Gesture Device - A wrist-mounted gesture device, system, and method are disclosed. The wrist-mounted gesture device includes at least one accelerometer adapted to detect acceleration caused by one or more gestures of the user. The accelerometer provides data to a microcontroller which is adapted to interpret the gesture data and match it with corresponding predefined gestures. The device includes wireless connection circuitry which allows the device to be wirelessly interfaced with an electronic device. The electronic device may be a device within the living space or environment of the user. A highly effective gesture system is utilized in order to produce accurately recognizable gestures in one, two, or three dimensions. In certain embodiments, movements toward and away from numbers in a standard keypad arrangement can be used, and vertical and horizontal movements corresponding to affirmative and negative gestures are used. In other embodiments, the device can be used in a mapping system of a room or interior space to assist users in finding objects or locations in such a room. In still other embodiments, the device can be used to assess tremors in a patient. | 08-18-2011 |
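Matching sampled acceleration data against predefined gestures, as the entry above describes, can be sketched as a nearest-template lookup. Fixed-length traces and squared Euclidean distance are simplifying assumptions (a real device might use dynamic time warping or another alignment method):

```python
# Sketch of the microcontroller's matching step: compare the sampled
# acceleration trace to each stored template and pick the closest one.

def match_gesture(trace, templates):
    """templates: {name: list of samples}; returns the best-matching name."""
    def dist(a, b):
        # Squared Euclidean distance between two equal-length traces.
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(templates, key=lambda name: dist(trace, templates[name]))
```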
20110199293 | MULTI-ORIENTATION HANDWRITING TRACE INPUT DEVICE AND METHOD FOR ROTATING ITS COORDINATE PLANE - The present invention provides a multi-orientation handwriting trace input device and a method for rotating a coordinate plane thereof, and relates to the field of computer peripheral input devices. The device comprises a coordinate indicator and a data receiver, wherein the coordinate indicator enables trace input within a predefined coordinate plane of the data receiver. The device further comprises a data communication unit and a coordinate plane rotation unit. The data communication unit is provided in the data receiver. A processor of the data receiver transfers operation state information of the data communication unit to the coordinate plane rotation unit. The coordinate plane rotation unit rotates the coordinate plane of the data receiver according to the received operation state information of the data communication unit. The present invention can be implemented in a simple and reliable way, and is easy to use for both right-handed and left-handed users. The present invention is suitable for use at various angles and can find a wide range of applications. | 08-18-2011 |
20110199294 | DEVICES, SYSTEMS AND METHODS OF CAPTURING AND DISPLAYING APPEARANCES - Some demonstrative embodiments of the invention include systems, devices and/or methods enabling appearance comparison. The system, according to some demonstrative embodiments, may include at least one interactive imaging and display station. The station may include, for example, a mirror-display device capable of selectably operating in either or both a mirror mode or a display mode; an imaging device to capture one or more appearances appearing in a field of view in front of the mirror-display device; and/or an image control unit to select the mode of operation of the mirror-display device according to a user command. Other embodiments are described and claimed. | 08-18-2011 |
20110199295 | Human Interface Input Acceleration System - A method and system for transmitting data to and from a hand-held host device are disclosed. An accessory device for interfacing with a host device includes a communication channel designed to establish a bidirectional data link between the accessory device and the host device. The accessory device also includes a storage unit communicatively coupled to the communication channel. The storage unit is designed to store various data. In addition, at least a first data is selectively transmitted from the stored data of the accessory device to the host device through the established bidirectional data link. | 08-18-2011 |
20110205147 | Interacting With An Omni-Directionally Projected Display - Concepts and technologies are described herein for interacting with an omni-directionally projected display. The omni-directionally projected display includes, in some embodiments, visual information projected on a display surface by way of an omni-directional projector. A user is able to interact with the projected visual information using gestures in free space, voice commands, and/or other tools, structures, and commands. The visual information can be projected omni-directionally, to provide a user with an immersive interactive experience with the projected display. The concepts and technologies disclosed herein can support more than one interacting user. Thus, the concepts and technologies disclosed herein may be employed to provide a number of users with immersive interactions with projected visual information. | 08-25-2011 |
20110205148 | Facial Tracking Electronic Reader - Facial actuations, such as eye actuations, may be used to detect user inputs to control the display of text. For example, in connection with an electronic book reader, facial actuations and, particularly, eye actuations can be interpreted to indicate when to turn a page, when to provide a pronunciation of a word, when to provide a definition of a word, and when to mark a spot in the text, as examples. | 08-25-2011 |
20110205149 | MULTI-MODAL INPUT SYSTEM FOR A VOICE-BASED MENU AND CONTENT NAVIGATION SERVICE - A system and method for providing voice prompts that identify task selections from a list of task selections in a vehicle, where the user employs an input device, such as a scroll wheel, to activate a particular task and where the speed of the voice prompt increases and decreases depending on how fast the user rotates the scroll wheel. | 08-25-2011 |
20110205150 | ELECTRONIC DEVICE - According to one embodiment, an electronic device includes a main body unit, a hard disk drive housed inside the main body unit and including a head which performs reading and writing of data to a magnetic disk, a display unit pivotable between a first position where the display unit is laid parallel to the main body unit and a second position where the display unit is raised relative to the main body unit, a sensor which senses an angle at which the display unit is positioned, and a control unit. The control unit retracts the head to a retraction position when the sensor senses a change in the angle of the display unit. | 08-25-2011 |
20110205151 | Methods and Systems for Position Detection - A computing device, such as a desktop, laptop, tablet computer, a mobile device, or a computing device integrated into another device (e.g., an entertainment device for gaming, a television, an appliance, kiosk, vehicle, tool, etc.) is configured to determine user input commands from the location and/or movement of one or more objects in a space. The object(s) can be imaged using one or more optical sensors and the resulting position data can be interpreted in any number of ways to determine a command, including 2-dimensional and 3-dimensional movements with or without touch. | 08-25-2011 |
20110205152 | LOCATION-BASED INFORMATION - In response to a positional relationship (distance, or a combination of distance and angle) between an information output unit and a user who uses information displayed in the display unit, a control unit changes the amount of information to be displayed in the display unit based on an information level, the number of pieces of information, a scrolling speed or a cycle. In some aspects, the control unit controls the information output unit to increase the amount of information to be output when the user is close to the display unit. | 08-25-2011 |
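The distance-to-detail rule in the entry above (more information when the user is close to the display) can be sketched with a simple step function. The thresholds and item counts are illustrative assumptions, not values from the patent:

```python
# Sketch of scaling the amount of displayed information by user distance:
# close viewers get full detail, distant viewers get a single headline.

def items_to_display(distance_m, total_items):
    if distance_m < 1.0:                 # close: show everything
        return total_items
    if distance_m < 3.0:                 # mid-range: show a summary subset
        return max(1, total_items // 2)
    return 1                             # far away: one headline item
```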
20110205153 | DATA TRANSMISSION DEVICE, DATA TRANSMISSION METHOD, DATA COMMUNICATION SYSTEM, PROGRAM, AND RECORDING MEDIUM - A mobile phone ( | 08-25-2011 |
20110210913 | DISPLAY AND WRITING DEVICE - A display and writing device is disclosed that comprises a flexible interface module for providing a writing surface and displaying a captured content; a scanner for scanning the writing surface and producing the captured content accordingly; at least two rollers for driving the flexible interface module to pass the scanner; and an operating module used to operate the flexible interface module, the scanner and the rollers. | 09-01-2011 |
20110215995 | INFORMATION PROCESSING DEVICE, METHOD AND PROGRAM - An information processing device includes a capture section capturing an image of an object, an acquisition section acquiring the image captured by the capture section, a calculation section calculating vibration information on the basis of the image acquired by the acquisition section, a determination section determining a vibration command on the basis of the vibration information calculated by the calculation section, and a control section executing predetermined processing on the basis of the vibration command determined by the determination section. | 09-08-2011 |
20110215996 | COMPUTER AND METHOD FOR OPERATING THE SAME - A computer includes a touch element, a projection module, and a calculating element. The touch element has an operating area. When the operating area is touched, the touch element generates an input signal. The projection module is used for projecting an operating image in the operating area. The calculating element is connected with the projection module and the touch element and is used for processing the input signal to transform the input signal to a touch signal. | 09-08-2011 |
20110215997 | METHOD AND APPARATUS FOR PROVIDING FUNCTION OF PORTABLE TERMINAL USING COLOR SENSOR - A method for providing a function of a portable terminal is provided. The method includes activating a color sensor upon execution of an application, displaying a color recognized by the color sensor on screen data corresponding to the executed application, and controlling a function based on a color recognized by the executed application. | 09-08-2011 |
20110215998 | PHYSICAL ACTION LANGUAGES FOR DISTRIBUTED TANGIBLE USER INTERFACE SYSTEMS - A system and a method are disclosed for a software configuration for use with distributed tangible user interfaces, in which the software is manipulated via a set of individual actions on individual objects, and in which such individual actions across one or more objects may be combined, simultaneously and/or over time, resulting in compound actions that manipulate the software. These actions and compound actions may be interpreted and acted upon by the software differently depending on its design, configuration, and internal state. | 09-08-2011 |
20110215999 | HAND-HELD ELECTRONIC DEVICE - A hand-held electronic device with a keyboard, thumbwheel, display and associated software is optimized for use of the device with the thumbs. The associated software has a plurality of features to optimize efficient use of the limited keyboard space and encourage the use of the device by thumb-based data entry through the thumbwheel and/or through a combination of minimal number of keystrokes. Software features include international character scrolling, and auto-capitalization. The keys on the device keyboard are optimally shaped and configured for thumb-based input. In addition, the thumbwheel is inclined between the front and a side edge of the device so as to be reachable by either the thumb or index finger of the user's hand at the side edge of the device. | 09-08-2011 |
20110221664 | VIEW NAVIGATION ON MOBILE DEVICE - Users may view web pages, play games, send emails, take photos, and perform other tasks using mobile devices. Unfortunately, the limited screen size and resolution of mobile devices may restrict users from adequately viewing virtual objects, such as maps, images, email, user interfaces, etc. Accordingly, one or more systems and/or techniques for displaying portions of virtual objects on a mobile device are disclosed herein. A mobile device may be configured with one or more sensors (e.g., a digital camera, an accelerometer, or a magnetometer) configured to detect motion of the mobile device (e.g., a pan, tilt, or forward/backward motion). A portion of a virtual object may be determined based upon the detected motion and displayed on the mobile device. For example, a view of a top portion of an email may be displayed on a cell phone based upon the user panning the cell phone in an upward direction. | 09-15-2011 |
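The motion-driven viewing in the entry above (panning the phone to scroll a large virtual object past a small screen) can be sketched as a clamped viewport over the object. The one-dimensional model and all names are illustrative assumptions:

```python
# Sketch of mapping a detected pan motion to a viewport shift over a
# virtual object that is larger than the screen, clamped to its bounds.

def pan_viewport(offset, delta, viewport, object_size):
    """Shift a 1-D viewport by a detected motion delta, kept in range."""
    new_offset = offset + delta
    return max(0, min(new_offset, object_size - viewport))
```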
20110221665 | REMOTE CONTROLLER AND CONTROL METHOD THEREOF, DISPLAY DEVICE AND CONTROL METHOD THEREOF, DISPLAY SYSTEM AND CONTROL METHOD THEREOF - A display system, a display device, and a remote controller for controlling the display device are provided. The remote controller includes: a touch screen which receives an input from a user and displays a first manipulation UI group including a shortcut key corresponding to a plurality of buttons to control the display device; a signal output unit which outputs a control signal to the display device based on an input to the touch screen; and a controller which, in response to a user's selection of the shortcut key, displays on the touch screen a second manipulation UI group and parts of the first manipulation UI group, the second manipulation UI group displaying the plurality of buttons. | 09-15-2011 |
20110221666 | Methods and Apparatus For Gesture Recognition Mode Control - Computing devices can comprise a processor and an imaging device. The processor can be configured to support both a mode where gestures are recognized and one or more other modes during which the computing device operates but does not recognize some or all available gestures. The processor can determine whether a gesture recognition mode is activated, use image data from the imaging device to identify a pattern of movement of an object in the space, and execute a command corresponding to the identified pattern of movement if the gesture recognition mode is activated. The processor can also be configured to enter or exit the gesture recognition mode based on various input events. | 09-15-2011 |
20110221667 | APPARATUS AND METHOD FOR SWITCHING SCREEN IN MOBILE TERMINAL - An apparatus and a method for switching an output screen direction of a mobile terminal from a vertical direction to a horizontal direction, or from the horizontal direction to the vertical direction when needed are provided. The apparatus includes a sensor unit and a rotation determining unit. The sensor unit detects a direction of a magnetic field. The rotation determining unit determines a movement of the mobile terminal that rotates on a plane through the direction of the magnetic field detected by the sensor unit. | 09-15-2011 |
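The rotation detection in the entry above (sensing the magnetic-field direction to tell that a flat-lying terminal has been turned) can be sketched by deriving a heading from the magnetometer's in-plane components and triggering a switch when the heading changes enough. Names and the 60-degree trigger margin are assumptions:

```python
# Sketch of screen-switching from magnetometer data: the heading computed
# from the x/y field components changes as the terminal rotates on a
# plane; a large enough change triggers the orientation switch.
import math

def heading_deg(mx, my):
    return math.degrees(math.atan2(my, mx)) % 360.0

def should_switch_orientation(old_heading, new_heading, margin=60.0):
    diff = abs(new_heading - old_heading) % 360.0
    diff = min(diff, 360.0 - diff)       # shortest angular distance
    return diff >= margin                # rotated far enough to switch
```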
20110221668 | PARTIAL VIRTUAL KEYBOARD OBSTRUCTION REMOVAL IN AN AUGMENTED REALITY EYEPIECE - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content. The displayed content is an interactive control element. An integrated camera facility of the eyepiece images a user's body part as it interacts with the interactive control element, wherein the processor removes a portion of the interactive control element by subtracting the portion of the interactive control element that is determined to be co-located with the imaged user body part based on the user's view. | 09-15-2011 |
20110221669 | GESTURE CONTROL IN AN AUGMENTED REALITY EYEPIECE - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content. The eyepiece includes an integrated camera facility that images a gesture, wherein the integrated processor identifies and interprets the gesture as a command instruction. | 09-15-2011 |
20110221670 | METHOD AND APPARATUS FOR VISUAL BIOMETRIC DATA CAPTURE - A method and apparatus for visual biometric data capture are provided. The apparatus includes an interactive head-mounted eyepiece worn by a user that includes an optical assembly through which a user views a surrounding environment and displayed content. The optical assembly comprises a corrective element that corrects the user's view of the surrounding environment and an integrated processor for handling content to the user. An integrated optical sensor captures visual biometric data when the eyepiece is positioned so that a nearby individual is proximate to the eyepiece. Visual biometric data is captured using the eyepiece and is transmitted to a remote processing facility for interpretation. The remote processing facility interprets the captured visual biometric data and generates display content based on the interpretation. This display content is delivered to the eyepiece and displayed to the user. | 09-15-2011 |
20110221671 | METHOD AND APPARATUS FOR AUDIO BIOMETRIC DATA CAPTURE - A method and apparatus for audio biometric data capture are provided. The apparatus includes an interactive head-mounted eyepiece worn by a user that includes an optical assembly through which a user views a surrounding environment and displayed content. The optical assembly comprises a corrective element that corrects the user's view of the surrounding environment and an integrated processor for handling content to the user. An integrated optical sensor captures visual biometric data when the eyepiece is positioned so that a nearby individual is proximate to the eyepiece. Audio biometric data is captured using multiple microphones mounted in an endfire array in the eyepiece. The remote processing facility interprets the captured audio biometric data and generates display content based on the interpretation. This display content is delivered to the eyepiece and displayed to the user. | 09-15-2011 |
20110221672 | HAND-WORN CONTROL DEVICE IN AN AUGMENTED REALITY EYEPIECE - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content. A control device worn on a hand of a user of the eyepiece includes a control component that when actuated by a digit of the hand of the user, provides a control command from the actuation of the control component to the processor as a command instruction. | 09-15-2011 |
20110221673 | SYSTEM AND METHOD FOR MONITORING A MOBILE COMPUTING PRODUCT/ARRANGEMENT - Described is a system and method for monitoring a mobile computing arrangement. The arrangement may include a sensor and a processor. The sensor detects first data of an event including a directional orientation and a motion of the arrangement. The processor compares the first data to second data to determine if at least one predetermined procedure is to be executed. The second data may include a predetermined threshold range of changes in the directional orientation and the motion. If the predetermined procedure is to be executed, the processor selects the predetermined procedure which corresponds to the event as a function of the first data. Subsequently, the predetermined procedure is executed. | 09-15-2011 |
20110227819 | INTERACTIVE THREE-DIMENSIONAL DISPLAY SYSTEM AND METHOD OF CALCULATING DISTANCE - An interactive three-dimensional display system includes a three-dimensional display panel which has an optical sensor array, an interactive device which includes a projection light source and a shadow mask, and an image recognizing unit. The shadow mask has a pattern to define an image projected by the interactive device. The image is captured by the optical sensor array. The pattern includes two strip patterns which cross each other. The image includes two strip images which cross each other. The image recognizing unit is electrically connected with the optical sensor array and calculates relative positions of the interactive device and the three-dimensional display panel according to the image. A method of calculating the relative positions includes calculating according to the lengths of one of the strip patterns and one of the strip images, and a divergent angle and tilt angle of the projection light source. | 09-22-2011 |
20110227820 | LOCK VIRTUAL KEYBOARD POSITION IN AN AUGMENTED REALITY EYEPIECE - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content. The displayed content is an interactive control element. An integrated camera facility of the eyepiece images the surrounding environment, and identifies a user hand gesture as an interactive control element location command, wherein the location of the interactive control element remains fixed with respect to an object in the surrounding environment, in response to the interactive control element location command, regardless of a change in the viewing direction of the user. | 09-22-2011 |
20110227821 | ELECTRONIC BOOK WITH BUILT-IN CARD SCANNER - An electronic book includes a housing having a first body section and a second body section pivotable with respect to each other, the first and second body sections together defining a pair of internal faces and a pair of external faces; a spine member coupling the first and second body sections, the spine member defining a longitudinal cavity therethrough and a longitudinal slot providing entry into the longitudinal cavity; a card slot provided on one of the external faces; a flexible display screen extending across both internal faces; microprocessor circuitry positioned in the housing between the card slot and the display screen; and a card scanner mounted in the housing between the microprocessor circuitry and the card slot, the card scanner facing away from the display screen and configured to scan a card inserted into the card slot and to convert a two-dimensional pattern on the card into data signals. In a pivoted position where the pair of internal faces oppose each other, a middle portion of the flexible display screen extends into the longitudinal cavity of the spine member via the longitudinal slot to form a loop. | 09-22-2011 |
20110227822 | FLEXIBLE DEVICES AND RELATED METHODS OF USE - Disclosed are devices which include flexible display sheets or other flexible elements, whereby interaction with the device may be facilitated by physically manipulating the flexible display sheets or elements. Flexibility features may be employed to provide methods of interaction which include manipulating a flexible section. Some of these methods of interaction are disclosed. Further disclosed are units which can connect to flexible display sheets, for interacting with the flexible display sheets. | 09-22-2011 |
20110234480 | AUDIO PREVIEW OF MUSIC - Systems, methods, and machine-readable media are disclosed for providing an audio preview of songs and other audio elements. In some embodiments, an electronic device may operate in either a “play mode,” which allows a user to listen to songs in a normal fashion, or in a “preview mode,” which may be used to provide previews of songs to a user in succession. In some embodiments, the electronic device may seamlessly transition between play mode and preview mode. For example, the electronic device may pause a currently-playing song when the mode of operation switches from play mode to preview mode so that, if the user exits out of preview mode, the original song may be resumed from the pause point. In some embodiments, the electronic device may provide a multi-directional visual interface that allows a user to control the succession of previews provided in preview mode. | 09-29-2011 |
20110234481 | ENHANCING PRESENTATIONS USING DEPTH SENSING CAMERAS - A depth camera and an optional visual camera are used in conjunction with a computing device and projector to display a presentation and automatically correct the geometry of the projected presentation. Interaction with the presentation (switching slides, pointing, etc.) is achieved by utilizing gesture recognition/human tracking based on the output of the depth camera and (optionally) the visual camera. Additionally, the output of the depth camera and/or visual camera can be used to detect occlusions between the projector and the screen (or other target area) in order to adjust the presentation to not project on the occlusion and, optionally, reorganize the presentation to avoid the occlusion. | 09-29-2011 |
20110234482 | TECHNIQUES FOR INTERPRETING SIGNALS FROM COMPUTER INPUT DEVICES - The present invention features determining, from a plurality of actions, an action event corresponding to multiple segments of input data received from a computer input device, with the corresponding action defined by the order in which the multiple segments of input are received. Access is provided to the event through a program interface. Also disclosed is a system that carries out the functions of the method, as well as a computer-program product that includes computer-readable instructions causing a processor of a computer system to carry out the functions of the method. | 09-29-2011 |
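As a rough illustration of the order-dependent mapping this entry describes, a minimal Python sketch could resolve an ordered sequence of input segments to an action event through a lookup table. The segment names and the action table below are hypothetical examples, not taken from the patent:

```python
# Hypothetical table: the ordered tuple of segments defines the action.
ACTIONS = {
    ("press", "move", "release"): "drag",
    ("press", "release"): "click",
    ("press", "release", "press", "release"): "double_click",
}

def action_event(segments):
    """Return the action event for an ordered list of input segments.

    Order matters: ("press", "move", "release") is a drag, while the
    same segments in another order would not be. Unknown sequences
    yield None.
    """
    return ACTIONS.get(tuple(segments))
```

A program interface exposing `action_event` would let applications consume high-level events instead of raw device segments.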
20110234483 | GAME CONTROLLER GLOVE - A game controller glove includes a main body, a MEMS, and a power supply. The main body includes five finger portions for receiving the fingers of a game player's hand. The MEMS includes finger-movement sensors, MEMS sensors, and a processor. The finger-movement sensors are positioned at the finger portions and used for detecting the movements of the fingers. The MEMS sensors are connected to the corresponding finger-movement sensors and used for sensing pressures applied by the corresponding finger-movement sensors and converting the pressures into electrical signals. The processor is electrically connected to the MEMS sensors and used for obtaining the electrical signals and restoring them to pressure values. The power supply is used for supplying electrical power to the processor. | 09-29-2011 |
20110234484 | OPERATION INPUT UNIT AND MANIPULATOR SYSTEM - Unintended motion of a displayed object is prevented by rapidly detecting a state in which an operator becomes unable to operate an operating unit. Provided is an operation input unit including a display; a head-mounted unit; an operating unit to which an operating signal for a displayed object displayed on the display is input; a relative-position detecting section that detects the relative position between the head-mounted unit and the operating unit; and a control unit that controls the displayed object by switching between a first control mode in which the motion of the displayed object is controlled in accordance with an operating signal input to the operating unit and a second control mode in which the motion of the displayed object is controlled by limiting an operating signal input to the operating unit on the basis of the relative position detected by the relative-position detecting section. | 09-29-2011 |
20110234485 | POSITION DETECTING DEVICE AND POSITION INPUT DEVICE - A position detecting device is provided, which is configured to minimize leakage of magnetic flux in an electromagnetic induction system. The position detecting device includes: a sensor unit including a plurality of first loop coils arranged in a first direction and a plurality of second loop coils arranged in a second direction intersecting with the first direction; a yoke sheet provided on a side of the sensor unit that is opposite to a side that faces a position indicator; an auxiliary loop coil provided at a corner part of the sensor unit; a signal transmitter configured to transmit a signal to one of the coils in order to generate a magnetic field to induce an induced current in a coil of the position indicator; and a controller configured to select one of the coils, and to control whether to transmit a signal from the signal transmitter to the selected one of the coils or to make the selected one of the coils receive a signal from the position indicator. | 09-29-2011 |
20110234486 | Switching device and switching methods of the same - A switching device that selectively changes a computer to be operated from among multiple computers, including a control unit that detects a cursor position on the computer to be operated based on coordinate data and the computer resolution of the computer to be operated, the coordinate data being generated by performing the same acceleration process as the computer to be operated on relative coordinate data acquired from a given pointing device, the control unit selectively changing the computer to be operated according to the cursor position. It is thus possible to selectively change the computer to be operated without any dedicated software or a given space for manipulation. | 09-29-2011 |
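The cursor-position-driven switching above can be sketched as a mapping from a position on a virtual combined desktop to a computer index. The side-by-side layout and the `widths` parameter are my own assumptions for illustration; the abstract only states that the switch is driven by the computed cursor position and each computer's resolution:

```python
def target_computer(cursor_x, widths):
    """Map a cursor x-position on a virtual side-by-side desktop to the
    index of the computer whose screen contains it.

    `widths` lists each computer's horizontal resolution from left to
    right (a hypothetical layout). Positions past the last edge clamp
    to the last computer.
    """
    edge = 0
    for index, width in enumerate(widths):
        edge += width
        if cursor_x < edge:
            return index
    return len(widths) - 1
```

When the cursor crosses a screen edge, the returned index changes and the device would route keyboard and mouse input to the newly selected computer.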
20110234487 | PORTABLE TERMINAL DEVICE AND KEY ARRANGEMENT CONTROL METHOD - A hold position detection unit for detecting a position held by an operator's hand is provided in at least both side portions of a terminal main body. A display screen of a key group displayed on an operation display part is then changed based on hold data indicating the position held by the operator's hand, as detected by the hold position detection unit. | 09-29-2011 |
20110234488 | PORTABLE ENGINE FOR ENTERTAINMENT, EDUCATION, OR COMMUNICATION - To simplify human-machine interaction, a portable interaction module includes multiple channels through which input is received. Different types of input mechanisms or sensors allow use of multiple techniques for capturing input, such as motion sensing, audio sensing, image tracking, image sensing, or physiological sensing. A fusion module included in the portable interaction module receives data from the input mechanisms or sensors and generates an input description identifying which input mechanisms or sensors receive data. The input description is communicated to a target device, which determines an output corresponding to the input description. Using multiple input capture techniques simplifies interaction with the target device by providing a variety of methods for obtaining input. | 09-29-2011 |
20110234489 | GRAPHICAL REPRESENTATIONS - There is provided a controller for a display device, the controller comprising a processor configured to receive an indication of the orientation of a user of the display device; receive an indication of the orientation of the display device; and generate a graphical representation of a user of the display device using the received indications, such that the orientation of the graphical representation of the user is adapted based on the orientation of the display device. | 09-29-2011 |
20110234490 | Predictive Determination - Systems, methods and computer readable media are disclosed for a gesture recognizer system architecture. A recognizer engine is provided, which receives user motion data and provides that data to a plurality of filters. A filter corresponds to a gesture, that may then be tuned by an application receiving information from the gesture recognizer so that the specific parameters of the gesture—such as an arm acceleration for a throwing gesture—may be set on a per-application level, or multiple times within a single application. Each filter may output to an application using it a confidence level that the corresponding gesture occurred, as well as further details about the user motion data. | 09-29-2011 |
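The recognizer-engine architecture above — one filter per gesture, with per-application tunable parameters and a confidence-level output — could be sketched as follows. The class names, the single `min_speed` parameter, and the toy scoring rule are all illustrative assumptions, not the patent's actual design:

```python
class GestureFilter:
    """One filter per gesture; its parameters can be tuned per application."""

    def __init__(self, name, min_speed):
        self.name = name
        self.min_speed = min_speed  # e.g. a tuned arm-speed threshold

    def confidence(self, motion):
        # Toy scoring: confidence grows with how close the observed
        # peak speed comes to the tuned threshold, capped at 1.0.
        peak = max(motion)
        return min(peak / self.min_speed, 1.0)


class RecognizerEngine:
    """Feeds user motion data to every registered gesture filter."""

    def __init__(self):
        self.filters = []

    def add_filter(self, gesture_filter):
        self.filters.append(gesture_filter)

    def recognize(self, motion):
        # Report a confidence level for every registered gesture.
        return {f.name: f.confidence(motion) for f in self.filters}
```

An application could register the same gesture twice with different parameters, matching the abstract's note that a filter may be tuned multiple times within a single application.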
20110241981 | Input Routing for Simultaneous USB Connections of Similar Device Types - Methods and devices for accommodating a plurality of interface devices via a Universal Serial Bus (USB) that include: (a) receiving a set of settings for an interface device; (b) generating an input/output (I/O) device handle associated with the interface device; (c) comparing the received interface device settings set with one or more entries of a device-matching criteria database; and (d) if the received set of interface device settings matches an entry of the device matching criteria database, then: (i) generating a device manager handle associated with the interface device; and (ii) spawning a device manager thread based on the generated I/O device handle. | 10-06-2011 |
20110241982 | ELECTRONIC DEVICE CAPABLE OF AUTOMATICALLY ADJUSTING FILE DISPLAYED ON DISPLAY AND METHOD THEREOF - An adjusting method for adjusting a file includes determining a position of a first point of focus on the display and recording the content displayed at the position of the first point of focus when the electronic device is in a steady state, determining a position of a second point of focus according to the movement of the electronic device and the position of the first point of focus on the display, and adjusting the content by moving the content displayed at the position of the first point of focus to the position of the second point of focus. An electronic device for implementing the adjusting method is also provided. | 10-06-2011 |
20110241983 | DISPLAY DEVICE WITH PRIVACY FUNCTION, AND METHOD FOR PROVIDING PRIVACY TO A DISPLAY DEVICE - A method is adapted for providing privacy to a display device. The display device is adapted to display a normal image that can be viewed in a user area within which a user is situated. The method includes the steps of: a) configuring the display device to generate information of an interference image; b) configuring a display panel of the display device to display the normal image and the interference image; and c) configuring the display device to direct the interference image toward areas outside the user area. Accordingly, the user situated in the user area is able to view the normal image, while a viewer outside the user area is able to view the interference image, thereby achieving a privacy effect. | 10-06-2011 |
20110248910 | METHOD FOR COMPARING TWO CONTINUOUS COMPUTER GENERATED LINES GENERATED BY THE MOVEMENTS OF A COMPUTER MOUSE OR A DIGITIZER TABLET - A method is provided for comparing two continuous computer-generated lines by using a combination of time, area, velocity, and angular velocity, with a weighted distribution of 10 for time, 40 for area, 40 for velocity, and 10 for angular velocity. The method returns true if the sum of the products of the weighted distribution with the percentage difference for time, area, velocity, and angular velocity is greater than or equal to 80; otherwise, the method returns false. | 10-13-2011 |
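The weighted comparison described in this entry can be sketched directly from the stated weights and threshold. The abstract does not define how each raw factor is normalized, so the per-factor scores below are assumed to be agreement percentages in [0.0, 1.0]:

```python
def lines_match(time_sim, area_sim, vel_sim, ang_vel_sim):
    """Return True when the weighted score reaches the threshold of 80.

    Each *_sim argument is a hypothetical per-factor agreement score in
    [0.0, 1.0]; the weights (10, 40, 40, 10) and the threshold are
    taken from the abstract, the normalization step is assumed.
    """
    WEIGHTS = (10, 40, 40, 10)  # time, area, velocity, angular velocity
    THRESHOLD = 80              # out of a maximum score of 100
    factors = (time_sim, area_sim, vel_sim, ang_vel_sim)
    score = sum(w * f for w, f in zip(WEIGHTS, factors))
    return score >= THRESHOLD
```

With this weighting, the area and velocity factors dominate: two lines that agree only on time and angular velocity can never reach the threshold.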
20110248911 | STEREOSCOPIC IMAGE DISPLAY SYSTEM AND METHOD OF CONTROLLING THE SAME - The present invention discloses a stereoscopic image display system and a method of controlling the same. An eye tracking module locates current 3D spatial positions of the viewer's eyes, and generates the information of both left and right eyes' current 3D spatial positions. A control module controls a display device that can alter the direction of the light outputted, and outputs images on the display device in time multiplex mode. The light containing the left eye image is outputted to the position of left eye instead of right eye at one time point, and the light containing the right eye image is outputted to the position of right eye instead of left eye at another time point, so that a stereoscopic image is perceived according to the parallax theory. The present invention enlarges the visual range of stereoscopic image and achieves a better stereoscopic image visual experience for viewers. | 10-13-2011 |
20110248912 | DISPLAY DEVICE HAVING THE CONNECTION INTERFACE WITH ILLUMINING AND INDICATING FUNCTIONS - A display device includes a connection interface and a light source. The light source illuminates the connection interface and indicates I/O ports of the connection interface. The light source can illuminate all I/O ports of the connection interface or the I/O port of the connection interface corresponding to the input signal source selected by the user. | 10-13-2011 |
20110248913 | MOTIONBEAM INTERACTION TECHNIQUES FOR HANDHELD PROJECTORS - An image projection system may be configured to project objects which respond to movements and gestures made using a handheld projector; methods for controlling the projected objects based on such user input are also disclosed. For example, users may interact with and control objects in a projection frame by moving and/or gesturing with the handheld projector. Further, objects or characters projected using the handheld projector may be configured to perceive and react to physical objects in the environment. Similarly, elements of the physical environment may be configured to respond to the presence of the projected objects or characters in a variety of ways. | 10-13-2011 |
20110248914 | System and Method for Virtual Touch Typing - Systems, methods, and products are described for enabling a user to enter data into a device without a keyboard. A virtual keyboard in accordance with the invention may include a sensor to detect actual or intended finger movements or other changes in the user's physiology, an element that generates a sequence of ambiguous pseudo-words based on the physiological changes, and a translator that translates the pseudo-words into words in a natural language and provides the natural language words to the device or to a data storage unit. The device typically may be a computer, electronic notepad, personal digital assistant, telephone, or other electronic device. | 10-13-2011 |
20110248915 | METHOD AND APPARATUS FOR PROVIDING MOTION LIBRARY - A method and an apparatus for providing a motion library, adapted to a service end device to provide a customized motion library supporting recognition of at least one motion pattern for a user end device. At least one sensing component disposed on the user end device is determined. At least one motion group is determined according to the determined sensing components, wherein each motion group comprises at least one motion pattern. The at least one motion pattern is selected, a motion database is queried to display a list of the motion groups corresponding to the selected motion patterns, and the motion groups are selected from the list. The motion patterns belonging to the motion groups are selected to re-compile the customized motion library, which is provided for the user end device, so as to enable the user end device to recognize the selected motion patterns. | 10-13-2011 |
20110254760 | Wireless Motion Processing Sensor Systems Suitable for Mobile and Battery Operation - The present invention relates to a combination of a 6-axis motion sensor having a 3-axis gyroscope and a 3-axis linear accelerometer, a motion processor and a radio integrated circuit chip (IC), wherein the intelligence in the motion processor enables the communication between the motion sensor, the radio IC and the external network. The motion processor also enables power savings by adaptively controlling the data rate of the motion sensor, depending on the amount or speed of the motion activity. | 10-20-2011 |
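The adaptive power saving described above — the motion processor lowering the sensor data rate when little is happening — could be sketched as a simple rate ladder keyed to motion magnitude. The thresholds and the specific rates below are illustrative assumptions; the abstract only says the rate adapts to the amount or speed of motion:

```python
def select_sample_rate_hz(accel_magnitude_g, low=0.05, high=0.5):
    """Pick a motion-sensor output data rate from recent motion magnitude.

    `accel_magnitude_g` is the magnitude of recent acceleration in g;
    the `low`/`high` thresholds and the 10/50/200 Hz ladder are
    hypothetical tuning values, not taken from the patent.
    """
    magnitude = abs(accel_magnitude_g)
    if magnitude < low:
        return 10    # near-still: sample slowly to save power
    if magnitude < high:
        return 50    # moderate motion
    return 200       # fast motion: full rate for fidelity
```

Dropping the data rate during stillness saves power twice over in a wireless design: the sensor samples less and the radio transmits less.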
20110254761 | OPTICAL NAVIGATION DEVICES - An optical navigation device, such as that used on a computer or mobile communications device, includes a radiation source capable of producing a beam of radiation. A sensor receives an image. An optical element identifies movement of an elastic object on a first surface to thereby enable a control action to be carried out. The device further determines the relative pressure on a first surface by an elastic object based upon the value of an optical parameter, such as the average radiation intensity of the image received at the sensor. The device may be arranged to operate as a push button or a linear pressure sensor. | 10-20-2011 |
20110254762 | MACHINE INTERFACES - Apparatus for determining the movement of an object comprises a plurality of ultrasonic transducers | 10-20-2011 |
20110260961 | SERIES CONNECTED ELECTROCHROMIC DEVICES - An electrochromic device includes a first electrochromic region interconnected with a second electrochromic region by a plurality of conductive links disposed between sides of a substrate on which the material layers of the electrochromic device are formed. The plurality of conductive links interconnects a first isolated conductive region of the first electrochromic region with a first isolated conductive region of the second electrochromic region. A sequence of a counter electrode layer, an ion conductor layer and an electrochromic layer is sandwiched between the first conductive regions of the first and second electrochromic regions and respective second isolated conductive regions of the first and second electrochromic regions. The second conductive regions of the first and second electrochromic regions are connected to respective first and second bus bars which are for connection to a low voltage electrical source. | 10-27-2011 |
20110260962 | INTERFACING WITH A COMPUTING APPLICATION USING A MULTI-DIGIT SENSOR - A technology is described for interfacing with a computing application using a multi-digit sensor. A method may include obtaining an initial stroke using a single digit of a user on the multi-digit sensor. A direction change point for the initial stroke can be identified. At the direction change point for the initial stroke, a number of additional digits can be presented by the user to the multi-digit sensor. Then a completion stroke can be identified as being made with the number of additional digits. A user interface signal can be sent to the computing application based on the number of additional digits used in the completion stroke. In another configuration of the technology, the touch stroke or gesture may include a single stroke where user interface items can be selected when additional digits are presented at the end of a gesture. | 10-27-2011 |
20110260963 | SYMBOLIC INPUT VIA MID-AIR FINGER/THUMB MOTIONS - An information system includes primary inductance coils driven by an energy source, secondary inductance coils, a processor, a display device, and a lookup table. The processor determines an inductance value generated when coils are brought into proximal interaction with each other. The processor extracts associated alphanumeric or other symbolic information from the lookup table, and transmits the symbolic information to the display device. A circuit is also provided that generates the symbolic information for presentation via the display device, and includes an electrically-driven inductance coil positionable on a thumb of a user, passively-driven inductance coils positionable on the various phalanges of the user's fingers, the processor, and the lookup table. A method for generating and recording symbolic information includes determining the inductance value, associating the inductance value with corresponding symbolic information in the lookup table, and transmitting the symbolic information to a display device. | 10-27-2011 |
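The inductance-to-symbol lookup at the heart of this entry could be sketched as a nearest-entry search with a tolerance, since a measured value will never exactly equal a nominal table entry. The units, the table contents, and the tolerance parameter are hypothetical:

```python
def closest_symbol(measured_uH, table, tolerance_uH=0.5):
    """Map a measured inductance to a symbol via nearest-entry lookup.

    `table` maps nominal inductance values (microhenries, a
    hypothetical unit choice) to symbols. Returns None when no entry
    is within `tolerance_uH` of the measurement.
    """
    best = min(table, key=lambda nominal: abs(nominal - measured_uH))
    if abs(best - measured_uH) <= tolerance_uH:
        return table[best]
    return None
```

Each thumb-to-phalange pairing would produce a distinct inductance, so the table effectively assigns one symbol per finger segment touched.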
20110260964 | METHOD AND APPARATUS FOR CONTROLLING A DISPLAY TO GENERATE NOTIFICATIONS - The present specification provides a method and apparatus for controlling a display based on signals received from one or more input devices. In one implementation, a mobile device with a touch screen and a touch pad is provided. A notification module executable on the mobile device configures the processor of the mobile device to control the display to generate a notification bar and a content region. The notification bar contains an icon representing each application from which a notification has been generated and a number adjacent the icon for indicating how many notifications have been generated by the application. The content region includes data associated with the notifications, and is arranged in rows beneath a header identifying the application. The layout of the applications may be varied, as well as the priority of ordering the applications in the content region. | 10-27-2011 |
20110260965 | APPARATUS AND METHOD OF USER INTERFACE FOR MANIPULATING MULTIMEDIA CONTENTS IN VEHICLE - Provided are an apparatus and a method of a user interface for manipulating multimedia contents for a vehicle. An apparatus of a user interface for manipulating multimedia contents for a vehicle according to an embodiment of the present invention includes: a transparent display module displaying an image including one or more multimedia objects; an ultrasonic detection module detecting a user indicating means by using an ultrasonic sensor in a 3D space close to the transparent display module; an image detection module tracking and photographing the user indicating means; and a head unit judging whether or not any one of the multimedia objects is selected by the user indicating means by using information received from at least one of the image detection module and the ultrasonic detection module and performing a control corresponding to the selected multimedia object. | 10-27-2011 |
20110260966 | Fingerprint reader device and electronic apparatus - The present invention relates to a slide-type fingerprint reader device in which a finger is slid on a fingerprint sensor. A positioning structure that causes the first joint portion of a finger to be positioned at the center point of the fingerprint sensor includes a sensor movement mechanism, which keeps the fingerprint sensor at a position bulging from a slide surface on which the finger is slid when the fingerprint sensor is not pressed down, and which allows the fingerprint sensor to move upon being pressed down. | 10-27-2011 |
20110260967 | HEAD MOUNTED DISPLAY - In a head mounted display, when it is determined that a hand of a user is in a field angle, the head mounted display starts the playing of main manual information. When a standard time elapses after the playing of the main manual information starts, without a determination that the hand of the user has left the field angle, the playing of the main manual information is switched to the playing of sub manual information. When it is determined that the hand of the user is not included in the field angle while the playing of the main manual information or the sub manual information is underway, the playing underway is finished. | 10-27-2011 |
20110267258 | IMAGE BASED MOTION GESTURE RECOGNITION METHOD AND SYSTEM THEREOF - An image based motion gesture recognition method and system thereof are disclosed. In an embodiment, a hand posture detection is performed according to the received image frames to obtain a first hand posture. It is then determined whether the first hand posture matches a predefined starting posture or not. If the first hand posture matches said predefined starting posture, movement tracking is performed according to hand locations on image frames to obtain a motion gesture. During said movement tracking, the hand posture detection is performed according to said image frames to obtain a second hand posture, and it is determined whether the second hand posture matches a predefined ending posture. If the second hand posture matches the predefined ending posture, the movement tracking is stopped. Therefore, the complexity of motion gesture recognition can be reduced and the reliability of interaction can be improved. | 11-03-2011 |
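The posture-delimited tracking above is essentially a two-state machine: idle until the starting posture appears, then accumulating hand locations until the ending posture appears. A minimal sketch, with hypothetical posture labels and a per-frame `feed` interface of my own devising:

```python
class MotionGestureTracker:
    """Track hand locations only between a start and an end posture."""

    IDLE, TRACKING = range(2)

    def __init__(self, start_posture, end_posture):
        self.start_posture = start_posture
        self.end_posture = end_posture
        self.state = self.IDLE
        self.path = []

    def feed(self, posture, location):
        """Process one frame; return the finished path, or None."""
        if self.state == self.IDLE:
            if posture == self.start_posture:
                # Starting posture seen: begin movement tracking.
                self.state = self.TRACKING
                self.path = [location]
        else:
            if posture == self.end_posture:
                # Ending posture seen: stop tracking, emit the gesture.
                self.state = self.IDLE
                done, self.path = self.path, []
                return done
            self.path.append(location)
        return None
```

Gating the tracker on explicit start/end postures is what keeps the recognition cheap: between gestures, no path accumulation or matching happens at all.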
20110267259 | RESHAPABLE CONNECTOR WITH VARIABLE RIGIDITY - An accessory is disclosed for use in a human-computer interface gaming or other application. The accessory can be held or otherwise interacted with by a user, where the accessory is sensed and displayed as a virtual object on a display. The virtual representation of the accessory may be an accurate recreation of the accessory, or it may be displayed as a virtual object appropriate to the gaming or other application. The real world accessory is flexible and may be morphed into a variety of forms to better suit a wide array of uses. This accessory may also serve as a platform upon which other accessories can be mounted to enhance user experience. | 11-03-2011 |
20110267260 | INTERACTIVE DISPLAY APPARATUS AND OPERATING METHOD THEREOF - Provided are an interactive display apparatus and method for operating an interactive display apparatus, the apparatus including: a capturing unit operable to capture an image including a displayed image which is displayed on a display screen and a light of a pointing device irradiated onto the display screen; a memory which stores the captured image; a detection unit which determines coordinate information for the light of the pointing device irradiated onto the display screen based on the captured image; and a controller which determines whether the light of the pointing device is moved outside of the displayed image based on the coordinate information, and controls the apparatus to perform a predetermined operation from among a plurality of predetermined operations if the light of the pointing device is moved outside of the displayed image. | 11-03-2011 |
20110267261 | INPUT DEVICE AND CONTROL METHOD OF THE SAME - Disclosed are an input device, a method of controlling the input device and a system including the input device and a display device. The input device may include: a communication unit operable to communicate with a display device; a sensor operable to sense a first signal containing noise and a scan signal generated from the display device, and a second signal containing the noise; a noise eliminator operable to compare the first signal and the second signal; and a controller operable to control the communication unit to output position information of the input device, wherein the position information is based on the comparing of the first signal and the second signal. | 11-03-2011 |
20110267262 | Laser Scanning Projector Device for Interactive Screen Applications - One embodiment of the device comprising: (i) a laser scanning projector that projects light on a diffusing surface illuminated by the scanning projector; (ii) at least one detector that detects, as a function of time, the light scattered by the diffusing surface and by at least one object entering the area illuminated by the scanning projector; and (iii) an electronic device capable of (a) reconstructing, from the detector signal, an image of the object and of the diffusing surface and (b) determining variation of the distance between the object and the diffusing surface. | 11-03-2011 |
20110267263 | CHANGING INPUT TOLERANCES BASED ON DEVICE MOVEMENT - Movement of a device is detected using at least one sensor. In response to the detected movement, at least one value is altered to make it easier for a user to select an object on a display. | 11-03-2011 |
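The tolerance change described above — altering a value so targets are easier to select while the device moves — could be sketched as a hit radius that grows with motion magnitude. The linear gain and the cap are illustrative assumptions; the abstract only states that at least one value is altered in response to detected movement:

```python
def hit_radius_px(base_radius_px, motion_magnitude, gain=20.0, cap=3.0):
    """Grow a target's selectable radius while the device is moving.

    `motion_magnitude` is a hypothetical normalized sensor reading
    (0.0 = still); the radius scales linearly with it up to `cap`
    times the base radius, so shaky input still lands on the target.
    """
    scale = min(1.0 + gain * motion_magnitude, cap)
    return base_radius_px * scale
```

When the device is still the radius is unchanged, so precision is not sacrificed in the stationary case.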
20110273368 | Extending Digital Artifacts Through An Interactive Surface - A unique system and method that facilitates extending input/output capabilities for resource deficient mobile devices and interactions between multiple heterogeneous devices is provided. The system and method involve an interactive surface to which the desired mobile devices can be connected. The interactive surface can provide an enhanced display space and customization controls for mobile devices that lack adequate displays and input capabilities. In addition, the interactive surface can be employed to permit communication and interaction between multiple mobile devices that otherwise are unable to interact with each other. When connected to the interactive surface, the mobile devices can share information, view information from their respective devices, and store information to the interactive surface. Furthermore, the interactive surface can resume activity states of mobile devices that were previously communicating upon re-connection to the surface. | 11-10-2011 |
20110279359 | SYSTEMS AND METHODS FOR MONITORING MOTION SENSOR SIGNALS AND ADJUSTING INTERACTION MODES - Described herein are systems and methods for recognizing when a user of an interactive application is frustrated and for responding to the user's frustration by changing an interaction mode. Signals arising from motion sensors included in user equipment are monitored for patterns indicative of user frustration. In response to detecting a frustration pattern in a motion sensor signal, an interactive application display is changed. | 11-17-2011 |
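One frustration pattern the motion-sensor monitoring above might look for is a rapid back-and-forth shake. The heuristic below — counting large sign reversals in an acceleration stream — and both of its parameters are my own illustrative assumptions, not the patent's detection method:

```python
def looks_frustrated(accel_samples, shake_threshold=2.0, min_reversals=4):
    """Flag a rapid back-and-forth shake as a possible frustration sign.

    Keeps only samples whose magnitude exceeds `shake_threshold` (a
    hypothetical value in g), then counts sign reversals between
    consecutive strong samples; many reversals suggest a shake.
    """
    strong = [s for s in accel_samples if abs(s) >= shake_threshold]
    reversals = sum(
        1 for a, b in zip(strong, strong[1:]) if (a > 0) != (b > 0)
    )
    return reversals >= min_reversals
```

On a positive detection, the application would switch interaction modes, for example by simplifying the display or offering help.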
20110279360 | IMAGE DISPLAY UNIT AND IMAGE FORMING APPARATUS INCLUDING THE SAME - An image display unit capable of displaying image data page-wise includes: a scanner portion; a display panel for displaying input image data in preview representation; an input condition determiner that compares the image data successively input through the scanner portion, as to input condition and determines whether there is any change in the image data input condition; and a display controller that, when the input condition determiner determines that the input image data has changed in the input condition, makes control such as to display the image data that was determined to have changed in the input condition and the image data input immediately before the image data in question, together on the display panel. | 11-17-2011 |
20110279361 | OPTICAL DETECTION DEVICE, DISPLAY DEVICE, AND ELECTRONIC APPARATUS - An optical detection device includes: a light source unit that emits source light; a curve-shaped light guide that includes: a light incident surface to which the source light is incident, the light incident surface being located in an end portion of the light guide; and a convex surface from which the source light received by the light incident surface is output; an emitting direction setting unit that receives the source light output from the convex surface of the light guide and sets an emitting direction of emitting light to a direction of a normal line of the convex surface; a light receiving unit that receives reflection light acquired by reflecting the emitting light off an object; and a detection unit that detects at least a direction in which the object is located based on the light reception in the light receiving unit. | 11-17-2011 |
20110279362 | DISPLAY DEVICE AND DISPLAY METHOD - A display device and the like which can more flexibly provide an image showing the status of a past presentation are provided. A display device includes an image generating section generating an instruction image which reflects an instruction content based on presentation data obtained by relating image data showing a displayed image to display time data showing a display time of the displayed image and based on instruction information showing the instruction content, a display section displaying the instruction image, and an updating section updating the presentation data based on the instruction. The image generating section generates a reproduction target time specifying image including a time region which changes as time passes and a specifying region which moves on the time region according to an instruction position and shows a reproduction target time, and the display section displays the reproduction target time specifying image. | 11-17-2011 |
20110279363 | IMAGE FORMING APPARATUS AND DISPLAY CONSOLE DISPLAYING PREVIEW IMAGE - In order to provide a technique allowing easy confirmation of image contents even if the number of images increases, a display console includes a display device having an image displaying function and a display control unit controlling the display by dividing a display screen of the display device into an image preview area and another area. The display control unit switches, in accordance with a user instruction, between a fit-to-screen screen image in which the area ratio between the preview area and the other area has a first value, and a finish preview screen image, an image editing mode screen image, or a document display mode screen image, in which the size of the other area is made smaller and the size of the preview area is made larger than in the fit-to-screen screen image. | 11-17-2011 |
20110279364 | Information Display Apparatus with Proximity Detection Performance and Information Display Method Using the Same - An information display apparatus with proximity detection performance contains a display device that displays image information, a sensor constituted of plural detection electrodes, and an adjusting device of detection resolution that adjusts the detection resolution based on a distance between the sensor and an object that contacts any one of the detection electrodes. | 11-17-2011 |
20110285618 | ACTIVE INTERFACE CONTROLS HAVING BI-STABLE ACTUATION AND INTRINSIC SENSING CAPABILITY - An active interface control shiftable between deployed and stowed configurations, and including an active material actuator employing a bi-stable mechanism configured to increase the actuator stroke length, and/or presenting intrinsic or external sensing capability, and reconfigurable displays comprising separately shiftable sets of controls. | 11-24-2011 |
20110285619 | Method for identifying a sequence of input signals - A method for identifying a sequence of input signals is proposed, a first input signal being identified in a first method step, a second input signal being identified within a predefined reference time in a second method step, a time interval between the first input signal and the second input signal being determined in a third method step, and furthermore an adapted reference time being set as a function of the time interval in a fourth method step. | 11-24-2011 |
20110285620 | GESTURE RECOGNIZER SYSTEM ARCHITECTURE - Systems, methods and computer readable media are disclosed for a gesture recognizer system architecture. A recognizer engine is provided, which receives user motion data and provides that data to a plurality of filters. A filter corresponds to a gesture that may then be tuned by an application receiving information from the gesture recognizer, so that the specific parameters of the gesture—such as an arm acceleration for a throwing gesture—may be set on a per-application level, or multiple times within a single application. Each filter may output to an application using it a confidence level that the corresponding gesture occurred, as well as further details about the user motion data. | 11-24-2011 |
20110291922 | Systems and Methods For Automatic Disable of Input Devices - Systems, methods, apparatuses and computer program products configured to provide intelligent filtering techniques to reduce errant device inputs are described. For example, filtering out the data input from a touch pad while typing on the keyboard or operating with a pointing stick, even if the touch pad is sensing contact, is augmented by continuing to filter data from the touch pad until the contact on the touch pad ends. | 12-01-2011 |
20110291923 | APPLICATION DEVICE OF ELECTRONIC PAPER SOFTWARE - The present invention relates to an application device of electronic paper software, generally including a first flexible display module, a second flexible display module, and a flexible input module, wherein the second flexible display module is formed on one side of the first flexible display module and the flexible input module extends from one side of the first flexible display module or the second flexible display module. The first flexible display module and the second flexible display module may display various and different user interfaces and display interfaces to thereby provide a digital product that offers low power consumption, light weight, compactness, water resistance, and vibration resistance, and that is hard to break. | 12-01-2011 |
20110298698 | MANUAL HUMAN MACHINE INTERFACE OPERATION SYSTEM AND METHOD THEREOF - A manual human machine interface operation system and a method thereof are disclosed. In an embodiment, this manual human machine interface operation system extracts the user's arm image and palm image from the images captured by at least two cameras, and then calculates the user's arm coordinate, so that the user can select an object shown on the human machine interface by manually using his/her arm. The system then recognizes a hand posture according to the palm image and determines an instruction mapped to this hand posture. This instruction is then performed on the selected object. In an embodiment, the hand posture can be a pointing posture, a grab posture, or a release posture. | 12-08-2011 |
20110298699 | INPUT APPARATUS, INFORMATION PROCESSING APPARATUS, OPERATION INPUT METHOD, AND SENSOR SHEET - Provided is an input apparatus including an input operation section, a capacitive sensor, and an output section. The input operation section includes an operation member configured to mechanically receive an input operation, and a first detection circuit configured to detect the input operation of the operation member. The capacitive sensor includes a plurality of electrodes which are arranged around the operation member, and each of which has a capacitance variable due to an approaching of a detection target, and a second detection circuit configured to detect the capacitance of each of the plurality of electrodes. The output section is configured to output an output of the first detection circuit and an output of the second detection circuit. | 12-08-2011 |
20110298700 | OPERATION TERMINAL, ELECTRONIC UNIT, AND ELECTRONIC UNIT SYSTEM - An operation terminal includes: a posture detection section detecting a posture of the operation terminal, a change of the posture, or both thereof; a mode selection section selecting, based on a detection result of the posture detection section, an operation mode from a plurality of operation modes including a gesture mode and a non-gesture mode; and a transmission section sending a control command corresponding to the detection result of the posture detection section to an electronic unit when the gesture mode is currently selected. | 12-08-2011 |
20110298701 | Presenting Information to a User Based on the Current State of a User Device - Information is presented to a user based on a current state of an end-user device (e.g., a mobile phone). In one embodiment, a method includes: detecting, via a user device, a predefined user motion of a user (e.g., a flick of a trackball or gesture on a touch screen); determining a current state of the user device based on at least one characteristic; and in response to detecting the user motion, presenting, via a display of the user device, information (e.g., a person profile) to the user based on the current state. | 12-08-2011 |
20110298702 | USER INTERFACE DEVICE AND INPUT METHOD - Provided is a user interface device ( | 12-08-2011 |
20110298703 | INFORMATION PROCESSING DEVICE AND COMPUTER READABLE RECORDING MEDIUM - An information processing device that is connected to a projecting device that projects an annotation image input from an external terminal onto a projection area including an object and a background, and is connected to an image capture device that captures an image of the projection area including the object and the background, includes: a detecting unit that detects movement of the object from an image captured by the image capture device; an extracting unit that extracts a changed region that is caused in the captured image by the movement of the object; and a processing unit that performs processing on at least one of the captured image and the annotation image, when the annotation image exists in the changed region. | 12-08-2011 |
20110298704 | THREE-DIMENSIONAL IMAGING AND DISPLAY SYSTEM - A three-dimensional imaging and display system is provided in which user input is optically detected in an imaging volume by measuring the path length of an amplitude modulated scanning beam as a function of the phase shift thereof. Visual image user feedback concerning the detected user input is presented. | 12-08-2011 |
20110298705 | THREE-DIMENSIONAL INPUT CONTROL DEVICE - Some embodiments provide force input control devices for sensing vector forces, comprising: a sensor die comprising: a rigid island, an elastic element coupled to the rigid island, a die frame coupled to a periphery of the elastic element, one or more stress sensitive components on the elastic element, and a signal processing IC, where the sensor die is sensitive to a magnitude and a direction of a force applied to the rigid island within the sensor die, where the sensor die is coupled electrically and mechanically to a substrate, and a spring element coupling an external button, where the force is applied, to the rigid island, wherein the spring element has a flat geometry and is located in a plane parallel to a plane of the substrate, where the spring element is configured to translate a deflection of the button into an allowable force applied to the rigid island. | 12-08-2011 |
20110304530 | 2D/3D IMAGE SWITCHING DISPLAY DEVICE - A 2D/3D image switching display device includes an image display unit and an image switching unit coupled to the image display unit. The image switching unit includes first and second transparent substrates and first and second transparent conducting elements installed on the first and second transparent substrates respectively. An electrochromic layer and an electrolytic layer are formed on the first and second transparent substrates sequentially. The electrochromic layer produces a color change according to the switching status of the image display unit. After a stereo image divided into left and right eye images is received by the naked eyes, no moire pattern will be produced, so that no additional light shielding device using a parallax barrier is required for displaying stereo images, and the 2D/3D image switching display device can change a light-shielding angle for adjusting a stereo image display according to the viewing angle. | 12-15-2011 |
20110304531 | METHOD AND SYSTEM FOR INTERFACING AND INTERACTION WITH LOCATION-AWARE DEVICES - A system and a method of using the system includes a motion detection subsystem for detecting motions applied to a computing device about one or more axes. A storage subsystem stores motion command definitions. A motion processing subsystem is included for characterizing the motions, retrieving command definitions, comparing the characterized motions with the retrieved command definitions, and retrieving commands associated with matched command definitions. A command processing subsystem is included for defining new motions and storing new characterized motions as entries in the command definition, retrieving stored characterized motions and storing named characterized motions as entries in the command definitions, associating commands with stored characterized motions and storing the associated commands as entries in the command definitions, and processing retrieved commands for modification of and interaction with displayed information of the computing device and saving processing results. | 12-15-2011 |
20110304532 | Image capturing and display apparatus and method - Provided are a display apparatus and method. The display apparatus may sense light reflected from an object and passed through a display panel, and may control a power of a backlight unit depending on whether the light has passed through the display panel. | 12-15-2011 |
20110304533 | STEREOSCOPIC IMAGE DISPLAY DEVICE - Disclosed is a stereoscopic image display device which detects movement of moving viewers from among multiple viewers and enables the multiple viewers to observe a stereoscopic image even if the moving viewers change positions. The stereoscopic image display device includes a display panel corresponding to one switchable region to emit two-dimensional images, the number of which is more than the number of N views (N being a natural number over 3), a switchable panel located on the display panel to convert the two-dimensional images into three-dimensional images and to emit the three-dimensional images when voltage is applied thereto, a detection unit to detect movement of moving viewers from among the multiple viewers and final positions of the moving viewers, and a control unit to output a control signal to shift the views of the two-dimensional images according to the movement and the final positions of the moving viewers. | 12-15-2011 |
20110304534 | WRITING STROKE RECOGNITION APPARATUS, MOBILE TERMINAL AND METHOD FOR REALIZING SPATIAL WRITING - A writing stroke recognition apparatus, a mobile terminal and a method for realizing spatial writing are provided. By acquiring and analyzing the writing movement amount information, the present invention utilizes the preset corresponding relationship between the movement amount information and stroke information, and the preset corresponding relationship between the movement amount information and the stroke relative position information, to obtain the writing stroke information and the writing relative position information. Further, the present invention recognizes the corresponding character by using the obtained writing stroke information and the writing relative position information. | 12-15-2011 |
20110310001 | DISPLAY RECONFIGURATION BASED ON FACE/EYE TRACKING - An adaptive interface system includes a user interface providing a visual output, a sensor for detecting a vision characteristic of a user and generating a sensor signal representing the vision characteristic, and a processor in communication with the sensor and the user interface, wherein the processor receives the sensor signal, analyzes the sensor signal based upon an instruction set to determine the vision characteristic of the user, and reconfigures the visual output of the user interface based upon the vision characteristic of the user to highlight at least a portion of the visual output within a field of focus of the user. | 12-22-2011 |
20110310002 | FREE SPACE DIRECTIONAL FORCE FEEDBACK APPARATUS - A directional feedback device generating a directional force feedback in free space. The device includes a force generation structure including a rotatable mass creating a physical force vector in three dimensional space. A wireless communication device and a control system communicatively coupled to the wireless communication device and the force generation system are provided. The control system receives a definition of the physical force vector to be generated from an application executing in a processing device. The control system provides instructions to the force generation system to generate the physical force vector using the force generation structure. The force generation system, the wireless communication device and the control system are enclosed in a housing. | 12-22-2011 |
20110310003 | IMAGE DISPLAY DEVICE AND METHOD OF DISPLAYING IMAGES - An image display device includes an autostereoscopic screen for simultaneously displaying a plurality of different images, each of which is visible from at least one of different laterally offset viewing zones, and a control unit for controlling the screen in dependence on image information of the different images, wherein the screen has a matrix screen with a plurality of pixels arranged in columns and rows as well as a grating arranged in front of the matrix screen and having a structure orientated parallel to the columns to direct light emanating from the pixels of the matrix screen into the different viewing zones. The image display device furthermore has a tracking device for detecting two respective eye positions of at least two viewers of the screen, wherein the control unit is configured for inputting input commands. | 12-22-2011 |
20110310004 | APPARATUS AND METHOD FOR SETTING A PARAMETER VALUE - Embodiments of the present invention relate to an improved man-machine interface for an apparatus. The interface comprises at least one graphical representation of a controllable parameter. | 12-22-2011 |
20110310005 | METHODS AND APPARATUS FOR CONTACTLESS GESTURE RECOGNITION - Systems and methods are described for performing contactless gesture recognition for a computing device, such as a mobile computing device. An example technique for managing a gesture-based input mechanism for a computing device described herein includes identifying parameters of the computing device relating to accuracy of gesture classification performed by the gesture-based input mechanism and managing a power consumption level of at least an infrared (IR) light emitting diode (LED) or an IR proximity sensor of the gesture-based input mechanism based on the parameters of the computing device. | 12-22-2011 |
20110310006 | Automatic Calibration Of A Gaze Direction Algorithm From User Behavior - A method of calibrating the eye gaze direction of a user, the method including the steps of: (a) monitoring a user's eye gaze direction whilst carrying out a series of predetermined tasks, each of the tasks having an expected subject gaze direction; and (b) correlating the user's eye gaze direction with the expected direction for a statistically significant period of time; (c) calculating, from the correlating step, a series of likely eye gaze direction usage parameters associated with the user. | 12-22-2011 |
20110316767 | SYSTEM FOR PORTABLE TANGIBLE INTERACTION - Embodiments of the invention describe a system utilizing at least one camera and a display to create an object and context aware system. Embodiments of the invention may utilize the camera to sense a system's surroundings and use recognition logic or modules to detect and recognize objects on and around the system. System applications may further act on the sensed data and use the display of the system to provide visual feedback and interactive elements as a means to interact with the system user. | 12-29-2011 |
20110316768 | SYSTEM, METHOD AND APPARATUS FOR SPEAKER CONFIGURATION - An application for a device includes at least two speakers and at least two channels of audio. The at least two channels of audio are selectively routed to the at least two speakers. Routing of the audio to the speakers is made based upon an orientation of the device. The orientation of the device, and therefore the speaker configuration, is either manually changed by a viewer input or is automatically detected, such as when a hand-held device is rotated, for example, to view a display of the device in portrait mode instead of landscape mode. | 12-29-2011 |
20110316769 | PROVIDING AN ALTERNATIVE HUMAN INTERFACE - Providing an alternative human interface for an electronic device when a current human interface is made ineffective by at least an environmental factor is described herein. By ineffective it is meant that the current human interface cannot maintain a minimum level of interactivity between a user and the electronic device in the current or anticipated environment. In addition to maintaining at least a threshold level of interactivity, the configuration of the alternative human interface can take into consideration other factors such as an expected operating state of the electronic device affected by the choice of alternative human interface. | 12-29-2011 |
20110316770 | Operation Device - An operation device is provided for use by a user holding it in one hand, comprising: a thumb operating area defined on a surface of the enclosure of the operation device, where an operation member to be operated by the user, using his/her thumb, while holding the operation device is provided; an inclination operation member including a stick part projecting from the thumb operating area, for being operated by the user by inclining the stick part, using the thumb; and an operation button provided on the surface of the enclosure in a position opposed to the thumb operating area, being capable of being pressed by the user, using another finger, in a direction intersecting a direction toward the inclination operation member. | 12-29-2011 |
20110316771 | Display apparatus, television reception apparatus and pointing system - A display apparatus is connected to a computer that constitutes an external device. In at least one example embodiment, the external device outputs an image to the display apparatus via a video output port. When a pointing device directs a laser beam towards an image display module of the display apparatus, the display apparatus detects the laser beam using an incorporated photosensor, and identifies the coordinates in the image corresponding to that photosensor. Then, the location information for the identified coordinates is output to the external device via a pointing device input port. The external device recognizes the coordinate location and outputs a cursor indicating the pointer location superimposed on the output image. The display apparatus displays an image containing the cursor on the display screen. | 12-29-2011 |
20110316772 | INPUT METHOD EDITOR - Methods, systems, and apparatus, including computer program products, in which an input method editor receives input in a first writing system and presents input candidates in the first writing system or a second writing system. In one implementation, a method is provided. The method includes receiving input in a first writing system; presenting the input in the first writing system in a first interface element of an interface as content input; automatically identifying one or more candidates in a second writing system based on the input in the first writing system; and presenting the one or more candidates in the second writing system in a second interface element that is separate from the first interface element. | 12-29-2011 |
20120001843 | Mobile Device User Interface Change Based On Motion - Adapting a user interface of a mobile computing device when the mobile computing device is in a motion state is provided. Upon detecting that a mobile computing device is in motion by utilization of a location or motion determining system, such as a GPS navigation and/or accelerometer system, a motion mode UI may be activated on the device, wherein a display of device functionalities may be simplified by modifying one or more displayed elements of the device user interface. | 01-05-2012 |
20120001844 | Remote Control Systems and Methods for Activating Buttons of Digital Electronic Display Devices - A method for controlling an e-book reader, set forth by way of example and not limitation, includes transmitting a digital packet including an address of a button actuator and a button control signal in response to a detection of a button press on a remote control device. The method further includes receiving the packet at the button actuator, decoding the packet in a digital processor to derive the button control signal, and controlling a motor to move a physical actuator between a neutral position and a button press position. | 01-05-2012 |
20120001845 | System and Method for Virtual Touch Sensing - In view of existing mobile devices, which have the limitation of a relatively small touch screen area, the present invention describes a virtual touch sensing method based on computer vision technology. The method includes the steps of using more than one sensor to detect the coordinates of an indicator in a virtual touching area, and calculating the respective screen coordinates according to the coordinates of the indicator, where the area of the operation surface of the virtual touching area is independent of the area of the screen. The present invention also discloses a corresponding virtual touch sensing system which provides a predictive control interface, where the area of the control interface is independent of the area of the actual screen. | 01-05-2012 |
20120001846 | INPUT DEVICE, WEARABLE COMPUTER, AND INPUT METHOD - A low-cost input device suitable for a wearable computer is provided. An input method suitable for operating a wearable computer is provided. A wearable computer including the input device is provided. An input device | 01-05-2012 |
20120001847 | Driving Method of Input/Output Device - A method for driving an input/output device, including: generating first data by putting a first region of a light unit in a lighted condition and a second region of the light unit in the lighted condition; generating second data by putting the first region in the lighted condition and the second region in an unlighted condition; generating third data by putting the first region in the unlighted condition and the second region in the lighted condition; generating fourth data by putting the first region in the unlighted condition and the second region in the unlighted condition; and generating difference data of either the first data or the third data and either the second data or the fourth data by using a data processor. | 01-05-2012 |
20120007796 | Flexible Apparatus - An apparatus including an elongate structure including integrated electronic circuitry providing at least an electronic user interface wherein the elongate structure is flexible and is configured to be flexed lengthwise by a user to form a looped configuration in which the elongate structure forms at least one lengthwise loop about an axis and in which at least one electrical connection for the electronic circuitry is formed where a first portion of the elongate structure and a second portion of the elongate structure contact. | 01-12-2012 |
20120007797 | HUMAN-MACHINE INTERFACE - A human interface suitable for being held is provided. The human interface includes a first housing, a second housing and a rotating shaft. The first housing has a first sidewall and a sliding chunk, and the sliding chunk is located at a side of the first sidewall. The second housing having a second sidewall and a sliding trough is disposed under the first housing, wherein the sliding trough is located at one side of the second sidewall. The sliding chunk is disposed in the sliding trough. The rotating shaft is pivotally connected between the first housing and the second housing and located at a side of the first housing apart from the sliding chunk. The sliding chunk of the first housing slides in the sliding trough of the second housing, and the first housing rotates relative to the second housing by using the rotating shaft as a rotating center. | 01-12-2012 |
20120007798 | ELECTRONIC DEVICE WITH PROMPT FUNCTION AND PROMPT METHOD THEREOF - An electronic device with a prompt function includes a display unit, a storage unit, a trigger detecting unit, and a processing unit. The storage unit stores a to-do list recording at least one item and at least one trigger condition. Each of the at least one item is associated with one of the at least one trigger condition. The trigger detecting unit is configured to receive input. The processing unit is configured to compare information of the received input with the at least one trigger condition. If there is a match, the processing unit displays the item associated with the one of the at least one trigger condition on the display unit, and outputs predetermined prompt content. A related prompt method is also provided. | 01-12-2012 |
20120007799 | APPARATUS, METHOD FOR MEASURING 3 DIMENSIONAL POSITION OF A VIEWER AND DISPLAY DEVICE HAVING THE APPARATUS - Disclosed herein are an apparatus and a method for measuring 3 dimensional positions of a viewer, and a display device having the apparatus. The apparatus for measuring the 3 dimensional positions includes an image capturing module that photographs images including objects; a detecting module that detects the objects from images photographed by the image capturing module and calculates sizes and coordinates of the images of the objects; and a position calculation module that calculates the 3-dimensional positions of the objects in the space in which the objects are positioned by using the information on the calculated sizes and coordinates of the images of the objects. | 01-12-2012 |
20120007800 | INTERACTIVE GLASSES SYSTEM - An interactive glasses system comprising a frame ( | 01-12-2012 |
20120007801 | EASILY DEPLOYABLE INTERACTIVE DIRECT-POINTING SYSTEM AND PRESENTATION CONTROL SYSTEM AND CALIBRATION METHOD THEREFOR - A method for controlling movement of a computer display cursor based on a point-of-aim of a pointing device within an interaction region includes projecting an image of a computer display to create the interaction region. At least one calibration point having a predetermined relation to said interaction region is established. A pointing line is directed to substantially pass through the calibration point while measuring a position of and an orientation of the pointing device. The pointing line has a predetermined relationship to said pointing device. Movement of the cursor is controlled within the interaction region using measurements of the position of and the orientation of the pointing device. | 01-12-2012 |
20120013528 | Distance measurement module, display device having the same, and distance measurement method of display device - A distance measurement module includes: an image pickup lens capturing an image of a subject; a light source unit disposed to be adjacent to the image pickup lens and irradiating reference light to the subject; a light receiving unit extracting image information of the subject and distance information to the subject upon receiving light which is reflected from the subject and made incident through the image pickup lens; and a calculation unit calculating the distance to the subject by using a phase difference between the reference light and the reflected light. | 01-19-2012 |
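The phase-difference calculation this abstract describes is the standard phase-shift time-of-flight relation: the reflected light lags the modulated reference light by a phase proportional to the round-trip distance. A hedged sketch, with an assumed modulation frequency and illustrative function names:

```python
import math

# Sketch of phase-shift ranging: modulated reference light travels to the
# subject and back, and the phase lag of the reflection encodes the
# round-trip time. Names and the example frequency are assumptions.

C = 299_792_458.0  # speed of light, m/s

def distance_from_phase(phase_shift_rad, mod_freq_hz):
    """One-way distance from the phase difference between reference and
    reflected light, for a given modulation frequency."""
    # The round trip covers 2*d, and one full cycle (2*pi) corresponds to
    # one modulation period, so d = c * dphi / (4 * pi * f).
    return C * phase_shift_rad / (4 * math.pi * mod_freq_hz)

# A pi/2 phase lag at 10 MHz modulation is roughly 3.75 m.
d = distance_from_phase(math.pi / 2, 10e6)
print(round(d, 3))
```

Note the inherent ambiguity: distances differing by half a modulation wavelength produce the same phase, which is why real sensors pick the modulation frequency to cover their working range.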
20120013529 | GESTURE RECOGNITION METHOD AND INTERACTIVE INPUT SYSTEM EMPLOYING SAME - A gesture recognition method comprises capturing images, processing the images to identify at least two clusters of touch points associated with at least two pointers, recognizing a gesture based on motion of the clusters, and updating a display in accordance with the recognized gesture. | 01-19-2012 |
20120019438 | DISPLAY DEVICE AND METHOD FOR ADJUSTING DISPLAY ORIENTATION THEREOF - A display device includes a base, a display panel, a driving unit, an image capture unit, a feature detection unit, and a control unit. The display panel is rotatably mounted on the base. The driving unit rotates the display panel about a rotation axis relative to the base; the rotation axis coincides with a central axis of the display panel. The image capture unit captures images of a scene in front of a front display surface of the display panel. The feature detection unit detects a face portion of a user in the images and determines a position of the face portion relative to a central line of the images. The control unit controls the driving unit to rotate the display panel based on the position of the face portion so as to relocate the face portion onto the central line of the images. | 01-26-2012 |
20120019439 | Universal Input Device and System - The present invention relates to input devices and particularly to input devices for use with computer and telecommunications systems and/or other object systems and/or devices. More particularly, the present invention relates to a universal input device for inputting data. | 01-26-2012 |
20120019440 | METHODS, APPARATUS, AND ARTICLE FOR FORCE FEEDBACK BASED ON TENSION CONTROL AND TRACKING THROUGH CABLES - A haptic interface system includes a cable based haptic interface device and a controller. The controller receives information related to movement of a grip in real-space and generates a stereoscopic output for a display device. The stereoscopic output includes images of a virtual reality tool whose motions mimic motions of the real-space grip. | 01-26-2012 |
20120019441 | MOBILE ELECTRONIC DEVICE - It is an object to provide a mobile electronic device capable of projecting a more serviceable and useful image. The mobile electronic device includes an operating unit, an image projector that projects an image toward a target object, and a control unit that controls at least an operation of the image projector based on an input to the operating unit. The mobile electronic device is configured such that, when information about the number of divisions of the target object is input from the operating unit, the control unit causes the image projector to project an image used to divide the target object into the number of divisions, thereby achieving the object. | 01-26-2012 |
20120026077 | MAPPING TRACKPAD OPERATIONS TO TOUCHSCREEN EVENTS - In general, this disclosure describes techniques for mapping trackpad interactions and operations to touchscreen events without the use of a touchscreen user interface. In one example, a method includes receiving, via a trackpad device coupled to a computing device, touch-based input comprising one or more gestures, wherein the trackpad device is physically distinct from a display device coupled to the computing device. The method further includes determining, by the computing device, a trackpad operation based upon the touch-based input, and determining, by the computing device, a touchscreen event based upon a mapping of the trackpad operation to the touchscreen event, wherein the touchscreen event is determined without receiving any user input from a touchscreen device. The method further includes generating, by the computing device, the touchscreen event for processing by an application executing on the computing device, wherein the application is designed to process touchscreen events initiated by touchscreen devices. | 02-02-2012 |
20120026078 | Interactive Projector Device - An interactive device includes a first sensor, a first button, an accelerometer, a processor, and a transmitter. The first sensor is configured to receive coordinate information from a coordinate projection of a projector. The first button is configured to transmit an erase signal when the first button is pressed. The accelerometer is configured to provide angle information for the interactive device. The processor is in communication with the first sensor, with the first button, and with the accelerometer. The processor is configured to receive the coordinate information from the first sensor, to receive the erase signal from the first button, to receive the angle information from the accelerometer, and to generate erase information based on the coordinate information, the erase signal, and the angle information. The transmitter is in communication with the processor, and is configured to transmit a delete request including the erase information received from the processor to the projector. | 02-02-2012 |
20120026079 | USING A DISPLAY ABSTRACTION TO CONTROL A DISPLAY - The disclosed embodiments relate to a system for controlling a display. This system includes a generic display-control interface which facilitates controlling the display, and a pluggable display-control module including code that implements a standardized set of display-control commands. The system also includes a plug-in framework that houses the pluggable display-control module and enables the generic display-control interface to communicate with the pluggable display-control module. In some embodiments, the system also includes a generic transport interface which facilitates communicating with the display, and a pluggable transport module including code that implements a standardized transport protocol. In these embodiments, the plug-in framework houses the pluggable transport module and enables the pluggable display-control module to communicate with the pluggable transport module. | 02-02-2012 |
20120026080 | ELECTRONIC DEVICE AND UNLOCKING METHOD THEREOF - An electronic device and a method enable an unlock operation of the electronic device. When the electronic device in a lock state is moved for the unlock operation, the electronic device receives a three-axis acceleration vector of the electronic device from an accelerometer. The electronic device analyzes three movement directions of the electronic device along three coordinate axes. The electronic device determines whether the analyzed three movement directions are the same as three predetermined movement directions along the three coordinate axes. If the analyzed three movement directions are the same as the three predetermined movement directions along the three coordinate axes, the electronic device is changed from the lock state to an unlock state. | 02-02-2012 |
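The direction-matching check described here can be sketched as follows; the threshold, the per-axis direction encoding, and the example pattern are assumptions, not values from the patent:

```python
# Illustrative sketch of the unlock check: reduce a three-axis acceleration
# vector to per-axis movement directions and compare them with a
# predetermined pattern. Threshold and encoding are assumed.

def movement_directions(accel, threshold=0.5):
    """Map each axis of an (ax, ay, az) vector to +1, -1, or 0."""
    return tuple(1 if a > threshold else -1 if a < -threshold else 0
                 for a in accel)

def try_unlock(accel, pattern):
    """Unlock only if all three analyzed directions match the pattern."""
    return movement_directions(accel) == pattern

PATTERN = (1, -1, 1)  # e.g. +x, -y, +z (hypothetical predetermined pattern)
print(try_unlock((1.2, -0.9, 0.8), PATTERN))   # True
print(try_unlock((1.2, 0.9, 0.8), PATTERN))    # False
```

A real implementation would also filter out gravity and sensor noise before classifying the directions.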
20120026081 | SYSTEM AND METHOD FOR USING PAPER AS AN INTERFACE TO COMPUTER APPLICATIONS - A system and method for using paper bearing handwritten annotations and/or pre-defined templates to interface with one or more computer applications is disclosed. In one embodiment, the method includes imaging content on the paper, including pre-defined handwritten commands, associated syntax, one or more computer application identifiers, and pointed data already existing on the paper; analyzing the imaged content to identify the pre-defined handwritten commands, the one or more computer applications associated with the one or more computer application identifiers, the associated syntax, and the pointed data; extracting the pointed data into a specified format associated with the one or more computer applications; executing the one or more computer applications based on the identified pre-defined handwritten commands, the one or more computer application identifiers and the associated syntax; and importing the extracted pointed data into the one or more executed computer applications. | 02-02-2012 |
20120026082 | DISPLAY DEVICE - In order to maintain a high visual image quality and save power, a display device includes: R sub-pixels, G sub-pixels, B sub-pixels, and W sub-pixels; and a human detection sensor (…). | 02-02-2012 |
20120026083 | INTERFACE APPARATUS AND METHOD FOR CONTROLLING A DEVICE - A specific site of a user's body is detected from an input image, it is detected on the basis of a moving speed and a moving direction of the specific site whether the specific site makes a feeding motion in which the specific site moves in any direction, and when the feeding motion is detected, a control command for a device is changed. | 02-02-2012 |
20120026084 | SIGNALING DEVICE POSITION DETERMINATION - A system and method for providing user input to a device. A system includes a light source, a user positioned signaling device, an image capture device, and an image processor. The user positioned signaling device includes a retroreflective structure and a polarization retarder. The image capture device captures images of the signaling device. The image processor processes the captured images and determines a position of the signaling device based, at least in part, on light polarized and reflected by the signaling device. | 02-02-2012 |
20120026085 | IMAGE CONTRAST ENHANCEMENT IN DEPTH SENSOR - Embodiments related to the enhancement of contrast in an image pattern in a structured light depth sensor are disclosed. For example, one disclosed embodiment provides, in a structured light depth sensor system comprising a structured light depth sensor, a method comprising projecting a light pattern onto an object, detecting via an image sensor an image of the light pattern as reflected from the object, increasing a contrast of the light pattern relative to ambient light present in the image of the light pattern as reflected from the object to form a contrast-enhanced image of the light pattern as reflected from the object, and based upon a motion of the object as detected via the contrast-enhanced image of the light pattern, controlling an application that is providing output to a display. | 02-02-2012 |
20120032875 | Scanned Image Projection System Employing Beam Folding Apparatus - An imaging system (…). | 02-09-2012 |
20120032876 | MEGA COMMUNICATION AND MEDIA APPARATUS CONFIGURED TO PROVIDE FASTER DATA TRANSMISSION SPEED AND TO GENERATE ELECTRICAL ENERGY - Disclosed embodiments comprise a communication apparatus operatively configured with multiple CMOS antennas disposed on a chip for boosting communication signals, enabling faster data transmission speeds, and providing an interactive user interface. The communication apparatus is further configured to convert sound waves, vibrations, solar energy, wind force and pressure force into electrical energy communicable to a battery cell. Disclosed embodiments encompass three modes of communication: cell phone, wireless Internet applications, and global communication and media information. Embodiments provide a communication apparatus operable to enhance mobile communication efficiency with a touch-sensitive display, and provide an energy harvesting platform on at least the housing of the apparatus and/or the circuit board configured with memories, processors, and modules. Embodiments provide advanced computing and media applications, including in-vehicle interactive communications and wireless Internet applications. Embodiments further provide a gaming device and a wireless media device configured with touch pads comprising sensors embedded in a silicon substrate and fused in micro-fiber material having excellent electrical characteristics. Certain embodiments provide a communication apparatus configured for voice-enabled applications operable to convert text into voice and/or voice into text. | 02-09-2012 |
20120032877 | Motion Driven Gestures For Customization In Augmented Reality Applications - A motion-driven user interface for mobile device-based augmented reality applications is described which provides a user with the ability to execute user interface input commands by physically manipulating the mobile device in space. The mobile device uses embedded sensors to identify the type and extent of the manipulation which cause execution of a corresponding user interface input command which can vary depending upon the operating context of the mobile device. | 02-09-2012 |
20120032878 | INPUT APPARATUS USING A CONDUCTIVE RUBBER MEMBER - A data input apparatus using a conductive rubber member is provided. The apparatus includes a conductive rubber member, to one end of which a voltage is input and through the other end of which a voltage reduced in proportion to the internal resistance and the length thereof is output; a voltage output member which is brought into contact with the conductive rubber member to output the voltage value of the conductive rubber member at the contact point; and a control unit which recognizes the contact point based on the voltage value input from the voltage output member, extracts data corresponding to the contact point from a memory unit, and inputs the extracted data. | 02-09-2012 |
20120032879 | Method, Apparatus, and Article for Force Feedback Based on Tension Control and Tracking Through Cables - A haptic device for human/computer interface includes a user interface tool coupled via cables to first, second, third, and fourth cable control units, each positioned at a vertex of a tetrahedron. Each of the cable control units includes a spool and an encoder configured to provide a signal corresponding to rotation of the respective spool. The cables are wound onto the spool of a respective one of the cable control units. The encoders provide signals corresponding to rotation of the respective spools to track the length of each cable. As the cables wind onto the spools, variations in spool diameter are compensated for. The absolute length of each cable is determined during initialization by retracting each cable in turn to a zero length position. A sensor array coupled to the tool detects rotation around one or more axes. | 02-09-2012 |
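Recovering the tool position from four cable lengths anchored at tetrahedron vertices is a trilateration problem. A sketch under assumed anchor coordinates, using a standard linearization (subtracting one sphere equation from the others) that stands in for whatever method the patent actually uses:

```python
import math

# Sketch of position recovery from four measured cable lengths with
# anchors at tetrahedron vertices. Anchor coordinates and the
# linearization are illustrative assumptions.

def solve3(m, v):
    """Solve a 3x3 linear system m @ x = v by Cramer's rule."""
    def det(a):
        return (a[0][0]*(a[1][1]*a[2][2] - a[1][2]*a[2][1])
              - a[0][1]*(a[1][0]*a[2][2] - a[1][2]*a[2][0])
              + a[0][2]*(a[1][0]*a[2][1] - a[1][1]*a[2][0]))
    d = det(m)
    out = []
    for j in range(3):
        mj = [row[:] for row in m]
        for i in range(3):
            mj[i][j] = v[i]
        out.append(det(mj) / d)
    return out

def position_from_cables(anchors, lengths):
    """anchors: four (x, y, z) vertices; lengths: measured cable lengths.
    Subtracting the first sphere equation |p - A0|^2 = l0^2 from the rest
    linearizes the problem: 2*(Ai - A0) . p = |Ai|^2 - |A0|^2 + l0^2 - li^2."""
    a0, l0 = anchors[0], lengths[0]
    m, v = [], []
    for ai, li in zip(anchors[1:], lengths[1:]):
        m.append([2 * (ai[k] - a0[k]) for k in range(3)])
        v.append(sum(c*c for c in ai) - sum(c*c for c in a0)
                 + l0*l0 - li*li)
    return solve3(m, v)

anchors = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
true_p = (0.2, 0.3, 0.4)
lengths = [math.dist(a, true_p) for a in anchors]
print([round(c, 6) for c in position_from_cables(anchors, lengths)])
```

With noisy real measurements, the four equations would instead be combined in a least-squares sense rather than solved exactly.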
20120038546 | GESTURE CONTROL - An apparatus includes sensor circuitry configured to sense spatial phenomena; application circuitry configured to respond to sensation of a spatial phenomenon by the sensor circuitry where a pre-existing relationship exists between the spatial phenomenon and the response of the application circuitry; and regulation circuitry configured to regulate the response of the application circuitry to the spatial phenomenon based at least in part on sensation of a different spatial phenomenon by the sensor circuitry. Various other apparatuses, systems, methods, etc., are also disclosed. | 02-16-2012 |
20120038547 | METHOD AND APPARATUS FOR USER INTERFACE COMMUNICATION WITH AN IMAGE MANIPULATOR - A system, and method for use thereof, for image manipulation. The system may generate an original image in a three dimensional coordinate system. A sensing system may sense a user interaction with the image. The sensed user interaction may be correlated with the three dimensional coordinate system. The correlated user interaction may be used to project an updated image, where the updated image may be a distorted version of the original image. The image distortion may be in the form of a twisting, bending, cutting, displacement, or squeezing. The system may be used for interconnecting or communicating between two or more components connected to an interconnection medium (e.g., a bus) within a single computer or digital data processing system. | 02-16-2012 |
20120038548 | HANDHELD FIELD MAINTENANCE DEVICE WITH IMPROVED USER INTERFACE - A handheld field maintenance tool is provided. The handheld field maintenance tool includes a process communication module configured to communicate with a field device. The handheld field maintenance tool also includes a display and a user input device. A controller is coupled to the process communication module, the user input device and the display and is configured to generate a listing of task-based field maintenance operations on the display and receive a user input selecting a task-based field maintenance operation. The controller is configured to automatically traverse a menu of the field device using a fast-key sequence relative to the selected task. A method of creating a task-based field maintenance operation is provided. A method of interacting with a field device menu is also provided. | 02-16-2012 |
20120038549 | Deriving input from six degrees of freedom interfaces - The present invention relates to interfaces and methods for producing input for software applications based on the absolute pose of an item manipulated or worn by a user in a three-dimensional environment. Absolute pose in the sense of the present invention means both the position and the orientation of the item as described in a stable frame defined in that three-dimensional environment. The invention describes how to recover the absolute pose with optical hardware and methods, and how to map at least one of the recovered absolute pose parameters to the three translational and three rotational degrees of freedom available to the item to generate useful input. The applications that can most benefit from the interfaces and methods of the invention involve 3D virtual spaces including augmented reality and mixed reality environments. | 02-16-2012 |
20120038550 | SYSTEM ARCHITECTURE AND METHODS FOR DISTRIBUTED MULTI-SENSOR GESTURE PROCESSING - The techniques discussed herein contemplate methods and systems for providing, for example, interactive virtual experiences that are initiated or controlled using user gestures. In embodiments, the techniques provide for gestures performed by users holding devices to be recognized and processed in a cloud computing environment such that the gestures produce a predefined desired result. According to one embodiment, a server communicates with a first device in a cloud computing environment, wherein the first device can detect surrounding devices, and an application program is executable by the server, wherein the application program is controlled by the first device and the output of the application program is directed by the server to one of the devices detected by the first device. | 02-16-2012 |
20120044135 | REMOTE CONTROL FOR ELECTRONIC READER AND REMOTE CONTROL METHOD - A remote control comprises a button generating control signals; a transmission unit configured to transmit the control signals to an electronic reader; and a microprocessor unit configured to analyze an operation type according to the control signals and generate operation signals corresponding to the operation type to signal the electronic reader to flip pages. A remote control method applied in a remote control of an electronic reader is also provided. | 02-23-2012 |
20120044136 | DISPLAY DEVICE AND CONTROL METHOD THEREOF - Disclosed herein are a display device and a control method thereof. The method includes recognizing a first motion of a user, and assigning to the user, in response to the recognized first motion, a control right to perform functions of the display device through motions. | 02-23-2012 |
20120044137 | SCREEN CAPTURE - A screen capture system (…). | 02-23-2012 |
20120044138 | METHOD AND APPARATUS FOR PROVIDING USER INTERACTION IN LASeR - A method and an apparatus for providing user interaction are provided. The apparatus for providing user interaction includes an input unit configured to receive control by a user; a control processing unit configured to analyze the control and generate drag event information including event type information indicating a type of the control and event attribute information; and an action processing unit configured to generate drag element information for showing an action corresponding to the control on a display. The drag element information includes action mode information indicating a mode of the action and action attribute information. The proposed method and apparatus make it possible to apply various data formats defined by existing standard specifications to other standard specifications and interaction devices. | 02-23-2012 |
20120056800 | SYSTEM FOR FAST, PROBABILISTIC SKELETAL TRACKING - A system and method are disclosed for recognizing and tracking a user's skeletal joints with a NUI system. The system includes one or more experts for proposing one or more skeletal hypotheses each representing a user pose within a given frame. Each expert is generally computationally inexpensive. The system further includes an arbiter for resolving the skeletal hypotheses from the experts into a best state estimate for a given frame. The arbiter may score the various skeletal hypotheses based on different methodologies. The one or more skeletal hypotheses resulting in the highest score may be returned as the state estimate for a given frame. It may happen that the experts and arbiter are unable to resolve a single state estimate with a high degree of confidence for a given frame. It is a further goal of the present system to capture any such uncertainty as a factor in how a state estimate is to be used. | 03-08-2012 |
20120056801 | METHODS AND APPARATUSES FOR GESTURE-BASED USER INPUT DETECTION IN A MOBILE DEVICE - Methods and apparatuses are provided that may be implemented in a mobile device to: determine whether the mobile device is in a gesture command input ready state based, at least in part, on a display portion of the mobile device remaining in a horizontal viewable position for a threshold period of time; with the mobile device in a gesture command input ready state, determine whether a detected movement of the mobile device represents a gesture command input; and in response to the determined gesture command input, affect a user perceivable output. | 03-08-2012 |
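The horizontal-for-a-threshold-period readiness test in this abstract can be sketched as a small state machine; the threshold value and the sampling interface are illustrative assumptions:

```python
# Minimal sketch of the readiness check: motion only counts as a gesture
# command once the display has stayed in a horizontal viewable position
# for a threshold period. Timings are assumed.

THRESHOLD_S = 1.5

class GestureGate:
    def __init__(self, threshold_s=THRESHOLD_S):
        self.threshold_s = threshold_s
        self.horizontal_since = None  # timestamp when device went horizontal

    def update(self, t, is_horizontal):
        """Feed a timestamped orientation sample; returns True when the
        device is in the gesture-command-input-ready state."""
        if not is_horizontal:
            self.horizontal_since = None  # tilted away: reset the timer
            return False
        if self.horizontal_since is None:
            self.horizontal_since = t
        return (t - self.horizontal_since) >= self.threshold_s

gate = GestureGate()
print(gate.update(0.0, True))   # False: just went horizontal
print(gate.update(1.0, True))   # False: only 1.0 s elapsed
print(gate.update(2.0, True))   # True: held for 2.0 s >= 1.5 s
print(gate.update(2.5, False))  # False: tilted away, state resets
```

Only once the gate reports ready would subsequent movement be tested against the gesture command vocabulary.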
20120056802 | Program, Object Control Method, And Game Device - A behavior table storage unit stores correspondence between a predetermined action of an object and a condition for operation of an input device. A condition determination unit determines whether control information of the input device meets the condition for operation stored in the behavior table storage unit. An object control unit causes, when the condition for operation is determined to be met, the object to perform an action mapped to the condition for operation. The behavior table storage unit stores a condition for operation requiring that the input device be moved by a predetermined amount within a predetermined period of time, and the condition determination unit measures time elapsed since the start of movement of the input device and determines, when the input device is moved by the predetermined amount within the predetermined period of time, that the condition for operation is met. | 03-08-2012 |
20120056803 | CONTENT OUTPUT SYSTEM, OUTPUT CONTROL DEVICE AND OUTPUT CONTROL METHOD - Based on captured images obtained when an image capture device takes images in an image capture range, a change in the moving speed of one passerby or each of a plurality of passersby contained in the captured images is calculated. When there is a passerby whose moving speed has decreased in the image capture range, that passerby is paying attention to the content, and therefore the content to be outputted by a content output device is switched from ordinary content to specific content. On the other hand, when only passersby who are moving at constant speed are present in the image capture range, the passersby are not paying attention to the content, and hence the content output device continues to output the ordinary content. | 03-08-2012 |
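The speed-drop rule in this abstract can be sketched as below; the per-passerby tracking interface and the drop ratio are assumptions, not the patent's actual criterion:

```python
# Illustrative sketch of the switching rule: if any tracked passerby's
# speed drops noticeably between frames, show the specific content;
# if everyone keeps a constant speed, keep the ordinary content.

ORDINARY, SPECIFIC = "ordinary", "specific"

def select_content(prev_speeds, curr_speeds, drop_ratio=0.7):
    """prev_speeds / curr_speeds: {passerby_id: speed} measured from
    successive captured images. drop_ratio is an assumed tuning value."""
    for pid, prev in prev_speeds.items():
        curr = curr_speeds.get(pid)
        if curr is not None and prev > 0 and curr < prev * drop_ratio:
            return SPECIFIC  # someone slowed down: they may be watching
    return ORDINARY

print(select_content({"a": 1.4, "b": 1.2}, {"a": 1.4, "b": 1.2}))  # ordinary
print(select_content({"a": 1.4, "b": 1.2}, {"a": 1.4, "b": 0.5}))  # specific
```

A deployed system would smooth speeds over several frames so that a single noisy measurement does not trigger the switch.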
20120056804 | Apparatus, Methods And Computer Program Products Providing Finger-Based And Hand-Based Gesture Commands For Portable Electronic Device Applications - A method includes executing a gesture with a user-manipulated physical object in the vicinity of a device; generating data that is descriptive of the presence of the user-manipulated object when executing the gesture; and interpreting the data as pertaining to at least one object, such as an object displayed by the device. | 03-08-2012 |
20120062452 | METHOD AND APPARATUS FOR TEACHING A CHILD WITH AN ELECTRONIC DEVICE - A handheld electronic device that promotes learning by movement. The device includes software that guides the user through various activities via audio and visual cues. The user interacts with the handheld device through movement. | 03-15-2012 |
20120062453 | GESTURE CONTROL SYSTEM - A method for controlling a device comprises: a) providing a mobile device comprising a camera; b) positioning said mobile device such that said camera acquires the image of an operator's hands; c) analyzing the movements of said operator's hands to derive a control command therefrom; and d) transmitting said control command to a controlled device. | 03-15-2012 |
20120062454 | Information Processing System - Provided is an information processing system which includes: a communication device including a sensor that measures information regarding a posture thereof; and an information processing device. The information processing device displays a guide image for causing a user to perform a rotational operation for rotating the communication device, and executes calibration of the sensor by using the measurement result of the sensor acquired from the communication device while the guide image is displayed. The guide image includes an image representing a reference axis to be a rotation center for the rotational operation and an image representing the communication device, and the image representing the communication device is located within the guide image so that a portion of the communication device corresponding to a position of the sensor overlaps with the reference axis. | 03-15-2012 |
20120062455 | MOTION BASED DISPLAY MANAGEMENT - A display manager is configured to handle the drawing of windows on one or more displays for an application differently based on detected motion information that is associated with a device. The display manager may not display windows for some applications while motion is detected, while the display manager may display windows for other applications even when motion is detected. Motion enabled applications may interact with the display manager and motion information to determine how to display windows while motion is detected. | 03-15-2012 |
20120068917 | SYSTEM AND METHOD FOR DYNAMIC GESTURE RECOGNITION USING GEOMETRIC CLASSIFICATION - A gesture recognition system and method that inputs videos of a moving hand and outputs the recognized gesture states for the input sequence. In each image, the hand area is segmented from the background and used to estimate parameters of all five fingers. The system further classifies the hand image as one of the postures in the pre-defined database and applies a geometric classification algorithm to recognize the gesture. The system combines a skin color model with motion information to achieve real-time hand segmentation performance, and considers each dynamic gesture as a multi-dimensional volume and uses a geometric algorithm to classify each volume. | 03-22-2012 |
20120068918 | METHOD AND APPARATUS FOR ELECTRONIC READER OPERATION - Methods and apparatus are provided for operation of an electronic reader. In one embodiment, a method includes detecting a user command to initiate playback of the digital text, detecting a playback setting for the digital text based on the user command, displaying a first portion of the digital text by the electronic reader, and updating the display of the digital text by the electronic reader, wherein a second portion of the digital text is automatically displayed based on the playback setting for the digital text. | 03-22-2012 |
20120068919 | SENSOR - A magnetic attachment mechanism and method is described. The magnetic attachment mechanism can be used to releasably attach at least two objects together in a preferred configuration without fasteners and without external intervention. The magnetic attachment mechanism can be used to releasably attach an accessory device to an electronic device. The accessory device can be used to augment the functionality or usefulness of the electronic device. | 03-22-2012 |
20120068920 | METHOD AND INTERFACE OF RECOGNIZING USER'S DYNAMIC ORGAN GESTURE AND ELECTRIC-USING APPARATUS USING THE INTERFACE - A method of recognizing a user's dynamic organ for use in an electric-using apparatus includes scanning a target image inputted through an imaging element using a window; generating a HOG descriptor of a region of the target image that is scanned when it is judged that the scanned region includes a dynamic organ; measuring a resemblance value between the HOG descriptor of the scanned region and a HOG descriptor of a query template for a gesture of the dynamic organ; and judging that the scanned region includes the gesture of the dynamic organ when the resemblance value meets a predetermined condition. | 03-22-2012 |
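The abstract leaves the resemblance measure between HOG descriptors unspecified; cosine similarity with an acceptance threshold is one plausible stand-in, sketched here with illustrative descriptor values:

```python
import math

# Hedged sketch of the matching step: compare the HOG descriptor of a
# scanned window with that of a query template and accept the gesture
# when the resemblance clears a threshold. Cosine similarity and the
# threshold are assumptions, not the patent's actual measure.

def cosine_resemblance(hog_a, hog_b):
    dot = sum(x * y for x, y in zip(hog_a, hog_b))
    na = math.sqrt(sum(x * x for x in hog_a))
    nb = math.sqrt(sum(y * y for y in hog_b))
    return dot / (na * nb)

def matches_gesture(window_hog, template_hog, threshold=0.9):
    return cosine_resemblance(window_hog, template_hog) >= threshold

template = [0.1, 0.8, 0.3, 0.5]  # toy 4-bin descriptor; real HOGs are longer
print(matches_gesture([0.1, 0.8, 0.3, 0.5], template))  # True (identical)
print(matches_gesture([0.8, 0.1, 0.5, 0.3], template))  # False
```

Real HOG descriptors run to hundreds or thousands of bins, but the comparison step has the same shape.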
20120068921 | WIRELESS VIDEO HEADSET WITH SPREAD SPECTRUM OVERLAY - Enhanced Bluetooth and/or cellular frequency hopping radios are integrated into a hands-free wireless mobile computing and video display headset. Forms of these enhanced headsets incorporating the enhanced frequency hopping spread spectrum radio technology are of interest to military, police, fire fighters, first responders and certain commercial companies such as utility companies seeking private cellular systems seeking enhanced communication privacy. | 03-22-2012 |
20120068922 | Remote Control Functionality Including Information From Motion Sensors - According to some aspects, the invention provides methods and apparatuses for incorporating motion sensors into a full function remote control. In addition to using movement for cursor location on an associated display, the invention can use the motion sensor information in many new and useful ways. As one example, information about movement along the ±Z axis can be used to activate a "zoom in" function when the remote is pointed toward the screen and "zoom out" when it is pulled back. As another example, a remote control incorporating the invention can include controls on two opposite sides, and the motion sensors can be used to activate controls on one side of the device and deactivate controls on the other side based on its orientation. | 03-22-2012 |
20120075173 | APPARATUS AND METHOD FOR USER INPUT - The present invention provides a method, apparatus, and computer program product for providing input to a user device by way of a device that is worn by a user. The method includes receiving sensor information of a device configured to be worn by a user, determining a motion input indicated by the received information such that the motion input relates to motion of the device relative to the user, determining a function based at least in part on the motion input, and causing the function to be performed. The device may be configured to be worn on a device bearing part of the user and the motion input may relate to motion of the device relative to the device bearing part of the user. The device bearing part of the user may be a finger and the device may substantially encircle the finger. | 03-29-2012 |
20120075174 | INPUT APPARATUS AND ELECTRONIC SYSTEM USING SAME - An input apparatus and an electronic system using the same are provided. The input apparatus includes a casing, a sensor and a processing circuit. The casing has a first surface and a second surface arranged on opposite sides. The first surface has a first input interface and the second surface has a second input interface. The sensor is disposed in the casing and is configured for sensing whether one of the first and second surfaces faces a predetermined direction, so as to generate a sensing result. The processing circuit is arranged in the casing and is in communication with the first input interface, the second input interface and the sensor, to determine based on the sensing result whether to perform an operation according to a command inputted from the first input interface or to perform an operation according to a command inputted from the second input interface. | 03-29-2012 |
20120075175 | METHOD AND DEVICE FOR PROVIDING SYSTEM STATUS INFORMATION - The present disclosure provides a method and device for providing system status information. The method comprises: receiving, from an input mechanism associated with a communication device, a request to share system status information; and in response to receiving the request to share the system status information: (i) obtaining system status information associated with the communication device; and (ii) automatically populating one or more portions of an electronic message based on the system status information. The system status information comprises processor usage information. | 03-29-2012 |
20120075176 | METHOD AND APPARATUS OF RECOGNIZING GESTURE WITH UNTOUCHED WAY - Provided is a method of recognizing a gesture, which includes: storing sensing information of a sensor in a case where the sensing information is obtained by sensing an object within a preset distance from the sensor; and recognizing a gesture from the stored sensing information, wherein said storing of sensing information stores the sensing information obtained by the sensor during a preset time after the sensor senses an object within the preset distance. This method allows a terminal and contents to be controlled by recognizing a gesture of a user even though the user does not touch the terminal screen. | 03-29-2012 |
20120075177 | LAPEL MICROPHONE MICRO-DISPLAY SYSTEM INCORPORATING MOBILE INFORMATION ACCESS - A shoulder mounted lapel microphone housing that encloses a microdisplay, a computer, and other communication system components. A microdisplay element is located on or in the microphone housing. Other electronic circuits, such as a microcomputer, one or more wired and wireless interfaces, associated memory or storage devices, auxiliary device mounts and the like are packaged in the microphone housing and/or in an optional pager sized gateway device having a belt clip. Motion, gesture, and/or audio processing circuits in the system provide a way for the user to input commands to the system without a keyboard or mouse. The system provides connectivity to other computing devices such as cellular phones, smartphones, laptop computers, or the like. | 03-29-2012 |
20120075178 | APPARATUS AND METHOD FOR GENERATING DYNAMIC RESPONSE - A dynamic response generating apparatus and method that may analyze an intention of a user based on user input information received from an inputting device, may analyze at least one of first response information with respect to the analyzed intention of the user, context information associated with the user input information, user motion information, and environmental information, may dynamically determine a modality with respect to the first response information, may process the first response information, and may dynamically generate second response information in a form corresponding to the determined modality. | 03-29-2012 |
20120075179 | ELECTRONIC DEVICE - The electronic device has a displaying unit which displays information; an operating unit which accepts a contactless operation by a user; a detecting unit which detects whether or not an object exists inside the area where the contactless operation is possible; and a lighting unit which lights when the detecting unit detects the existence of the object. | 03-29-2012 |
20120075180 | MOBILE ELECTRONIC APPARATUS AND CONTROL METHOD OF MOBILE ELECTRONIC APPARATUS - Provided herein is a mobile electronic apparatus or a control method of the mobile electronic apparatus, the mobile electronic apparatus including an operating unit for performing a character input operation, an acquiring unit that acquires a photographed image of a subject, an analyzing unit that analyzes the photographed image acquired by the acquiring unit and extracts character information contained in the photographed image, and a control unit that detects an input operation as a character input operation when the input operation is performed with the operating unit while the photographed image is being analyzed by the analyzing unit. Accordingly, detecting an input operation in the operating unit as entering characters even during an analysis in the analyzing unit makes it possible to use time efficiently. | 03-29-2012 |
20120081275 | Media Display Device - A media display device is described. In an embodiment the media display device comprises a display screen and at least one loudspeaker held in a housing rotatably mounted on a lid. For example, in a one handed operation a user is able to rotate the housing to open the device and reveal the display screen which is held upwards using the lid as a stand. For example, the action of opening the device is detected by a sensor and triggers the device to randomly select an item of media content and to display that. For example, images, audio clips, contacts or other items that a user has not opened for some time are presented. The device may randomly select the media type in some embodiments. In an example the sensor is provided by a rotary encoder which also provides part of a hinge for mounting the housing and lid. | 04-05-2012 |
20120081276 | PHYSICAL MODEL BASED GESTURE RECOGNITION - A gesture recognition system for recognizing gestures on a mobile device receives sensor data in response to a sensed gesture on the mobile device. The sensor data includes a force or impulse. The force or impulse is applied to the simulated physical object and the state of the simulated physical object is then observed. Input is provided to an application based at least on the observed state of the simulated physical object. | 04-05-2012 |
20120081277 | MULTI-SCREEN USER INTERFACE WITH ORIENTATION BASED CONTROL - Control of a plurality of displays of a computing device in response to the change in orientation of the computing device. The computing device may be a handheld computing device with a plurality of displays that are concurrently visible by a user. The displays may be capable of displaying a graphical user interface (GUI). The plurality of displays may be modified in response to a change in orientation of the handheld computing device. The modification may include expanding a GUI that is displayed in a single display when in a first orientation to occupy at least two of the plurality of displays in response to the change in orientation. | 04-05-2012 |
20120081278 | USER INTERFACE WITH SCREEN SPANNING ICON MORPHING - Methods and apparatus for indicating a status of an application that is displayable on one or more displays of a handheld computing device. An icon may be provided that indicates the status and/or potential statuses of the application (e.g., whether the application is expandable and/or expanded). The icon may be changeable between a first state and a second state depending on the status of the application. The change in the icon from the first state to the second state may be animated along with an animated change of the application between display states. As such, a user may observe the icon to determine the status of the application with respect to the one or more displays (e.g., whether the application is expandable, expanded, or expanding). | 04-05-2012 |
20120081279 | Dynamic Display Adjustment Based on Ambient Conditions - The techniques disclosed herein use a display device, in conjunction with various optical sensors, e.g., an ambient light sensor or image sensors, to collect information about the ambient conditions in the environment of a viewer of the display device. Use of these optical sensors, in conjunction with knowledge regarding characteristics of the display device, can provide more detailed information about the effects the ambient conditions in the viewer's environment may have on the viewing experience. A processor in communication with the display device may create an ambient model based at least in part on the predicted effects of the ambient environmental conditions on the viewing experience. The ambient model may be used to adjust the gamma, black point, white point, or a combination thereof, of the display device's tone response curve, such that the viewer's perception remains relatively independent of the ambient conditions in which the display is being viewed. | 04-05-2012 |
20120081280 | SINGLE-SCREEN VIEW IN RESPONSE TO ROTATION - A multi-screen user device and methods for controlling data displayed thereby are disclosed. Specifically, a gesture sequence is disclosed which enables a user to toggle or shift though applications that are displayed by the multi-screen user device. The gesture sequence may correspond to various rotation or partial rotations of the multi-screen user device. | 04-05-2012 |
20120081281 | INFORMATION DISPLAY APPARATUS FOR MAP DISPLAY - An information display apparatus, including a nonvolatile database memory | 04-05-2012 |
20120081282 | ACCESS OF AN APPLICATION OF AN ELECTRONIC DEVICE BASED ON A FACIAL GESTURE - A method of accessing an application of an electronic device based on a facial gesture is disclosed. In one aspect, a method of an electronic device includes capturing an image of a face of a user through a camera of the electronic device such that an application of the electronic device is accessible through the electronic device based on the image of the face of the user. The facial gesture of the image of the face of the user of the electronic device is determined to be associated with a user-defined facial gesture. The facial gesture of the image of the face of the user is compared with a designated security facial gesture. An access of the application of the electronic device is permitted when the facial gesture of the image of the face of the user of the electronic device matches the designated security facial gesture. | 04-05-2012 |
20120086629 | ELECTRONIC DEVICE HAVING MOVEMENT-BASED USER INPUT AND METHOD - To enhance user control of an electronic device in a simple and intuitive way, the electronic device includes a movement-based user input function that is used to invoke display of a menu and selection of a menu item from the menu. | 04-12-2012 |
20120086630 | USING A PORTABLE GAMING DEVICE TO RECORD OR MODIFY A GAME OR APPLICATION IN REAL-TIME RUNNING ON A HOME GAMING SYSTEM - Methods and systems for enabling a user to interface with an interactive application using a handheld device are provided. According to embodiments of the invention, a primary video stream of an interactive application is rendered on a display. Simultaneously, a data feed of the interactive application is transmitted to a handheld device. The data feed is processed on the handheld device to produce an ancillary video stream which is rendered on the handheld device. Interactive input is received at the handheld device while rendering the ancillary video stream. The interactive input is applied to set a virtual tag which defines an event to be rendered on the display when the state of the interactive application reaches a predetermined configuration so as to trigger execution of the virtual tag by the interactive application. | 04-12-2012 |
20120086631 | SYSTEM FOR ENABLING A HANDHELD DEVICE TO CAPTURE VIDEO OF AN INTERACTIVE APPLICATION - Methods and systems for enabling a handheld device to capture video of an interactive session of an interactive application presented on a main display are provided. An interactive session of the interactive application defines interactivity between a user and the interactive application. An initial position and orientation of a handheld device operated by a spectator are determined. A current state of the interactive application based on the interactivity between the user and the interactive application is determined. The position and orientation of the handheld device are tracked during the interactive session. A spectator video stream of the interactive session based on the current state of the interactive application and the tracked position and orientation of the handheld device is generated. The spectator video stream is rendered on a handheld display of the handheld device. | 04-12-2012 |
20120086632 | ELECTRIC POWER INFORMATION USER INTERFACE, DISPLAY METHOD AND APPARATUS THEREOF - Disclosed herein are an electric power information user interface, an information display method and an apparatus thereof. The interface is used to display the electricity consumption to be detected. The apparatus may display diverse power information through several functional combo keys. The interface is particularly implemented by software. The interface includes a value-display area, a parameter-type area with several changeable symbols or texts, a parameter-unit area indicative of the unit of the value, a statistics-mode area for providing inquiry into the electric power information at different period zones, and a functional-key area. The functional-key area has several multi-functional keys and an energy-count key that is used to generate a reset and a stop command by repeated triggering. The energy-count key particularly provides the electric power information within a certain period. | 04-12-2012 |
20120086633 | Method For Measuring Distance Between Coordinate Indicator and Coordinate Input Device - The invention relates to a method for measuring a distance between a coordinate indicator and a coordinate input device, which belongs to the technical field of computer peripheral equipment. The method comprises the steps of measuring an actual amplitude of an output signal of a receiving coil in the electromagnetic induction plate by a measurement circuit in the electromagnetic induction plate; and determining a distance between the coordinate indicator and the coordinate input device in accordance with the actual amplitude of the output signal, a nominal amplitude of the output signal, and a preset reference amplitude of signal. In the present invention, the distance between the coordinate indicator and the coordinate input device is determined by converting an actual amplitude of the output signal of the receiving coil at one time to a nominal amplitude of the output signal normalized by a reference amplitude of signal, which overcomes the defect of the prior art, which cannot indicate a distance between the coordinate indicator and the coordinate input device. | 04-12-2012 |
20120086634 | MULTI-DIRECTION INPUT DEVICE - Provided is a multi-direction input device. The multi-direction input device includes a manipulation member moved in a multi-direction by a user's manipulation, a hinge part disposed to surround at least one side of the manipulation member, the hinge part returning to an initial position of the manipulation member by elasticity, and a fixing member configured to fix at least one portion of the hinge part. | 04-12-2012 |
20120092244 | ELECTRONIC CONTROL MODULE INTERFACE SYSTEM FOR A MOTOR VEHICLE - An electronic control module interface system may include a sensing surface having a symbol representative of a function of a motor vehicle, a sensing device coupled to the sensing surface, configured to sense a proximity between a physical pointer and the sensing surface and to generate a browsing signal when the proximity is less than a predefined proximity threshold value, and configured to generate a selection signal when the physical pointer contacts or depresses the sensing surface, and an output device coupled to the sensing device, and configured to produce an output representative of the function of the motor vehicle upon receiving the browsing signal. | 04-19-2012 |
20120092245 | MOBILE DEVICE WITH ROTATABLE PORTION FOR INPUT MECHANISM - Disclosed herein is a mobile device comprising a base providing a display and a rotatable portion providing an input mechanism. The rotatable portion is rotatable about a coupling between at least a first position in which the rotatable portion covers a part of the display and a second position in which the rotatable portion exposes more of the display while leaving the input mechanism accessible to a user. In at least one variant embodiment, the rotatable portion may be rotatable to a third position “behind” the base with the input mechanism remaining accessible to the user. | 04-19-2012 |
20120092246 | ELECTRONIC APPARATUS AND CONTROL METHOD THEREOF, AND EXTERNAL APPARATUS AND CONTROL METHOD THEREOF - An electronic apparatus and a control method thereof, and an external apparatus and a control method thereof. The electronic apparatus includes a communication unit which communicates with an external apparatus, and a controller which performs an operation corresponding to a key signal received through the communication unit, receives a setup signal through the communication unit to reset up an operation corresponding to a first key signal, and performs a second operation based on a result of the reset up when the first key signal is received through the communication unit. The setup signal sets up the first operation corresponding to the first key signal to be replaced by the second operation different from the first operation. | 04-19-2012 |
20120092247 | COMPUTER INPUT DEVICE - Computer input devices are described herein for use in manipulating digital images on a display apparatus. | 04-19-2012 |
20120092248 | METHOD, APPARATUS, AND SYSTEM FOR ENERGY EFFICIENCY AND ENERGY CONSERVATION INCLUDING DYNAMIC USER INTERFACE BASED ON VIEWING CONDITIONS - In general, in one aspect, a viewing configuration detector collects data for a viewing area of a consumer electronics device and determines a viewing configuration based on the collected data. A dynamic user interface controller determines a perceived appropriate configuration for the content presented on the consumer electronics device based on the viewing configuration and, if necessary, modifies the configuration of the content. The modifying is done without receiving input to make the change from a user and may be done to conserve power or enhance user experience. The viewing configuration detector may include a light transmitter to transmit light in the direction of the user and a light receiver to receive reflected light and determine distance and/or location of the user based thereon. The viewing configuration detector may include a camera to capture images of the viewing area and image recognition functionality to detect the user and different attributes associated therewith. | 04-19-2012 |
20120092249 | Accessing Accelerometer Data - Systems and processes for accessing acceleration data may include an accelerometer coupled to a nonvolatile memory. The nonvolatile memory may be coupled to a processor. Acceleration data may be obtained from the accelerometer via a bus coupling the nonvolatile memory to the accelerometer. Acceleration data may be sent from the nonvolatile memory to a processor. One or more operations may be performed based on the acceleration data. | 04-19-2012 |
20120092250 | FINGER-OPERATED INPUT DEVICE - A finger-operated input device connectable to a digital device is disclosed. The aforesaid input device comprises (a) a sensing pad being responsive to displacement of the finger substantially parallel to the pad in a reach range thereof and transmitting electrical signals in response thereto; and (b) an analyzing unit arranged to receive said electrical signals, calculate data defining the trajectory of the finger, and transmit the data to the digital device. The sensing pad comprises at least three sensors perceptive to motion of the finger relative to the sensors. | 04-19-2012 |
20120092251 | OPERATION SYSTEM FOR VEHICLE - An operation system for a vehicle includes, a display unit; a proximity detecting unit that detects that a part of a human body approaches the display unit; a position specifying unit that specifies positions on the display unit where the part of the human body approached on the basis of a detection result of the proximity detecting unit; and a display controlling unit that makes an icon display screen, which displays icons corresponding to respective operation items of a plurality of in-vehicle devices, and a selection screen, which selects set values or operation modes of the operation items corresponding to the icons which are displayed at the positions when the positions are specified by the position specifying unit, are displayed on the display unit; wherein the icons are formed of at least characters or figures representing the set values or the operation modes that are currently selected, and wherein the display controlling unit makes the icon display screen, on which all operable icons are displayed, be displayed on one screen of the display unit. | 04-19-2012 |
20120105312 | User Input Device - A user input device is described. In an embodiment the user input device is hand held and comprises a sensing strip to detect one-dimensional motion of a user's finger or thumb along the sensing strip and to detect position of a user's finger or thumb on the sensing strip. In an embodiment the sensed data is used for cursor movement and/or text input at a master device. In an example the user input device has an orientation sensor and orientation of the device influences orientation of a cursor. For example, a user may move the cursor in a straight line in the pointing direction of the cursor by sliding a finger or thumb along the sensing strip. In an example, an alphabetical scale is displayed and a user is able to zoom into the scale and select letters for text input using the sensing strip. | 05-03-2012 |
20120105313 | PROJECTION DEVICE HAVING DISPLAY CONTROL FUNCTION AND METHOD THEREOF - A projection device having a display control function is provided. The projection device includes a document unit, a display unit, a lens module and a sensor unit. The sensor unit senses the distance and direction of movement of the projection device. The document unit scrolls through content of an opened document according to the distance and direction of the sensed movement of the projection device from the sensor unit to select the content to be displayed, and controls display of the selected content of a currently opened document on the display unit. The lens module projects the selected content currently displayed by the display unit. A method with a display control function is also provided. | 05-03-2012 |
20120105314 | METHOD AND SYSTEM FOR INPUTTING INFORMATION USING ULTRASONIC SIGNALS - Provided are an information input system and an information input method using an ultrasonic signal. In the information input system or method, a signal generation apparatus for generating a reference signal and an ultrasonic signal or an information input apparatus for generating a reference signal receives a reference signal and an ultrasonic signal which are generated by a different signal generation apparatus which inputs information in an adjacent area or the information input apparatus, determines whether or not there is a possibility of interference of the signals generated by the different signal generation apparatus or the information input apparatus, and adaptively changes an effective section where the reference signal and the ultrasonic signal are generated to an idle section in the same period or the next period if it is determined that there is a possibility of the interference, so that the signal interference is prevented. Therefore, even in the environment where a plurality of the information input systems are present, it is possible to stably input information. | 05-03-2012 |
20120105315 | Virtual Controller For Visual Displays - Virtual controllers for visual displays are described. In one implementation, a camera captures an image of hands against a background. The image is segmented into hand areas and background areas. Various hand and finger gestures isolate parts of the background into independent areas, which are then assigned control parameters for manipulating the visual display. Multiple control parameters can be associated with attributes of multiple independent areas formed by two hands, for advanced control including simultaneous functions of clicking, selecting, executing, horizontal movement, vertical movement, scrolling, dragging, rotational movement, zooming, maximizing, minimizing, executing file functions, and executing menu choices. | 05-03-2012 |
20120105316 | Display Apparatus - The display apparatus has a displaying unit which displays an image; a first specifying unit which specifies the direction of a person in images captured by a plurality of imaging apparatuses arranged at different positions; and a second specifying unit which specifies one of the plurality of imaging apparatuses for capturing and displaying an image, in accordance with a direction instructed by the user and the direction specified by the first specifying unit. | 05-03-2012 |
20120105317 | MOBILE ELECTRONIC DEVICE - According to an aspect, a mobile electronic device includes a first display unit, a second display unit, an input unit, and a control unit. The first display unit displays a first image. The second display unit displays a second image. To the input unit, an instruction is input. The control unit causes the second display unit to display the first image, as the second image, when a first period of time has passed since the first image was displayed by the first display unit. | 05-03-2012 |
20120112994 | Interaction Techniques for Flexible Displays - The invention relates to a set of interaction techniques for obtaining input to a computer system based on methods and apparatus for detecting properties of the shape, location and orientation of flexible display surfaces, as determined through manual or gestural interactions of a user with said display surfaces. Such input may be used to alter graphical content and functionality displayed on said surfaces or some other display or computing system. | 05-10-2012 |
20120112995 | Information Processing Apparatus, Information Processing Method, and Computer-Readable Storage Medium - A method is provided for generating a command to perform a predetermined operation. The method comprises acquiring at least a first input and a second input from among a plurality of inputs. The method further comprises determining first semantic information associated with the first input. The method also comprises determining second semantic information associated with the second input. The method also comprises generating a command to perform a predetermined operation, based on a combination of the determined first and second semantic information. | 05-10-2012 |
20120112996 | INTERACTIVE POINTING DEVICE - An interactive pointing device having a pointing function in space and game control is provided in the present disclosure. The interactive pointing device comprises an accelerometer module, a dual-axis gyroscope device, and a micro processing unit. The accelerometer module senses the movement of the operator and generates at least one axis of accelerating signal corresponding to the sensed movement. The dual-axis gyroscope device senses the rotation status of the interactive pointing device about dual axes and generates a corresponding rotating signal. The micro processing unit processes the at least one axis of accelerating signal and the rotating signal so as to interact with an electronic device accordingly. The micro processing unit is able to use the rotating signal to compensate the accelerating signal and assist an action evaluation process of the interactive pointing device operating under different operation modes. | 05-10-2012 |
20120112997 | USER INTERFACE METHOD AND APPARATUS FOR DATA FROM DATA CUBES AND PIVOT TABLES - Systems, methods, and computer readable media provide space-efficient user interfaces to data cubes and pivot table information. Because the user interfaces are more efficient in usage of display area, smaller displays can be used more effectively in reviewing such data. The user interfaces provide a multi-dimensional navigation approach among dimensions represented in the data, which allows users to more easily maintain context when reviewing large pivot table reports, and the like. Other user interface features that ease review of such reports on smaller devices also are disclosed. | 05-10-2012 |
20120119984 | HAND POSE RECOGNITION - Hand pose recognition comprises determining an initial hand pose estimate for a captured input hand pose and performing iterations based upon hand pose estimates and the residues between those estimates and the captured input hand pose. One or more control signals are generated based upon the hand pose recognition. | 05-17-2012 |
20120119985 | METHOD FOR USER GESTURE RECOGNITION IN MULTIMEDIA DEVICE AND MULTIMEDIA DEVICE THEREOF - A display device includes a first sensor to detect a first image of a person and a second sensor to detect a second image of the person. A storage device stores first information and second information, where the first information identifies a plurality of gestures mapped to respective ones of a plurality of functions identified by the second information. A processor recognizes a gesture of the person based on the first and second images, and performs a function corresponding to the recognized gesture based on the first and second information. | 05-17-2012 |
20120119986 | THREE-DIMENSIONAL CONTROL APPARATUS OF COMPUTER INPUT DEVICE AND METHOD THEREOF - A three-dimensional (3D) control apparatus for a computer input device, used to sense 3D motion relations to output multiple 3D control signals, includes a first element comprising a plurality of sensing units and a second element comprising a plurality of transmitting coils. Each sensing unit of the first element has two adjacent receiving coils. The second element is capable of performing a 3D multi-axial motion with respect to the first element to generate multiple 3D control signals according to relative position relations between the plurality of transmitting coils and the plurality of receiving coils, so as to perform 3D control. | 05-17-2012 |
20120119987 | METHOD AND APPARATUS FOR PERFORMING GESTURE RECOGNITION USING OBJECT IN MULTIMEDIA DEVICES - According to an embodiment of the present invention, a gesture recognition method for use in a multimedia device includes capturing, via an image sensing unit of the multimedia device, a peripheral image, recognizing a first object contained in the captured peripheral image and a gesture made using the first object, mapping a multimedia device operation to the gesture, and entering into an input standby mode associated with the gesture. | 05-17-2012 |
20120119988 | IMAGE RECOGNITION APPARATUS, OPERATION DETERMINING METHOD AND COMPUTER-READABLE MEDIUM - Accurate determination of an operation is made possible. Data photographed by a video camera is read by an image reading unit, and an image of an operator is extracted from the data by an image extracting unit. As a result of such preparation, a virtual operation screen and an operation region are created based upon the extracted image of the operator. In the case of an adult operator, the operation region can be created in consideration of the height (position of the sight line) and the length of an arm, and in the case of a child, since the height is lower and the arm is shorter, the operation region can be set to match. | 05-17-2012 |
20120127069 | Input Panel on a Display Device - A device including a sensor to detect a holding position of a user of the device, an orientation sensor to detect an orientation of the device, and a controller to render an input panel on at least one location of a display device based on the holding position of the user and the orientation of the device. | 05-24-2012 |
20120127070 | CONTROL SIGNAL INPUT DEVICE AND METHOD USING POSTURE RECOGNITION - Provided are a control signal input device and method using posture recognition. More particularly, the present invention relates to a control signal input device including: a database unit storing predetermined system control commands corresponding to postures of combinations of one or more of an arm, a wrist, and fingers of a user; a sensing unit sensing a posture of a combination of the arm, wrist, and fingers of the user; and a control signal generating unit extracting a system control command corresponding to the sensed result of the sensing unit from the database unit and generating a control signal for controlling the system, and a control signal input method using the same. | 05-24-2012 |
20120127071 | Haptic Feedback to Abnormal Computing Events - A computer-implemented tactile feedback method includes receiving user input on a computing device, identifying a term input by the user that does not match a term known to the device, accessing an auto-correction service in order to provide a replacement for the term, and energizing a haptic feedback device in response to identifying the term input by the user that does not match a known term. | 05-24-2012 |
20120127072 | CONTROL METHOD USING VOICE AND GESTURE IN MULTIMEDIA DEVICE AND MULTIMEDIA DEVICE THEREOF - A multimedia device and a method for controlling the same are disclosed, in which the voice and gesture of a user are recognized by the multimedia device to allow the user to execute a desired operation. The method includes enabling input of a gesture and a voice through a remote controller; receiving the user's gesture and voice through the remote controller; identifying a first command associated with the received gesture; identifying a second command associated with the received voice; comparing the first command and the second command to each other; and performing a function associated with the first or second command when the comparing step indicates that the first command corresponds to the second command. The multimedia device thus executes the operation desired by the user. | 05-24-2012 |
20120133579 | GESTURE RECOGNITION MANAGEMENT - A system and method for managing the recognition and processing of gestures. A system provides a mechanism to detect conflicts between gesture recognizers and resolve the conflicts. A runtime system receives notifications from gesture recognizers in the form of requests for resources or actions. A conflict detector determines whether a conflict with another gesture recognizer exists. If a conflict exists, a conflict resolver determines a resolution. This may include determining a winning gesture recognizer and deactivating the losing gesture recognizers. A design time system statically validates gesture recognizers based on static state machines corresponding to each gesture recognizer. | 05-31-2012 |
20120133580 | SYSTEM AND METHOD FOR GESTURE INTERFACE CONTROL - A method is provided in one example and includes generating a histogram associated with at least one object; receiving image data; comparing the image data to the histogram in order to determine if at least a portion of the image data corresponds to the histogram; identifying a pose associated with the object; and triggering an electronic command associated with the pose. In more particular embodiments, the image data is evaluated in order to analyze sequences of poses associated with a gesture that signals the electronic command to be performed. | 05-31-2012 |
20120133581 | HUMAN-COMPUTER INTERACTION DEVICE AND AN APPARATUS AND METHOD FOR APPLYING THE DEVICE INTO A VIRTUAL WORLD - A human-computer interaction device and an apparatus and method for applying the device into a virtual world. The human-computer interaction device is disposed with a sensing device thereon, the sensing device including a manipulation part and a distance sensor. The manipulation part receives a manipulation action of a user's finger; the distance sensor senses a distance of the manipulation part relative to a fixed location and generates a distance signal for characterizing the manipulation action. A virtual world assistant apparatus and a method corresponding to the assistant apparatus are also provided. With the invention, multiple manipulation signals can be sensed, and free control over the actions of an avatar can be realized by using the multiple signals. | 05-31-2012 |
20120139826 | User Control of the Display of Matrix Codes - An electronic device determines to transmit an image including a matrix code to a display, receives input specifying to alter the matrix code, generates an updated image according to the input, and transmits the updated image to the display. The device may alter a size and/or position of the matrix code, a display duration and/or complexity of the matrix code, and so on. The device may generate the matrix code and modify it in response to input, receive different matrix code versions and select a different version in response to input, receive the image including the matrix code and generate a replacement to overlay over the image, and so on. Additionally, independent of input, the device may receive an image, detect an included first matrix code, generate a second matrix code based on the first, and generate an updated image by adding the second matrix code to the image. | 06-07-2012 |
20120139827 | METHOD AND APPARATUS FOR INTERACTING WITH PROJECTED DISPLAYS USING SHADOWS - A method, computer readable medium and apparatus for interacting with a projected image using a shadow are disclosed. For example, the method projects an image of a processing device to create the projected image, and detects a shadow on the projected image. The method interprets the shadow as a display formatting manipulation command and sends the display formatting manipulation command to an application of the processing device to manipulate a display format of the projected image. | 06-07-2012 |
20120139828 | Communication And Skills Training Using Interactive Virtual Humans - A system for providing interaction between a virtual human and a user, the system comprising: a tangible interface providing a physical interface between the user and the virtual human; an imaging system directed towards the physical interface to provide images of the user interacting with the tangible interface; a tracking system tracking at least one position of the user; a microphone capturing speech from the user; a simulation system receiving inputs from the tangible interface, the imaging system, the tracking system and the microphone, the simulation system generating output signals corresponding to the virtual human; and a display presenting the output signals to the user. | 06-07-2012 |
20120139829 | DISPLAY DEVICE - A technique that can simply determine the pass/fail status of signal interconnects with respect to chipping or cracking of a substrate, without increasing the number of manufacturing steps. | 06-07-2012 |
20120139830 | APPARATUS AND METHOD FOR CONTROLLING AVATAR USING EXPRESSION CONTROL POINT - An apparatus and method for controlling an avatar using expression control points are provided. The apparatus may track positions of feature points and a head movement from an expression of a user, remove the head movement from the positions of the feature points, and generate expression control points. Accordingly, the expression of the user may be reflected in the face of the avatar. | 06-07-2012 |
20120139831 | METHOD AND APPARATUS FOR SWITCHING INPUT METHODS OF A MOBILE TERMINAL - The present invention provides a method and apparatus for switching input methods of a mobile terminal, the method comprising: Step A: receiving an operation instruction from a key to call out an input method selecting interface; Step B: reading a selection of an input method in the input method selecting interface; Step C: entering an editing environment of the selected input method; Step D: judging whether a plurality of input characters, which are not yet interpreted before switching the input method, are meaningful to the input method after switching; and Step E: if the plurality of input characters are meaningful, interpreting the input characters which are not yet interpreted by using the input method after switching. Through the present invention, a user can input information into a mobile terminal more conveniently and rapidly, and the input speed and user experience are improved. | 06-07-2012 |
20120139832 | Head Pose Assessment Methods And Systems - Improvements are provided to effectively assess a user's face and head pose such that a computer or like device can track the user's attention towards a display device(s). Then the region of the display or graphical user interface that the user is turned towards can be automatically selected without requiring the user to provide further inputs. A frontal face detector is applied to detect the user's frontal face, and then key facial points such as left/right eye center, left/right mouth corner, nose tip, etc., are detected by component detectors. The system then tracks the user's head by an image tracker and determines yaw, tilt and roll angle and other pose information of the user's head through a coarse-to-fine process according to key facial points and/or confidence outputs by a pose estimator. | 06-07-2012 |
20120146890 | HAPTIC ROCKER BUTTON FOR VISUALLY IMPAIRED OPERATORS - In some embodiments, a device includes a converter, a haptic rocker button, a sensor, and a controller. In some embodiments, a method includes receiving data in a mobile device, converting the data into Braille content, presenting the Braille content on a haptic rocker button, and controlling a horizontal movement of the Braille content on the haptic rocker button. | 06-14-2012 |
20120146891 | ADAPTIVE DISPLAYS USING GAZE TRACKING - Methods and systems for adapting a display screen output based on a display user's attention. Gaze direction tracking is employed to determine a sub-region of a display screen area to which a user is attending. Display of the attended sub-region is modified relative to the remainder of the display screen, for example, by changing the quantity of data representing an object displayed within the attended sub-region relative to an object displayed in an unattended sub-region of the display screen. | 06-14-2012 |
20120146892 | INFORMATION INPUT DEVICE AND INFORMATION INPUT METHOD - An information input device includes: an applying element having a ring shape so that a finger of a user is inserted into the applying element; a receiving element having a ring shape so that a finger of the user is inserted into the receiving element, and disposed adjacent to the applying element in an extending direction of a center line of the applying element; a signal generating element generating and transmitting a waveform signal to the applying element so that the applying element outputs a measurement signal; and a signal extracting element extracting a signal relating to a posture of the finger from a reception signal, which is output from the receiving element based on the measurement signal received by the receiving element. | 06-14-2012 |
20120146893 | INTERCHANGEABLE OVERLAY FOR AMUSEMENT DEVICES - An overlay for a nonportable amusement device is provided. The nonportable amusement device has a housing, a display, a memory and a controller. The overlay includes one or more panels selectively attachable to and removable from the housing of the nonportable amusement device. | 06-14-2012 |
20120146894 | MIXED REALITY DISPLAY PLATFORM FOR PRESENTING AUGMENTED 3D STEREO IMAGE AND OPERATION METHOD THEREOF - Various 3D image display devices divide and share a physical space for expressing a 3D image, and real-time contents information is generated based on user information and information on the divided space and displayed together using various 3D image display devices to present a 3D image naturally in a deeper, wider, and higher space. A mixed reality display platform includes an input/output controller controlling display devices including 3D display devices, an advance information manager establishing 3D expression space for each display device to divide or share a physical space by collecting spatial establishment of the display device, and a real-time information controller generating real-time contents information using user information and 3D contents for a virtual space. The input/output controller distributes the real-time contents information to each display device based on the 3D expression spatial information. | 06-14-2012 |
20120146895 | ARRANGEMENT, METHOD AND COMPUTER PROGRAM FOR CONTROLLING A COMPUTER APPARATUS BASED ON EYE-TRACKING - A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request. | 06-14-2012 |
20120146896 | Continuous Determination of a Perspective - In a method and operating element for establishing an angle of view for an observer with respect to a two- or three-dimensional object displayed on an output device, the angle of view is established by control on a simple circular disc. A point on the disc is converted to a position on a virtual sphere. The respective angle of view with respect to the object is established by the axis determined by the calculated position on the virtual sphere and the sphere center. | 06-14-2012 |
20120146897 | THREE-DIMENSIONAL DISPLAY - A light ray controller has a circular frustum shape and is fitted in a circular hole of a top board such that its large diameter bottom opening faces upward. The light ray controller transmits a light ray while diffusing the light ray in a ridgeline direction and transmits the light ray straightforward without diffusing the light ray in a circumferential direction. A rotation module is provided under the table. One or more scanning projectors are provided on a circumference around an axis of the light ray controller on the rotation base of the rotation module. One or more scanning projectors are rotated by the rotation module. The controller controls one or more scanning projectors being rotated based on three-dimensional shape data stored in a storage. | 06-14-2012 |
20120154264 | USER INTERFACES FOR DIALYSIS DEVICES - In general, a dialysis device includes a first processing device for monitoring dialysis functions of the dialysis device, a second processing device, a display device, and memory. The memory is configured to store instructions that, when executed, cause the dialysis device to provide, on the display device, a first display region and a second display region, where the first display region is associated with the first processing device and the second display region is associated with the second processing device. At least a portion of the first display region cannot be obscured by the second display region. | 06-21-2012 |
20120154265 | MOBILE TERMINAL AND METHOD OF CONTROLLING A MODE SCREEN DISPLAY THEREIN - A mobile terminal including a communication unit configured to communicate with at least one external terminal; a memory configured to store at least a first and second operating system including at least first and second modes, respectively; and a controller configured to execute the first operating system and to activate the first mode corresponding to the first operating system, to display a first information screen corresponding to the activated first mode on a display of the mobile terminal, to receive an event signal indicating an event related to the second mode has occurred on the mobile terminal, and to selectively display event information related to the event of the second mode on a display of the at least one external terminal. | 06-21-2012 |
20120154266 | APPARATUS AND METHOD FOR CONTROLLING DATA IN PORTABLE TERMINAL - Provided is an apparatus and method for controlling data in a portable terminal, which can control data in the portable terminal while minimizing the motion of both hands of a user of the portable terminal. An apparatus for controlling data in a portable terminal includes a plurality of sensors disposed at the sides of the portable terminal to detect data control actions, and a control unit for controlling the data according to the data control actions detected by the plurality of sensors. | 06-21-2012 |
20120154267 | Sphere-Like Input Device - Embodiments of the invention provide a human interface device including an inner sphere, wherein the inner sphere has a center point. The human interface device can further include an outer sphere, and the outer sphere may be compressible. The human interface device may also include a plurality of pressure sensors between the inner sphere and the outer sphere for detecting localized compression of the outer sphere, a first three-axis accelerometer located within the inner sphere, and a second three-axis accelerometer located within the inner sphere, wherein the first three-axis accelerometer and the second three-axis accelerometer are each located at least a predetermined distance from the center point. | 06-21-2012 |
20120154268 | REMOTE CONTROL SYSTEMS THAT CAN DISTINGUISH STRAY LIGHT SOURCES - Remote control systems that can distinguish predetermined light sources from stray light sources, e.g., environmental light sources and/or reflections are provided. The predetermined light sources can be disposed in asymmetric substantially linear or two-dimensional patterns. The predetermined light sources also can be configured to exhibit signature characteristics. The predetermined light sources also can output light at different signature wavelengths. The predetermined light sources also can emit light polarized in one or more predetermined polarization axes. Remote control systems of the present invention also can include methods for adjusting an allocation of predetermined light sources and/or the technique used to distinguish the predetermined light sources from the stray light sources. | 06-21-2012 |
20120154269 | COORDINATE INFORMATION UPDATING DEVICE AND COORDINATE INFORMATION GENERATING DEVICE - An object can be displayed on a screen of a two-dimensional coordinate system based on xyz-coordinate values of the object in a three-dimensional coordinate system, operation information of a two-dimensional coordinate system with respect to the object can be received from an input device, and whether the operation information is in accordance with a predetermined rule or not is determined. If the operation information is not in accordance with the predetermined rule, xy-coordinate values of the object can be updated in accordance with the operation information. If the operation information is in accordance with the predetermined rule, a z-coordinate value of the object can be updated in accordance with the operation information. | 06-21-2012 |
20120154270 | DISPLAY DEVICE - Disclosed is a display device provided with a display panel ( | 06-21-2012 |
20120154271 | METHOD, MEDIUM AND APPARATUS FOR BROWSING IMAGES - A method, medium and apparatus for browsing images are disclosed. The method for browsing images includes sensing acceleration imparted to a portable digital device, and moving an image onto a display area in accordance with a tilt angle of the portable digital device if the sensed acceleration is greater than a first threshold value. | 06-21-2012 |
20120154272 | MOTION-BASED TRACKING - Techniques are disclosed for determining a user's motion in relation to displayed images. According to one general aspect, a first captured image is accessed. The first captured image includes (1) a first displayed image produced at a first point in time, and (2) a user. A second captured image is accessed. The second captured image includes (1) a second displayed image produced at a second point in time, and (2) the user. First information indicating motion associated with one or more objects in the first and second displayed images is accessed. Second information indicating both motion of the user and the motion associated with the one or more objects in the first and second displayed images is determined. | 06-21-2012 |
20120162057 | SENSING USER INPUT USING THE BODY AS AN ANTENNA - A human input system is described herein that provides an interaction modality that utilizes the human body as an antenna to receive electromagnetic noise that exists in various environments. By observing the properties of the noise picked up by the body, the system can infer human input on and around existing surfaces and objects. Home power lines have been shown to be a relatively good transmitting antenna that creates a particularly noisy environment. The human input system leverages the body as a receiving antenna and electromagnetic noise modulation for gestural interaction. It is possible to robustly recognize touched locations on an uninstrumented home wall using no specialized sensors. The receiving device for which the human body is the antenna can be built into common, widely available electronics, such as mobile phones or other devices the user is likely to commonly carry. | 06-28-2012 |
20120162058 | SYSTEMS AND METHODS FOR SHARED DISPLAY IN A HYBRID ENVIRONMENT - Embodiments for operating shared peripherals in a hybrid computing system are described. Embodiments control one or more shared peripheral devices variously between a primary system and a secondary system via one or more communication links, where the secondary system is detachable from the primary system, operating as an independent computing device in the disconnected state and as a display device in the connected state. | 06-28-2012 |
20120162059 | HANDWRITING INPUT DEVICE WITH CHARGER - A handwriting input device includes a hollow charger and a handwriting pen. The charger includes a number of first contacts and a connecting unit. The handwriting pen includes a number of second contacts, a communication unit, a light source to emit light, an optical sensor, and a microcontroller. When the charger is connected to an external electronic device through the connecting unit, and the handwriting pen is inserted into the charger so that each second contact contacts one first contact, the handwriting pen can obtain power from the external electronic device. When the charger is moved together with the handwriting pen being charged, the optical sensor converts the light reflected from a reflective surface under the charger into electrical signals. The microcontroller generates and transmits cursor control signals or track control signals to the external electronic device. | 06-28-2012 |
20120162060 | LASER NAVIGATION MODULE - Disclosed herein is a laser navigation module, including: a light source radiating a laser beam; a housing provided with a window transmitting and reflecting the laser beam radiated from the light source and shielding the introduction of visible rays and provided with a transparent part or a translucent part to radiate light radiated from the inside thereof to the outside; a lighting device mounted in the housing; and a light diffusing member transferring the light radiated from the lighting device and having a groove part formed at an edge of an opposite side of the lighting device. | 06-28-2012 |
20120162061 | ACTIVATION OBJECTS FOR INTERACTIVE SYSTEMS - An interactive system can include a display device, a control panel, a projector, a processing device, and an input device. The input device can detect indicia of its posture relative to the control panel or a display surface of the display device, so that the interactive system can recognize interactions between the input device and these components. Interactions between the input device and the display surface can be captured, analyzed by the processing device, and then represented in an image projected onto the display surface. The control panel can comprise a plurality of non-projected, tactile activation objects, each of which can correspond to an activity or function of the interactive system, such as, for example, powering on the projector. When the interactive system detects an interaction between the input device and an activation object, the activity corresponding to the selected activation object can be performed. | 06-28-2012 |
20120162062 | ELECTRONIC BOOK READING DEVICE WITH AIRFLOW SENSOR - An electronic book reading device includes a housing, an airflow sensor and a controller. The airflow sensor includes a turbine, a shaft fixed to the turbine and a tachometer. The turbine is configured for sensing an airflow and rotating with and about the shaft. The tachometer is connected to the shaft and configured for detecting a rotation speed of the turbine. The controller is configured for determining whether the detected rotation speed of the turbine falls in a predetermined range, and controlling the electronic book reading device to execute a predetermined operation if the detected rotation speed of the turbine falls in the predetermined range. | 06-28-2012 |
20120162063 | INFORMATION DISPLAY APPARATUS, INFORMATION DISPLAY METHOD, AND STORAGE MEDIUM STORING PROGRAM FOR DISPLAYING INFORMATION - An information display method includes inputting a designated point in each of the series of image data items of the image file, which is sequentially designated as the analysis target point by a user's manipulation, every time each of the series of image data items of the image file is sequentially displayed on the display screen; displaying each of the series of designated points, which are sequentially input, at each input position; sequentially displaying each of the series of image data items stored in the designated image file along with each of the series of designated points while each of the series of designated points is displayed; and displaying a designated point after a change in a display mode different from that of a designated point before the change, every time a transition direction of each of the series of designated points changes. | 06-28-2012 |
20120162064 | COORDINATE SENSOR AND DISPLAY DEVICE - The amount of light emitted from a light-emitting diode is changed from that of the initial condition so as to increase, among the light sensors that are not shielded from the light emitted from the light-emitting diode, the number of light sensors at which the difference (C−D) between the light detection amount at the light sensor when the light-emitting diode is ON and the light detection amount when the light-emitting diode is OFF is equal to the difference (A−B) between the corresponding light detection amounts in the initial condition with no detection object present. As a result, the coordinate sensor and the display device disclosed can establish, regardless of the presence or absence of the detection object, a threshold for determining the presence or absence of the detection object, and can also determine the coordinates of the detection object in a stable manner regardless of changes in the ambient environmental light or temperature, fluctuations in the amount of light emitted from the light-emitting element disposed in the coordinate sensor, or changes in the sensitivity of the light-receiving elements. | 06-28-2012 |
20120162065 | SKELETAL JOINT RECOGNITION AND TRACKING SYSTEM - A system and method are disclosed for recognizing and tracking a user's skeletal joints with a NUI system and further, for recognizing and tracking only some skeletal joints, such as for example a user's upper body. The system may include a limb identification engine which may use various methods to evaluate, identify and track positions of body parts of one or more users in a scene. In examples, further processing efficiency may be achieved by segmenting the field of view in smaller zones, and focusing on one zone at a time. Moreover, each zone may have its own set of predefined gestures which are recognized. | 06-28-2012 |
20120162066 | METHODS AND SYSTEMS FOR PROVIDING SENSORY INFORMATION TO DEVICES AND PERIPHERALS - Peripherals and data processing systems are disclosed which can be configured to interact based upon sensor data. In at least certain embodiments, a method for sensing motion and orientation information for a device includes receiving a motion event from at least one sensor located in a device. The method further includes determining an orientation for a display of the device. The method further includes determining whether the device is currently moving. The method further includes determining whether the device moves within an angle with respect to a ground reference for a first time period. The method further includes switching the orientation of the display of the device if the device moves in excess of the angle. | 06-28-2012 |
20120169582 | SYSTEM READY SWITCH FOR EYE TRACKING HUMAN MACHINE INTERACTION CONTROL SYSTEM - The invention relates to a system and method for activating a visual control interface, and in particular, for activating a visual control interface using an eye tracking system in a vehicle. A switch is used to activate and deactivate a control section of an eye tracking system in a human-machine interaction control system. The system allows a driver (or operator) of a vehicle to signal the system with selection of the switch to activate or deactivate the control section, thereby providing functional support to the driver when desired, but remaining inconspicuous otherwise. | 07-05-2012 |
20120169583 | SCENE PROFILES FOR NON-TACTILE USER INTERFACES - A method, including capturing an image of a scene including one or more users in proximity to a display coupled to a computer executing a non-tactile interface, and processing the image to generate a profile of the one or more users. Content is then selected for presentation on the display responsively to the profile. | 07-05-2012 |
20120169584 | AIR CONDITIONING APPARATUS AND A METHOD FOR CONTROLLING AN AIR CONDITIONING APPARATUS - An air conditioning apparatus and a method for controlling an air conditioning apparatus are provided. The method may include turning on an image capturing device provided in an indoor device; recognizing a gesture identifier moved in front of the image capturing device; inputting a gesture according to a movement of the gesture identifier; analyzing the input gesture; extracting operating condition(s) corresponding to the analyzed gesture; and driving the indoor device according to the extracted operating condition(s). | 07-05-2012 |
20120169585 | ELECTRONIC SHELF LABEL AND METHOD OF DISPLAYING REMAINING BATTERY LIFE THEREOF - An electronic shelf label system is provided. There is provided a method in which an electronic shelf label periodically transmits its remaining battery capacity to a server in an electronic shelf label system according to the present invention, and the server converts the battery level into a remaining battery life (time) and provides the same, so that a manager can check the remaining battery life in a management mode of the server and a terminal, and easily manage an electronic shelf label using a battery. | 07-05-2012 |
20120169586 | VIRTUAL INTERFACE - A virtual interface including a virtual screen generator configured to produce a virtual display screen, a display generator configured to project at least one virtual display element onto the virtual display screen, and at least one sensor configured to detect interaction with the at least one virtual display element. | 07-05-2012 |
20120169587 | ELECTRONIC DEVICE AND CONTROL METHOD FOR THE SAME - An electronic device of the present invention includes an input acceptance section that accepts an input of a command that causes any one of a plurality of operation states to be selected; a function section ( | 07-05-2012 |
20120169588 | APPARATUS AND METHOD FOR ADJUSTING FOR INPUT LATENCY IN AN ELECTRONIC DEVICE - Embodiments of methods, apparatuses, devices and systems associated with adjusting for input latency within an electronic device are disclosed. An electronic device may receive a user input, such as a user actuation of a device key. A latency adjusted time of the input may be calculated based, at least in part, on a latency of the electronic device in determining the user actuation of the device key. The latency adjusted time may be used to determine a result of the user input. | 07-05-2012 |
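The latency-adjustment step this abstract describes can be sketched as subtracting each key's known scan latency from its raw detection time and ordering inputs by the adjusted times. The function names and millisecond values below are illustrative assumptions, not details from the filing.

```python
def latency_adjusted_time(raw_time_ms, key_latency_ms):
    """Shift a raw detection timestamp back by the latency the device
    needed to determine that this particular key was actuated."""
    return raw_time_ms - key_latency_ms

def resolve_order(events):
    """Order key events by latency-adjusted time rather than by the
    time the device finished detecting them.
    events: iterable of (key, raw_detection_time_ms, key_latency_ms)."""
    adjusted = [(key, latency_adjusted_time(t, lat)) for key, t, lat in events]
    return [key for key, _ in sorted(adjusted, key=lambda kv: kv[1])]

# 'A' was detected after 'B', but accounting for its slower key-matrix
# scan shows it was actually pressed first (adjusted 64 ms vs 95 ms).
events = [("B", 100, 5), ("A", 104, 40)]
```

Under this model the resolved order of a two-key sequence can flip once latency is accounted for, which is the effect the abstract attributes to using the latency-adjusted time to determine the result of the user input.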
20120169589 | Sphere-Like Input Device - Embodiments of the invention provide a human interface device including an inner sphere, wherein the inner sphere has a center point. The human interface device can further include an outer sphere, and the outer sphere may be compressible. The human interface device may also include a plurality of pressure sensors between the inner sphere and the outer sphere for detecting localized compression of the outer sphere, a first three-axis-accelerometer located within the inner sphere, and a second three-axis-accelerometer located within the inner sphere, wherein the first three-axis-accelerometer and the second three-axis-accelerometer are each located at least a predetermined distance from the center point. | 07-05-2012 |
20120169590 | INFORMATION-PROCESSING SYSTEM, INFORMATION-PROCESSING APPARATUS, AND INFORMATION-PROCESSING METHOD - In an information-processing system shown in FIG. | 07-05-2012 |
20120169591 | DISPLAY DEVICE AND CONTROL METHOD THEREFOR - One embodiment provides a display device, including: a luminous flux generator which generates luminous flux including image information; a reflector plate which reflects the luminous flux generated by the luminous flux generator toward one eye of a viewer; a head detector which detects a head of the viewer by using at least two pairs of distance sensors; a controller which controls a position of the reflector plate based on an output from the head detector; and a driver which drives the reflector plate based on an output from the controller. | 07-05-2012 |
20120169592 | INFORMATION PROCESSING DEVICE, METHOD FOR CONTROLLING INFORMATION PROCESSING DEVICE, PROGRAM, AND INFORMATION STORAGE MEDIUM - Direction obtaining means ( | 07-05-2012 |
20120176302 | OPTIMIZED ON-SCREEN KEYBOARD RESPONSIVE TO A USER'S THOUGHTS AND FACIAL EXPRESSIONS - A system and method that allows for control of an on-screen keyboard in response to brain activity input, even for those suffering from partial paralysis or locked-in syndrome. The system provides an on-screen keyboard arranged into blocks of keys for ease of navigation. In response to brain activity input, such as activity relating to thoughts or facial expressions, the system performs a variety of previously determined functions which control the on-screen keyboard. In response to the activity of the on-screen keyboard, the computer can be controlled in a manner similar to when using a standard hand-operated keyboard. | 07-12-2012 |
20120176303 | GESTURE RECOGNITION APPARATUS AND METHOD OF GESTURE RECOGNITION - A gesture recognition apparatus ( | 07-12-2012 |
20120176304 | PROJECTION DISPLAY APPARATUS - A projection display apparatus includes: an element controller that controls an imager to display a maximum light amount image and a minimum light amount image; a setting unit that sets a light amount threshold value used to detect pointing light emitted from a pointing device; and a specifying unit that specifies a light amount of the pointing light overlapping the minimum light amount image, based on a picked-up image, and to specify a light amount of the maximum light amount image, based on the picked-up image, wherein the setting unit sets the light amount threshold value, based on the specified light amount of the pointing light and the specified light amount of the maximum light amount image. | 07-12-2012 |
20120176305 | DISPLAY APPARATUS CONTROLLED BY A MOTION, AND MOTION CONTROL METHOD THEREOF - A display apparatus includes a motion recognition unit which recognizes a movement of an object located outside the display apparatus, a storage unit which stores therein information about the operation corresponding to each motion; and a control unit which divides a movement recognition period using a movement nonrecognition period, determines a motion corresponding to a movement of the object within the movement recognition period, and performs an operation corresponding to the determined motion according to information stored in the storage unit. | 07-12-2012 |
20120176306 | SYSTEM AND METHOD FOR PROVIDING SUBSTANTIALLY STABLE HAPTICS - A system for providing substantially stable haptics includes at least one computer configured to identify a first subset and a second subset of haptic interaction geometric primitives for a virtual tool. The computer is configured to determine based on the first subset, haptic forces in a first subspace. The computer is also configured to determine based on the second subset, haptic forces in a second subspace different from the first subspace. | 07-12-2012 |
20120176307 | TELEPHONE BOOK DATA PROCESSOR - A telephone book data processor includes: a connection element for connecting to an external device via a short range communication manner to transfer a telephone book data; a telephone book data obtaining element for obtaining the telephone book data; a memory having multiple memory regions for storing the telephone book data; and a controller for executing a telephone book data transfer process and a telephone book data utilizing process. The controller defines one memory region as an object of the telephone book data transfer process and another memory region as an object of the telephone book data utilizing process. The controller executes the telephone book data utilizing process with using the telephone book data in the another memory region while the controller executes the telephone book data transfer process for storing a new telephone book data in the one memory region. | 07-12-2012 |
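The dual-region arrangement described here resembles double buffering: lookups read from one memory region while a transfer fills the other, and the roles swap when the transfer completes. The class below is a minimal sketch of that idea; the names and the swap-on-completion policy are assumptions, not the patent's implementation.

```python
class PhonebookStore:
    """Two memory regions: one serves the 'utilizing' process (lookups)
    while the other is the target of an in-progress transfer."""

    def __init__(self):
        self.regions = [{}, {}]
        self.active = 0  # region currently used for lookups

    def transfer(self, entries):
        """Store newly transferred telephone book data in the spare
        region, then swap it in once the transfer is complete."""
        spare = 1 - self.active
        self.regions[spare] = dict(entries)
        self.active = spare

    def lookup(self, name):
        """Utilizing process: read only from the active region, so an
        ongoing transfer never disturbs it."""
        return self.regions[self.active].get(name)
```

The point of the split is that `lookup` stays usable for the entire duration of `transfer`, matching the abstract's claim that the utilizing process runs while new data is being stored.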
20120176308 | METHOD FOR SUPPORTING MULTIPLE MENUS AND INTERACTIVE INPUT SYSTEM EMPLOYING SAME - A method comprises receiving an input event associated with a first user ID, the input event being a command for displaying a first menu on a display surface; identifying a second menu associated with the first user ID currently being displayed on the display surface; dismissing the second menu; and displaying the first menu. | 07-12-2012 |
20120182210 | INTELLIGENT REAL-TIME DISPLAY SELECTION IN A MULTI-DISPLAY COMPUTER SYSTEM - A system and computer-implemented method for managing a plurality of display devices in a multi-display computer system that includes determining in real-time input information including face direction of a user facing the plurality of display devices, selecting a primary display device of the plurality of display devices using the input information determined, and transferring information to the primary display device as desired by the user. | 07-19-2012 |
20120182211 | DEVICE AND METHOD OF CONVEYING EMOTION IN A MESSAGING APPLICATION - The present disclosure provides a device and method to convey emotions in a messaging application of a mobile electronic device. An emotional context of text entered into the messaging application is determined and an implied emotional text is presented for at least a portion of the entered text in accordance with the determined emotional context. The emotional context may be determined from captured sensor data captured by one or more sensors. | 07-19-2012 |
20120182212 | INPUT DEVICE FOR COMPUTER SYSTEM - An input device for a computer system includes a command-input unit, a storage unit and a micro-controller unit. The command-input unit is used for generating a control signal. The storage unit is for reading and storing a first data. The micro-controller unit is electrically connected with the command-input unit and the storage unit for receiving the control signal or the first data and transmitting the control signal or the first data to a computer system according to a Wi-Fi direct protocol, and receiving a second data from the computer system according to the Wi-Fi direct protocol and transmitting the second data to the storage unit. | 07-19-2012 |
20120182213 | Control device with wire control for monitor device - A control device with wire control for a monitor, which includes a control body; an audio plug arranged for plugging into an audio jack of the monitor for controlling one or more functions of the monitor through one or more control buttons; and a resilient and extendable connecting wire electrically connecting the control body and the audio plug. Accordingly, a user can control different functions of the monitor at a distance from the position at which the monitor is located, which is convenient for the user. In addition, the present invention is easy and convenient to carry and is cost-effective. | 07-19-2012 |
20120182214 | Position Detecting System and Position Detecting Method - A position detecting system includes an indicating device including a photodiode that detects timing of a light pulse and a transmission unit that transmits a signal representing the timing, a plurality of light sources that emit light pulses to areas acquired by dividing a screen into a plurality of parts at timings unique to the light sources, a reception unit that receives the signal representing the timing, and a control unit that detects a position of the indicating device based on the signal representing the timing. | 07-19-2012 |
20120182215 | SENSING MODULE, AND GRAPHICAL USER INTERFACE (GUI) CONTROL APPARATUS AND METHOD - A sensing module, and a Graphical User Interface (GUI) control apparatus and method are provided. The sensing module may be inserted into an input device, for example a keyboard, a mouse, a remote controller, and the like, and may sense a hovering movement of a hand of a user within a sensing area, and thus it is possible to provide an interface to control a wider variety of GUIs, and possible to prevent a display from being covered. | 07-19-2012 |
20120188153 | MULTI-BEND DISPLAY ACTIVATION ADAPTATION - Systems and methods determine likely unintended flexing of a flexible display and exclude the determined unintended flexings from user input processing. Unintended flexings include placing or removing the flexible display into or out of a compact storage configuration, folds that are outside the user's visible area, folds that are near edges and boundaries, flexing with a specified degree of bending or orientation, folds that do not intersect with other folds, folds that are near known unintended folds, folds that have a motion or other variation with time, and folds that are not in proximity to a selectable user interface element. Unintended flexings are adaptively identified by determining that a bend that is not in proximity to a selectable user interface element reoccurs at times when icons at different locations are presented on the flexible display. | 07-26-2012 |
20120188154 | METHOD AND APPARATUS FOR CHANGING A PAGE IN E-BOOK TERMINAL - A method and apparatus for changing a page when an e-book terminal is inclined is provided. The method includes sensing that the e-book terminal is inclined to a left or right side, then changing a current page to a next page when the e-book terminal is inclined to the left side or changing the current page to a previous page when the e-book terminal is inclined to the right side. | 07-26-2012 |
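The tilt-to-page mapping in this abstract reduces to a threshold test on a signed tilt angle. A minimal sketch, assuming a sign convention (negative = tilted left) and a 15-degree dead zone that the abstract does not specify:

```python
def next_page_index(current, total_pages, tilt_deg, threshold_deg=15.0):
    """Left tilt (negative angle) advances to the next page, right tilt
    (positive angle) returns to the previous page; small tilts inside
    the dead zone leave the current page unchanged."""
    if tilt_deg <= -threshold_deg:
        return min(current + 1, total_pages - 1)  # next page, clamped at the end
    if tilt_deg >= threshold_deg:
        return max(current - 1, 0)                # previous page, clamped at the start
    return current
```

The dead zone keeps ordinary handling jitter from turning pages; only a deliberate tilt past the threshold triggers a change.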
20120188155 | METHOD AND APPARATUS FOR CONTROLLING DEVICE - A method of controlling a device is provided including identifying a registered device from a screen input by a camera, receiving a user input for the identified device, and transmitting a control command corresponding to the input to the identified device. | 07-26-2012 |
20120188156 | OPERATION MEMBER PROVIDED IN ELECTRONIC DEVICE, AND ELECTRONIC DEVICE - An operation member and an electronic device capable of maintaining operability while enhancing cushioning properties provided in the outer surface of an operation member are provided. An operation stick has a cushion portion and a base portion on which the cushion portion is placed. The base portion is supported to be movable. The base portion has a frame portion surrounding the outer periphery of the cushion portion. The base portion and the frame portion are formed of a material having a higher rigidity than that of the material of the cushion portion. | 07-26-2012 |
20120188157 | METHOD AND APPARATUS FOR EVOKING PERCEPTIONS OF AFFORDANCES IN VIRTUAL ENVIRONMENTS - Methods and apparatus are provided for evoking perceptions of affordances in a user/virtual environment interface. The method involves recognizing the absence or inadequacy of certain sensory stimuli in the user/virtual environment interface, and then creating sensory stimuli in the virtual environment to substitute for the recognized absent or inadequate sensory stimuli. The substitute sensory stimuli are typically communicated to the user (e.g., visually and/or audibly) as properties and behavior of objects in the virtual environment. Appropriately designed substitute sensory stimuli can evoke perceptions of affordances for the recognized absent or inadequate sensory stimuli in the user/virtual environment interface. | 07-26-2012 |
20120188158 | WEARABLE ELECTROMYOGRAPHY-BASED HUMAN-COMPUTER INTERFACE - A “Wearable Electromyography-Based Controller” includes a plurality of Electromyography (EMG) sensors and provides a wired or wireless human-computer interface (HCl) for interacting with computing systems and attached devices via electrical signals generated by specific movement of the user's muscles. Following initial automated self-calibration and positional localization processes, measurement and interpretation of muscle generated electrical signals is accomplished by sampling signals from the EMG sensors of the Wearable Electromyography-Based Controller. In operation, the Wearable Electromyography-Based Controller is donned by the user and placed into a coarsely approximate position on the surface of the user's skin. Automated cues or instructions are then provided to the user for fine-tuning placement of the Wearable Electromyography-Based Controller. Examples of Wearable Electromyography-Based Controllers include articles of manufacture, such as an armband, wristwatch, or article of clothing having a plurality of integrated EMG-based sensor nodes and associated electronics. | 07-26-2012 |
20120194415 | DISPLAYING AN IMAGE - Devices, methods, and systems for displaying an image are described herein. One or more device embodiments include a user interface configured to display an image, a motion sensor configured to sense movement of the device, and a processor configured to convert the movement of the device to a corresponding movement of the display of the image. | 08-02-2012 |
20120194416 | ELECTRONIC APPARATUS AND METHOD OF CONTROLLING ELECTRONIC APPARATUS - One embodiment provides an electronic apparatus, including: a direction detector disposed in a housing which accommodates electronic components therein and configured to detect a direction of the housing; and a power-saving-mode shift controller configured to control a shift to a power saving mode upon detection of a turned-down direction of the housing. | 08-02-2012 |
20120194417 | INPUT DEVICE WITH SWING OPERATION - An input device with swing operation includes a supporting frame, a flexible printed circuit installed on the supporting frame for outputting a signal, a supporting base fixed on the supporting frame, a cap pivoted to the supporting base, and a hook respectively pivoted to the supporting base and the cap. An inclined angle is formed between the hook and the supporting frame when the cap is not pressed down. The hook and the cap pivot relative to the supporting base when the cap is pressed down. The input device further includes a resilient component disposed between the flexible printed circuit and the cap for being pressed by the cap to actuate the flexible printed circuit when the cap is pressed down. | 08-02-2012 |
20120194418 | AR GLASSES WITH USER ACTION CONTROL AND EVENT INPUT BASED CONTROL OF EYEPIECE APPLICATION - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content, wherein the eyepiece includes user action control and event input based control of an eyepiece application. | 08-02-2012 |
20120194419 | AR GLASSES WITH EVENT AND USER ACTION CONTROL OF EXTERNAL APPLICATIONS - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content, wherein the eyepiece includes event and user action control of external applications. | 08-02-2012 |
20120194420 | AR GLASSES WITH EVENT TRIGGERED USER ACTION CONTROL OF AR EYEPIECE FACILITY - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content, wherein the eyepiece includes event triggered user action control. | 08-02-2012 |
20120194421 | TERMINAL AND METHOD FOR PROVIDING USER INTERFACE LINKED TO TERMINAL POSTURE - A terminal and method for providing user interfaces linked to postures of a terminal, whereby a key input signal is received from any one of function keys located at both sides of the terminal, a posture of the terminal is determined in response to the received key input signal, a user interface screen corresponding to the determined posture of the terminal is provided, and settings of the terminal are changed according to the determined posture of the terminal, thereby increasing convenience of a user interface without use of a complicated sensor system. | 08-02-2012 |
20120194422 | METHOD AND SYSTEM FOR VISION-BASED INTERACTION IN A VIRTUAL ENVIRONMENT - Method, computer program and system for tracking movement of a subject. The method includes receiving data from a distributed network of camera sensors employing one or more emitted light sources associated with one or more of the one or more camera sensors to generate a volumetric three-dimensional representation of the subject, identifying a plurality of clusters within the volumetric three-dimensional representation that correspond to motion features indicative of movement of the motion features of the subject, presenting one or more objects on one or more three dimensional display screens, and using the plurality of fixed position sensors to track motion of the motion features of the subject and track manipulation of the motion features of the volumetric three-dimensional representation to determine interaction of one or more of the motion features of the subject and one or more of the one or more objects on the three dimensional display. | 08-02-2012 |
20120194423 | MULTI COLOURS DEVICE ILLUMINATION - A portable device having within it a multicolour illumination arrangement comprising: a surface; a plurality of light sources, at least one of the plurality of light sources being capable of generating two or more emission colours; and drive means for causing the emission colour of the at least one light source to vary; whereby the illumination arrangement can produce a varying illumination through at least part of the surface. | 08-02-2012 |
20120194424 | SYSTEM AND METHOD FOR NAVIGATING A MOBILE DEVICE USER INTERFACE WITH A DIRECTIONAL SENSING DEVICE - An electronic mobile device includes a display for displaying a graphical element. A tilt sensor is configured to sense first and second tilt angles of the mobile device. A processor is coupled to the display and the tilt sensor and configured to move the graphical element relative to the display in a first direction based on the first tilt angle, and to move the graphical element relative to the display in a second direction based on the second tilt angle. | 08-02-2012 |
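The two-angle navigation scheme can be modeled as scaling each sensed tilt angle into a displacement along one screen axis and clamping the result to the display. The gain and display bounds below are illustrative assumptions, not values from the filing.

```python
def move_element(pos, tilt_x_deg, tilt_y_deg, gain=2.0, bounds=(0, 0, 320, 240)):
    """Move a graphical element in a first direction based on the first
    tilt angle and in a second direction based on the second tilt angle,
    keeping it inside the display bounds."""
    xmin, ymin, xmax, ymax = bounds
    x = pos[0] + gain * tilt_x_deg   # first tilt angle drives horizontal motion
    y = pos[1] + gain * tilt_y_deg   # second tilt angle drives vertical motion
    return (min(max(x, xmin), xmax), min(max(y, ymin), ymax))
```

Clamping at the edges matters in practice: without it, a sustained tilt would carry the element off-screen and make it unrecoverable by counter-tilting.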
20120194425 | INTERACTIVE SELECTION OF A REGION OF INTEREST IN AN IMAGE - A system for selecting a region of interest in an image is provided. A user interface ( | 08-02-2012 |
20120200486 | Infrared gesture recognition device and method - A system for generating tracking coordinate information in response to movement of an information-indicating element includes an array ( | 08-09-2012 |
20120200487 | METHOD AND APPARATUS FOR A MULTI-PAGE ELECTRONIC CONTENT VIEWER - An apparatus and method of displaying content on a multi-page electronic content viewer is provided. Sensor input associated with at least one display panel from a plurality of display panels infinitely rotatable around a common edge of a binding element is determined. The sensor input is used to determine a direction of movement of the at least one display panel around an axis of the binding element. A request is then sent to a storage device for content to be displayed on the next viewable display. The request includes a direction indicator determined by the direction of movement of the at least one display panel around the binding element. Content is then received and displayed on display screens of one or more of the plurality of display panels. | 08-09-2012 |
20120200488 | AR GLASSES WITH SENSOR AND USER ACTION BASED CONTROL OF EYEPIECE APPLICATIONS WITH FEEDBACK - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content, wherein the eyepiece includes sensor and user action based control of eyepiece applications with feedback. | 08-09-2012 |
20120200489 | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, INFORMATION PROCESSING PROGRAM, AND INFORMATION PROCESSING SYSTEM - An information processing device includes a controller that decides information to be displayed on a second display section based on an attribute of an operation subject displayed on a first display section when the operation subject is operated, and a communication section that transmits control information for causing the second display section to display the information decided by the controller to a device having the second display section. | 08-09-2012 |
20120200490 | GAZE DETECTION APPARATUS AND METHOD - A gaze detection apparatus is disclosed. The gaze detection apparatus includes: a gaze detection section that detects gaze of a target person; a display section that includes a display screen for displaying an image; a gaze position determination section that determines, based on a result of detection by the gaze detection section, whether or not the display screen lies in the gaze of the target person; and a first display control section that displays a first detection result image at an intersection point between the gaze of the target person and the display screen when the first gaze position determination section determines that the display screen lies in the gaze of the target person. | 08-09-2012 |
20120200491 | GESTURE CATALOGING AND RECOGNITION - Methods and apparatus for cataloging and recognizing gestures are disclosed. A gesture may be detected using sample motion data. An energy value and baseline value may be computed. The baseline value may be updated if the energy value is below a calm energy threshold. The sample motion data may be adjusted based on the updated baseline value. A local variance may be calculated over a number of samples. Sample motion data values may be recorded if the local variance exceeds a threshold. Sample motion data recording may stop if a local variance scalar value falls below a drop threshold. Input Gestures may be recognized by computing a total variance for sample values in an Input Gesture; calculating a figure of merit using sample values from the Input Gesture and one or more Catalog Gestures; and determining whether the Input Gesture matches a Catalog Gesture from the figure of merit. | 08-09-2012 |
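The recording pipeline this abstract outlines (calm-time baseline updates, then a local-variance gate that starts recording and a lower drop threshold that stops it) can be sketched as follows. The smoothing factor and threshold values are illustrative assumptions.

```python
def update_baseline(baseline, sample, energy, calm_threshold, alpha=0.1):
    """Drift the per-axis baseline toward the current sample, but only
    while the controller is calm (energy below the calm threshold)."""
    if energy < calm_threshold:
        return tuple(b + alpha * (s - b) for b, s in zip(baseline, sample))
    return baseline

def local_variance(window):
    """Variance over a short window of baseline-adjusted sample values."""
    n = len(window)
    mean = sum(window) / n
    return sum((v - mean) ** 2 for v in window) / n

def recording_state(window, recording, start_threshold, drop_threshold):
    """Start recording when local variance exceeds the start threshold;
    stop once it falls below the (lower) drop threshold (hysteresis)."""
    v = local_variance(window)
    if not recording:
        return v > start_threshold
    return v >= drop_threshold
```

Using two different thresholds (start high, drop low) is a standard hysteresis trick: it prevents the recorder from chattering on and off when the variance hovers near a single cutoff.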
20120200492 | Input Method Applied in Electronic Devices - An input method applicable for inputting into an electronic device, which includes the steps of capturing a lip motion of a person; receiving an image of the lip motion; encoding the lip motion image to obtain a lip motion code; comparing the lip motion code with a plurality of standard lip motion codes to obtain a first text result matching the lip motion code; and displaying the first text result on the electronic device if the first text result is obtained. If the first text result is not obtained, the method may further include activating an auxiliary analyzing mode for the electronic device for recognizing a facial expression, a hand gesture, or an audio signal to be inputted. The input method can diversify input methods for the electronic device. | 08-09-2012 |
20120200493 | CONTROLLING DEVICE WITH SELECTIVELY ILLUMINATED USER INTERFACES - A controlling device using a source of energy, such as light energy, to provide the controlling device with a user interface having multiple, different visual appearances. | 08-09-2012 |
20120200494 | COMPUTER VISION GESTURE BASED CONTROL OF A DEVICE - A system and method are provided for controlling a device based on computer vision. Embodiments of the system and method of the invention are based on receiving a sequence of images of a field of view; detecting movement of at least one object in the images; applying a shape recognition algorithm on the at least one moving object; confirming that the object is a user hand by combining information from at least two images of the object; and tracking the object to control the device. | 08-09-2012 |
20120200495 | Autostereoscopic Rendering and Display Apparatus - An apparatus comprising a sensor configured to detect the position and orientation of a user viewpoint with respect to an auto-stereoscopic display; a processor configured to determine a surface viewable from the user viewpoint of at least one three dimensional object; and an image generator configured to generate a left and right eye image for display on the auto-stereoscopic display dependent on the surface viewable from the user viewpoint. | 08-09-2012 |
20120206329 | OPTICAL NAVIGATION MODULE WITH ALIGNMENT FEATURES - An optical navigation module comprising a rigid flange having a top surface. An optical navigation unit can be coupled to the top surface of the rigid flange with an electrical connection electrically coupled to the optical navigation unit. An alignment flange can be coupled to the rigid flange, the alignment flange including one or more alignment features. An alignment feature can be a hole adapted to receive an alignment pin to hold and align the optical navigation module. | 08-16-2012 |
20120206330 | MULTI-TOUCH INPUT DEVICE WITH ORIENTATION SENSING - A multi-touch orientation sensing input device may enhance task performance efficiency. The multi-touch orientation sensing input device may include a device body that is partially enclosed or completely enclosed by a multi-touch sensor. The multi-touch orientation sensing input device may further include an inertia measurement unit that is disposed on the device body. The inertia measurement unit may measure a tilt angle of the device body with respect to a horizontal surface, as well as a roll angle of the device body along a length-wise axis of the device body with respect to an initial point on the device body. | 08-16-2012 |
20120206331 | Methods and Systems for Supporting Gesture Recognition Applications across Devices - Embodiments herein disclose methods and systems for allowing gesture recognition capable applications to be supported across gesture recognition capable devices. Further disclosed are the marketplace and network mechanisms to deliver applications and advertisements to multiple devices. A middleware is provided for a gesture recognition device that provides the common API for a gesture recognition capable application. The same application may be used or played on any other device hosting the supported middleware specific to the device. Developers of applications are provided with a single SDK. The middleware also hosts a network application that enables a user to search and download supported applications, provide feedback to the network, and provide configuration capabilities. | 08-16-2012 |
20120206332 | METHOD AND APPARATUS FOR ORIENTATION SENSITIVE BUTTON ASSIGNMENT - Methods and apparatus are provided for orientation sensitive configuration of a device. In one embodiment a method includes detecting a change in orientation of a device, and configuring a display of the device based on the change in orientation. The method may further include assigning operation of one or more buttons of the device based on the change in orientation, wherein a button of the device is assigned one or more functions not previously assigned to the button. The method may be performed by one or more of portable electronic device, media player, eReader, personal communication device, handheld computing device, imaging device, tablet computing device, imaging device, gaming device and computing devices in general. | 08-16-2012 |
20120206333 | VIRTUAL TOUCH APPARATUS AND METHOD WITHOUT POINTER ON SCREEN - Provided is a virtual touch apparatus and method for remotely controlling electronic equipment having a display surface. The virtual touch apparatus includes a three-dimensional coordinate calculator and a controller. The three-dimensional coordinate calculator extracts a three-dimensional coordinate of a user's body. The controller includes a touch location calculation unit and a virtual touch processing unit. The touch location calculation unit calculates a contact point coordinate where a straight line connecting between a first spatial coordinate and a second spatial coordinate meets the display surface using the first and second spatial coordinates received from the three-dimensional coordinate calculator. The virtual touch processing unit creates a command code for performing an operation corresponding to the contact coordinate received from the touch location calculation unit and inputs the command code into a main controller of the electronic equipment. | 08-16-2012 |
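The geometric core of this virtual-touch method is a line-plane intersection: extend the line through the two body coordinates (e.g. an eye and a fingertip) until it meets the display surface. A minimal sketch, assuming the display lies in the plane z = 0, which the abstract does not specify:

```python
def virtual_touch_point(first, second):
    """Intersect the line through the first spatial coordinate (e.g. an
    eye) and the second (e.g. a fingertip) with the display plane z = 0,
    returning the contact point coordinate on the display."""
    x1, y1, z1 = first
    x2, y2, z2 = second
    if z1 == z2:
        return None  # line is parallel to the display plane; no contact point
    t = z1 / (z1 - z2)  # parameter at which the line's z-coordinate reaches 0
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
```

Because the line is anchored at the user's eye, the contact point lands exactly where the fingertip visually covers the screen, which is why no on-screen pointer is needed.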
20120206334 | AR GLASSES WITH EVENT AND USER ACTION CAPTURE DEVICE CONTROL OF EXTERNAL APPLICATIONS - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content, wherein the eyepiece includes an event and user action capture device control of external applications. | 08-16-2012 |
20120206335 | AR GLASSES WITH EVENT, SENSOR, AND USER ACTION BASED DIRECT CONTROL OF EXTERNAL DEVICES WITH FEEDBACK - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content, wherein the eyepiece includes an event, sensor, and user action based direct control of external devices with feedback. | 08-16-2012 |
20120206336 | DETECTOR FOR OPTICALLY DETECTING AT LEAST ONE OBJECT - A detector ( | 08-16-2012 |
20120206337 | MULTIPLE CAMERA CONTROL SYSTEM - A multiple camera tracking system for interfacing with an application program is provided. The tracking system includes multiple cameras arranged to provide different viewpoints of a region of interest, and are operable to produce a series of video images. A processor is operable to receive the series of video images and detect objects appearing in the region of interest. The processor executes a process to generate a background data set from the video images, generate an image data set for each received video image, compare each image data set to the background data set to produce a difference map for each image data set, detect a relative position of an object of interest within each difference map, and produce an absolute position of the object of interest from the relative positions of the object of interest and map the absolute position to a position indicator associated with the application program. | 08-16-2012 |
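The tracking pipeline in the abstract above (background model, per-camera difference map, relative position, combined absolute position) can be sketched roughly as follows. This is a hypothetical illustration, not the patented implementation; the thresholding, centroid step, and simple averaging of views are all assumptions.

```python
# Hypothetical sketch of the per-frame pipeline described in the abstract:
# difference each camera's frame against a background data set, find the
# object's relative position per view, then combine the views.

def difference_map(frame, background, threshold=30):
    """Mark pixels that differ from the background by more than threshold."""
    return [[1 if abs(p - b) > threshold else 0
             for p, b in zip(frow, brow)]
            for frow, brow in zip(frame, background)]

def relative_position(diff):
    """Centroid of the foreground pixels in one camera's difference map."""
    pts = [(x, y) for y, row in enumerate(diff)
           for x, v in enumerate(row) if v]
    if not pts:
        return None  # object not visible in this view
    return (sum(x for x, _ in pts) / len(pts),
            sum(y for _, y in pts) / len(pts))

def absolute_position(rel_positions):
    """Combine per-camera relative positions (here: a simple average)."""
    known = [p for p in rel_positions if p is not None]
    if not known:
        return None
    return (sum(x for x, _ in known) / len(known),
            sum(y for _, y in known) / len(known))
```

A real system would calibrate the cameras and triangulate rather than average, and would then map the absolute position onto the application's position indicator (e.g. a cursor).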
20120206338 | OPERATIONAL INPUT DEVICE - An operational input device that outputs a signal corresponding to a displacement amount of an operational input, includes a coil annularly extending from a first side toward a second side; a core configured to vary the inductance of the coil by being moved within the coil along an axis of the coil by the operational input applied from the first side toward the second side; and a yoke provided at an end surface of the coil at the second side and provided with an opening at a position facing an end surface of the core at the second side. | 08-16-2012 |
20120206339 | CONTROL USING MOVEMENTS - A movement of an object is recognised as a predetermined movement, by transmitting signals between transmitter-receiver pairs, which are reflected from the object. A first event is recorded for one of the transmitter-receiver pairs if a reflected signal meets a predetermined proximity criterion, and a second event is recorded for a second transmitter-receiver pair if after the first event, a subsequent reflected signal meets a predetermined proximity criterion. The first and second events are used to identify the movement. | 08-16-2012 |
20120206340 | DISPLAY METHOD AND DISPLAY APPARATUS - To provide a display method capable of displaying, while a monitoring display of the power saving effect is being made, the power saving effect that would be obtained if no monitoring display were made. Provided is a display method including capturing an image in front of an image display surface included in a display apparatus that displays the image and detecting presence of a moving body positioned in front of the image display surface, deciding a power saving amount of the display apparatus including the power saving amount of the image display surface by using an analysis result of the presence or absence of a human face and a detection result of the moving body in the captured image, and, when information about the power saving amount is displayed on the image display surface, deriving and displaying on the image display surface the power saving amount that would actually apply if nothing were displayed on it. Accordingly, when a monitoring display of a power saving effect is made, the power saving effect that would be obtained without the monitoring display can be displayed. | 08-16-2012 |
20120206341 | Presenting Information on a Card with a Passive Electronic Paper Display - With a card including a passive electronic paper display configured to display a visual image, a method of presenting information on a card includes selectively changing a visual image displayed on the passive electronic paper display to update information represented by the visual image, and wherein the card is sized to be carried by a user. | 08-16-2012 |
20120206342 | SYSTEM AND METHOD FOR ACCESSING AND PRESENTING LIFE STORIES - A system to access information about a deceased individual includes a memory device and a mobile unit. The memory device contains a non-volatile computer readable medium to store the information on the deceased individual. The mobile unit interfaces with the memory device to access the information. The memory device may include a memory tube or a microchip. In some versions, the memory device may be mounted to a headstone or associated with the gravesite. The mobile unit may include a viewing screen to display information about the deceased individual via a user interface. In another version, a computer may host the information and an identifier is used by a mobile unit to access the information. The identifier may be associated with a burial site. In some versions, the identifier is an electronic identifier that communicatively couples to the computer. | 08-16-2012 |
20120206343 | MULTI-SOURCE PROJECTION-TYPE DISPLAY - A display device capable of displaying a plurality of projection images is provided. The display device includes a light source within a base and a plurality of projection outputs. Each projection output comprises an optical modulation device and a projection lens system. The light source includes a switch and a plurality of light sources such as lasers or LEDs with different color to one another. The switch receives and diverts light beams from the light sources in a predetermined sequential order to the plurality of projection outputs. | 08-16-2012 |
20120212403 | HANDHELD ELECTRONIC DEVICE AND FUNCTION CONTROL METHOD THEREOF - A handheld electronic device is provided. The electronic device includes an electrode unit, a storage unit, and a processing unit. The electrode unit includes a main body defining an annular cavity, a plurality of electrode groups, and a conductive element arranged within the annular cavity, wherein each of the plurality of the electrode groups includes a pair of conductive sheets, which are partially received in the annular cavity and are spaced apart from each other. When the electronic device is rotated to a different orientation, the conductive element connects different electrode groups, and the conductive sheets of one of the electrode groups are connected to each other via the conductive element. The processing unit determines the connected electrode group and executes a function corresponding to the determined electrode group. A function control method of the handheld electronic device is also provided. | 08-23-2012 |
20120212404 | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING TERMINAL DEVICE, INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING METHOD, AND COMPUTER READABLE STORAGE MEDIUM - An information processing device includes an input information obtaining unit configured to obtain first input information and second input information among input information items, an input candidate information obtaining unit configured to obtain one or more input candidate information items as a candidate for the second input information, based on the first input information, an executing unit configured to carry out an application, based on the one or more input candidate information items, to produce one or more image candidate information items, an image candidate information storage unit configured to store the one or more image candidate information items so as to be correlated to the one or more input candidate information items respectively, and a transmitting unit configured to send information relating to one or all of the one or more image candidate information items stored in the image candidate information storage unit to a terminal device. | 08-23-2012 |
20120212405 | SYSTEM AND METHOD FOR PRESENTING VIRTUAL AND AUGMENTED REALITY SCENES TO A USER - A method according to a preferred embodiment can include providing an embeddable interface for a virtual or augmented reality scene, determining a real orientation of a viewer representative of a viewing orientation relative to a projection matrix, and determining a user orientation of a viewer representative of a viewing orientation relative to a nodal point. The method of the preferred embodiment can further include orienting the scene within the embeddable interface and displaying the scene within the embeddable interface on a device. | 08-23-2012 |
20120212406 | AR GLASSES WITH EVENT AND SENSOR TRIGGERED AR EYEPIECE COMMAND AND CONTROL FACILITY OF THE AR EYEPIECE - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content, wherein the eyepiece includes event and sensor triggered command and control facility. | 08-23-2012 |
20120212407 | INFORMATION DISPLAY DEVICE AND SCROLL CONTROL METHOD - According to an aspect, an information display device includes a display unit, a visual line detector, and a control unit. The display unit displays information in a scroll region, which is provided in at least a part of a screen of the display unit, while scrolling the information. The visual line detector detects a visual line position of an operator with respect to the scroll region. The control unit controls a scroll speed, at which the information is scrolled in the scroll region, based on the visual line position detected by the visual line detector. | 08-23-2012 |
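The gaze-controlled scrolling described above can be illustrated with a small sketch. This is not the patented method; the linear mapping from gaze height to scroll speed, its direction (bottom of the region = fast), and the stop-outside-the-region behavior are all assumptions for illustration.

```python
def scroll_speed(gaze_y, region_top, region_bottom, max_speed=100.0):
    """Map the operator's detected gaze height inside the scroll region
    to a scroll speed in pixels per second (hypothetical linear mapping:
    gaze near the top edge -> slow, near the bottom edge -> fast)."""
    if not (region_top <= gaze_y <= region_bottom):
        return 0.0  # gaze outside the scroll region: stop scrolling
    frac = (gaze_y - region_top) / (region_bottom - region_top)
    return max_speed * frac
```

In practice the control unit would re-evaluate this each time the visual line detector reports a new gaze position, possibly smoothing the result to avoid jitter.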
20120212408 | IMAGE GENERATION DEVICE, PROJECTOR, AND IMAGE GENERATION METHOD - An image generation device includes a storage unit that stores input object data indicating content information of an input object which is input on a presentation screen, a determination unit that determines a display position located further on an upper side than an input position of the input object on the presentation screen, and an image generation unit that generates a presentation image which is an image displayed on the presentation screen and where the input object is displayed according to the display position determined by the determination unit. | 08-23-2012 |
20120212409 | COMMUNICATION APPARATUS AND CONTROL METHOD - A communication apparatus includes a display unit, a communication unit, and a control unit. The display unit displays video data. The communication unit communicates with an external apparatus. The control unit controls the communication apparatus in accordance with a command received by the communication unit. If the display unit is in a mute state and the communication unit is sending the external apparatus the data for placing the external apparatus in a mute state, the control unit determines not to control the communication apparatus in accordance with the command. | 08-23-2012 |
20120212410 | OPERATION INPUT DEVICE - To provide an operation input device capable of realizing operation input by changing the posture of the operation input device even without a posture determining sensor. This is an operation input device for use by a user while being held by his/her hand, for obtaining a captured image captured by an image capturing unit provided to the operation input device, and determining the posture of the operation input device, based on the result of analysis on the captured image obtained. | 08-23-2012 |
20120218177 | METHOD AND APPARATUS FOR PROVIDING DIFFERENT USER INTERFACE EFFECTS FOR DIFFERENT MOTION GESTURES AND MOTION PROPERTIES - A method for providing a mechanism by which different user interface effects may be performed for different motion events may include receiving an indication of a motion event at a motion sensor, determining a motion gesture of the motion event, determining a motion property of the motion event, and enabling provision of a user interface effect based on the motion property of the motion event. A corresponding apparatus and computer program product are also provided. | 08-30-2012 |
20120218178 | CHARACTER INPUT DEVICE - The present invention provides a character input device that can display character input information on a display screen with a small area and allows a user to input a desired character by a simple operation, even in a case where it is necessary to select the desired character from plural characters. | 08-30-2012 |
20120218179 | DISPLAY APPARATUS AND CONTROL METHOD - A display apparatus and a control method capable of preventing its user from viewing an image in an improper viewing position. The display apparatus includes: an imaging unit that captures a moving image in a predetermined range with respect to an image display direction; an image analyzer that analyzes the moving image captured by the imaging unit, and calculates a position of a target that should be guided to a proper viewing position; and a display controller that causes a display unit to perform display to guide the target to the proper viewing position when the target position calculated by the image analyzer is at an improper viewing position. | 08-30-2012 |
20120218180 | DISPLAY CONTROL DEVICE AND DISPLAY CONTROL METHOD - A display control device acquires the size of a picture added in advance to picture data and an optimum viewing distance corresponding to the picture. The display control device calculates a viewing distance, in which the relationship between the size of the picture and the optimum viewing distance is equal to the relationship between the display size and the viewing distance, as a recommended viewing distance based on the size of the picture, the optimum viewing distance, and the display size of the picture. Then, the display control device notifies the calculated recommended viewing distance to a user. | 08-30-2012 |
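The abstract above describes keeping the ratio of picture size to optimum viewing distance equal to the ratio of display size to recommended viewing distance, which reduces to a simple proportional scaling. A minimal sketch, assuming sizes and distances share consistent units:

```python
def recommended_viewing_distance(picture_size, optimum_distance, display_size):
    """Scale the optimum viewing distance attached to the picture data so
    that display_size : recommended == picture_size : optimum_distance
    (a hypothetical reading of the proportional relationship)."""
    if picture_size <= 0:
        raise ValueError("picture_size must be positive")
    return optimum_distance * display_size / picture_size
```

For example, a picture authored for a 40-inch screen with a 1.5 m optimum distance would yield a 3.0 m recommendation when shown at 80 inches.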
20120218181 | CAMERA BASED SENSING IN HANDHELD, MOBILE, GAMING OR OTHER DEVICES - Method and apparatus are disclosed to enable rapid TV camera and computer based sensing in many practical applications, including, but not limited to, handheld devices, cars, and video games. Several unique forms of social video games are disclosed. | 08-30-2012 |
20120218182 | APPARATUS FOR RECOGNIZING THE POSITION OF AN INDICATING OBJECT - The present invention relates to an apparatus for recognizing the position of an indicating object. An apparatus for recognizing the position of an indicating object of the present invention comprises: first reflecting means installed along the left, right, and bottom edges of a screen so as to reflect a laser beam emitted from object-detecting means back to the object-detecting means; said object-detecting means, formed as a pair, for analyzing a change in the amount of light in the reflected laser beam over time, and detecting position coordinates of the indicating object on the planar surface of the screen; and fixing means including a housing and a fixing member fixedly installed on an upper portion of the screen and coupled to the housing so as to fix the housing to the upper portion of the screen, the fixing means being intended for facilitating the installation of the object-detecting means on the upper portion of the screen. The apparatus of the present invention is an apparatus for recognizing the position of an indicating object that contacts a screen, wherein the apparatus is easy to transport and store, can protect the object-detecting means from dust and impurities, can easily be installed by a layperson having no expert knowledge, and can be installed without any restrictions in terms of screen size. | 08-30-2012 |
20120223878 | Information Terminal and Portable Information Terminal - This information terminal includes a control portion that, when a user inputs information into an input item that is in a selected state where information can be input, displays on a display portion the subsequent input item of the plurality of input items in the selected state, on the basis of a prescribed order of the plurality of input items. | 09-06-2012 |
20120223879 | Autostereoscopic Display and Method for Operating the Same - An autostereoscopic display is described, which comprises a display panel, a parallax barrier that is arranged on a viewing side of the display panel, wherein the parallax barrier comprises a first e-ink plane having a plurality of e-ink particles that are pivotable around a tilt axis by a tilting angle and wherein the e-ink particles are light-transmissive in a first direction and opaque in a second direction, and a control unit that is configured to tilt the e-ink particles so as to set the e-ink particles to a common tilting angle and to synchronize a reproduction of a picture by the display panel and the tilt of the e-ink particles so as to provide a stereoscopic view to a user. | 09-06-2012 |
20120223880 | METHOD AND APPARATUS FOR PRODUCING A DYNAMIC HAPTIC EFFECT - A system that produces a dynamic haptic effect and generates a drive signal that includes two or more gesture signals. The haptic effect is modified dynamically based on the gesture signals. The haptic effect may optionally be modified dynamically by using the gesture signals and two or more real or virtual device sensor signals such as from an accelerometer or gyroscope, or by signals created from processing data such as still images, video or sound. | 09-06-2012 |
20120223881 | DISPLAY DEVICE, DISPLAY CONTROL CIRCUIT, AND DISPLAY CONTROL METHOD - In a display control circuit | 09-06-2012 |
20120229370 | System, Method and Device for Presenting Different Functional Displays When Orientation of the Device Changes - Different functional views for a mobile device are provided depending on orientation of the device. The mobile device includes an enclosure and a display disposed within the enclosure, wherein the display presents a functional view to a user when the device is positioned in a first orientation and a second functional view when the display is rotated to a second orientation. | 09-13-2012 |
20120229371 | Screen Rotation Lock Methods and Systems - Screen rotation lock methods and systems are provided. First, an angle between a specific plane of an electronic device and an absolute horizontal plane is detected using at least one sensor, and it is determined whether the angle equals a specific angle. When the angle equals the specific angle, a screen auto-rotation function of the electronic device is disabled. | 09-13-2012 |
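The angle-based rotation lock above can be sketched with an accelerometer reading. This is a hypothetical illustration, not the patented method: the choice of a 3-axis accelerometer, the 0-degree (lying-flat) lock angle, and the tolerance band are assumptions.

```python
import math

def tilt_angle(ax, ay, az):
    """Angle in degrees between the device's x-y plane and the absolute
    horizontal plane, from a 3-axis accelerometer reading (in g)."""
    g_in_plane = math.hypot(ax, ay)
    return math.degrees(math.atan2(g_in_plane, abs(az)))

def auto_rotation_enabled(ax, ay, az, lock_angle=0.0, tolerance=5.0):
    """Disable screen auto-rotation when the detected tilt matches the
    specific lock angle, e.g. 0 degrees = device lying flat on a table."""
    return abs(tilt_angle(ax, ay, az) - lock_angle) > tolerance
```

A flat device reads roughly (0, 0, 1 g) and would have rotation locked; an upright device reads roughly (0, 1 g, 0) and keeps auto-rotation on.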
20120229372 | STORAGE MEDIUM HAVING INFORMATION PROCESSING PROGRAM STORED THEREON, INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING SYSTEM, AND INFORMATION PROCESSING METHOD - Data which is output from an input device capable of allowing a body of a user to get thereon, and which is based on a load applied on the input device, is usable. Data acquired from the input device is processed. A center-of-gravity position of a load applied on the input device is repeatedly acquired based on data output from the input device. A user direction of a stepping action made on the input device is calculated using the center-of-gravity position. Predetermined processing is performed based on the user direction. | 09-13-2012 |
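A board-type load input device of the kind described above is commonly built with four corner load sensors, from which a center-of-gravity position and a coarse stepping direction can be derived. The four-sensor layout, the normalized coordinate convention, and the dead zone below are assumptions for illustration, not the patented algorithm.

```python
def center_of_gravity(fl, fr, bl, br):
    """Center-of-gravity position from four corner load readings
    (front-left, front-right, back-left, back-right), normalized to
    [-1, 1] on each axis (+x = right, +y = front)."""
    total = fl + fr + bl + br
    if total == 0:
        return (0.0, 0.0)  # nobody on the board
    x = ((fr + br) - (fl + bl)) / total
    y = ((fl + fr) - (bl + br)) / total
    return (x, y)

def step_direction(cog_x, cog_y, dead_zone=0.2):
    """Coarse user direction from the center-of-gravity shift."""
    if abs(cog_x) < dead_zone and abs(cog_y) < dead_zone:
        return "center"
    if abs(cog_x) > abs(cog_y):
        return "right" if cog_x > 0 else "left"
    return "front" if cog_y > 0 else "back"
```

The "repeatedly acquired" wording in the abstract suggests sampling this per frame, so a stepping action appears as an oscillating center-of-gravity trace.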
20120229373 | IMAGE DISPLAY CONTROL APPARATUS INCLUDING IMAGE SHOOTING UNIT - An image display control apparatus includes a CPU performing face recognition within an image shot by a camera and detecting a gaze direction from a recognized face. The CPU then changes the image to a processed image at a predetermined rate in the case where at least one of the gazes of detected faces is directed to the displayed image. | 09-13-2012 |
20120229374 | ELECTRONIC DEVICE - An electronic device and methods are disclosed. The electronic device comprises a plurality of display screens, an operation module, and a control module. The operation module is operable to be operated by a user, and the control module is electrically coupled to the display screens and the operation module, and comprises an operation display module. The operation display module is operable to display an active application from among a plurality of applications running on a first display screen from among the display screens. The operation display module is further operable to display operable information related to the active application on a second display screen, when the operation module is operated by the user. | 09-13-2012 |
20120229375 | DISPLAY DEVICE, DISPLAY METHOD AND RECORDING MEDIUM - For a book data display device of the present invention, the user performs an operation similar to an action performed on a real book when searching for a supplement held between pages of the book. At this time, in the book data display device of the present invention, a movement detection section detects the movement of the display device, and a display control section displays the image of a supplement in a display section on the basis of the detection result of the movement detection section. | 09-13-2012 |
20120229376 | INPUT DEVICE - An input device displays, on a display device, a screen including character input buttons, to each of which characters are assigned and each of which specifies one of its assigned characters as a character to be inputted according to the number of times that a selection operation is performed thereon, and a display area B in which the characters specified through selection operations on the character input buttons are displayed in order. The input device specifies, from the character strings of words each including the character string being inputted, candidate characters each of which can be inputted to follow the character string being inputted and displayed in the display area B, and, when a character to be inputted which is specified to follow the character string is not included in the candidate characters, tones down the display of that character immediately after the character string. | 09-13-2012 |
20120235892 | TOUCHLESS INTERACTIVE DISPLAY SYSTEM - A touchless interactive display system includes a display with a display area bounding the display. A reflective surface is located along an edge of the display. One optical sensor opposes and faces the reflective surface so that the optical sensor has a primary, non-reflected field of view and a secondary, reflected field of view that is reflected back from the reflective surface. The primary field of view covers a first portion of the display area that is less than the whole display area, and the reflected field of view covers a second portion of the display area, such that the first and second portions of the display area cover the entire display area. The optical sensor and a processor are operable to detect an object placed within at least one of its first and second fields of view without having the object touch the display. | 09-20-2012 |
20120235893 | SYSTEM AND METHOD FOR BENDABLE DISPLAY - Systems, methods and apparatus are described for displaying visual information on a deformable display device, and for compensating for distortion of images of the visual information that results from the deformation of the display device and the viewing orientation of a viewer of the display device, or for improving or enhancing the displayed visual information in response to the deformation of the display device and the viewing position of the viewer. | 09-20-2012 |
20120235894 | SYSTEM AND METHOD FOR FOLDABLE DISPLAY - As described herein, there is provided methods, apparatus and computer program products to display visual information on a foldable display device. Display control signals are altered or modified to avoid display of or compensate for impairment in the display of visual information on fold deformations introduced in a display unit of the display device in response to folding of the display unit, or to reverse any alterations or modifications in the event that the fold deformations are eliminated by unfolding of the display unit. | 09-20-2012 |
20120235895 | CONTENT OUTPUT CONTROL DEVICE AND CONTENT OUTPUT CONTROL METHOD - When a content output apparatus ( | 09-20-2012 |
20120235896 | BLUETOOTH OR OTHER WIRELESS INTERFACE WITH POWER MANAGEMENT FOR HEAD MOUNTED DISPLAY - A Head Mounted Display (HMD) system that includes a wireless front end that interprets spoken commands and/or hand motions and/or body gestures to selectively activate subsystem components only as needed to carry out specific commands. | 09-20-2012 |
20120235897 | INFORMATION PROCESSING APPARATUS, AND CONTROL METHOD AND PROGRAM THEREFOR - One of the aspects of the disclosure is directed to displaying an image according to attribute information thereof in a display area having a time axis based on an item of predetermined attribute information, and when changing a display range on the time axis, allowing a user to easily designate a point to be a reference thereof. An information processing apparatus according to the present invention displays an image in the display area having the time axis according to date and time information of the image. The information processing apparatus moves a mouse cursor on the display area according to a user's operation, and sets a reference after the display range is changed. | 09-20-2012 |
20120235898 | ELECTRONIC DEVICE AND COMMUNICATION DEVICE - There is provided an electronic device which includes a display device, a case configured to retain the display device, a stopper disposed between the retained display device and the case, and a component arranged in a space formed by disposing the stopper. Further, there is provided an electronic device which includes a display device, a case including a first space for retaining the display device, the display device being operable to be slid into the first space, a stopper disposed in a second space between the retained display device and the case, and an electronic device arranged in a third space formed by arranging the stopper. | 09-20-2012 |
20120235899 | APPARATUS, SYSTEM, AND METHOD FOR CONTROLLING VIRTUAL OBJECT - An apparatus, system, and method for controlling a virtual object. The virtual object is controlled by detecting a hand motion of a user and generating an event corresponding to the hand motion. Accordingly, the user may control the virtual object displayed on a 3-dimensional graphic user interface (3D GUI) more intuitively and efficiently. | 09-20-2012 |
20120235900 | SEE-THROUGH NEAR-EYE DISPLAY GLASSES WITH A FAST RESPONSE PHOTOCHROMIC FILM SYSTEM FOR QUICK TRANSITION FROM DARK TO CLEAR - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content wherein the optical assembly comprises a photochromic layer and a heater layer disposed on a see-through lens of the optical assembly, wherein the photochromic layer is heated by the heater layer to accelerate its transition from dark to clear. | 09-20-2012 |
20120235901 | SYSTEM AND METHOD FOR CONTROL BASED ON FACE OR HAND GESTURE DETECTION - System and method for control using face detection or hand gesture detection algorithms in a captured image. Based on the existence of a detected human face or a hand gesture in an image captured by a digital camera, a control signal is generated and provided to a device. The control may provide power or disconnect power supply to the device (or part of the device circuits). The location of the detected face in the image may be used to rotate a display screen to achieve a better line of sight with a viewing person. The difference between the location of the detected face and an optimum is the error to be corrected by rotating the display to the required angular position. A hand gesture detection can be used as a replacement to a remote control for the controlled unit, such as a television set. | 09-20-2012 |
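The abstract above treats the difference between the detected face location and the image center as the error to correct by rotating the display. Under a simple linear field-of-view assumption (a hypothetical model, not the patented one), that error converts to an angle as follows:

```python
def rotation_error(face_x, image_width, fov_degrees=60.0):
    """Angular error between the detected face and the optimum position
    (the image center), in degrees, to be corrected by rotating the
    display toward the viewer. Assumes the camera's horizontal field of
    view maps linearly onto the image width."""
    offset = face_x - image_width / 2.0        # pixels off-center
    return fov_degrees * offset / image_width  # degrees to rotate
```

A servo loop would then rotate the screen until the error falls within some tolerance, re-running face detection on each new frame.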
20120235902 | CONTROL SYSTEMS AND METHODS FOR HEAD-MOUNTED INFORMATION SYSTEMS - A head-mounted information system is provided, the head-mounted information system comprising a frame configured to be mounted on a head of a user, a display unit coupled to the frame, a sensor unit coupled to the frame comprising one or more motion sensors, and a processor unit coupled to the frame and connected to receive signals from the motion sensors. The processor unit comprises a processor and a memory accessible by the processor. The processor unit is configured to monitor the received signals and enter a gesture control mode upon detection of a gesture control enable signal. In the gesture control mode the processor is configured to convert signals received from the motion sensors into menu navigation commands. | 09-20-2012 |
20120242566 | Vision-Based User Interface and Related Method - A vision-based user interface includes an image input unit for capturing frame images, an image processor for recognizing a posture in at least one of the captured frame images, and generating a recognized gesture according to the posture, and a control unit for generating a control command corresponding to the recognized gesture. | 09-27-2012 |
20120242567 | HAND-HELD DISPLAYING DEVICE - The hand-held displaying device comprises a displaying panel to display at least one color and its brightness, a motion sensor to detect motions of the motion sensor, and a controller to control contents of displays in the displaying panel according to motions detected by the motion sensor. The displaying panel may include a plurality of displaying regions, each being able to display at least one color and its brightness and respectively controlled by the controller. The displaying device may provide a sound generating element and a loudspeaker. The controller may provide a memory space to record a series of displayed contents to be played back at a later time. | 09-27-2012 |
20120242568 | 3-Dimensional Displaying Apparatus And Driving Method Thereof - A 3-dimensional displaying apparatus includes an image displaying panel having a plurality of pixels and a backlight panel spaced apart from one surface of the image displaying panel. The backlight panel includes a first line source set having a plurality of line sources arranged at regular intervals and a second line source set having line sources arranged spaced apart from the respective line sources of the first line source set by a predetermined interval. The first line source set and the second line source set are driven alternately. Thus, in a case where a horizontal location of an observer varies, the change of brightness of image information and the crosstalk between adjacent visual fields are minimized, and pseudo-stereoscopic vision is prevented. Also, the irregularity of brightness distribution in a visual field may be solved. | 09-27-2012 |
20120242569 | DISPLAY - A display capable of performing optimum stereoscopic display according to a view position is provided. A display includes: a display section including a plurality of first pixels to a plurality of nth pixels, where n is an integer of 4 or more, and displaying a plurality of perspective images assigned to the first to nth pixels; a detection section detecting a view position of a viewer; and a display control section varying the number of the plurality of perspective images assigned to the first to nth pixels and varying a correspondence relationship between the first to nth pixels and the perspective images, according to the view position of the viewer. | 09-27-2012 |
20120242570 | DEVICE, HEAD MOUNTED DISPLAY, CONTROL METHOD OF DEVICE AND CONTROL METHOD OF HEAD MOUNTED DISPLAY - A device includes a detection unit that detects states of eyelids of a user, and a control unit that performs operations in response to the states of the eyelids of the user detected by the detection unit. | 09-27-2012 |
20120242571 | Data Manipulation Transmission Apparatus, Data Manipulation Transmission Method, and Data Manipulation Transmission Program - A data manipulation transmission apparatus including: an object acquisition section for acquiring a display object, generated on display screen | 09-27-2012 |
20120242572 | SYSTEM AND METHOD FOR TRANSACTION OF SENSORY INFORMATION - A system and method for transaction of sensory information are provided. A sensory effect extraction apparatus of the system includes a sensory effect extraction unit to extract a sensory effect from an image content in accordance with a sensory effect extraction signal, and a sensory information transmission unit to transmit sensory information based on the extracted sensory effect. | 09-27-2012 |
20120242573 | Vector-Specific Haptic Feedback - In one or more embodiments, vector-specific movement can be imparted to a user interface device (UID) to provide vector-specific haptic feedback. In at least some embodiments, this vectored movement can be based on input received by the UID. The input can include information associated with the user's interaction with an associated device integrated with or communicatively linked with the UID, and or with an application implemented on the associated device. In at least some embodiments, the UID can be configured with a controller, a microprocessor(s), and a vector-specific actuator that includes an electrically-deformable material. | 09-27-2012 |
20120249408 | OPTO-ELECTRONIC DISPLAY ASSEMBLY - An electronic display arrangement for taking light signals forming an image emitted from a miniature screen ( | 10-04-2012 |
20120249409 | METHOD AND APPARATUS FOR PROVIDING USER INTERFACES - An apparatus, method, and computer program product are provided for generating a projected user interface. The apparatus may include at least one processor and at least one memory including computer program code. The at least one memory and the computer program code may be configured, with the processor, to cause the apparatus to receive information regarding a detected position of the user's body and to determine whether the detected position is an activation position, in which case the projection of a user interface may be provided. The user interface may be projected on an area on the user's body, such as a hand or a forearm, or on the surface of an object. The activation position may thus be a predefined position of the user's body in which effective projection of the user interface onto the surface and user interaction with the user interface is facilitated. | 10-04-2012 |
20120249410 | PROJECTION UNIT AND METHOD OF CONTROLLING THE SAME - A projection unit comprises a first light source outputting light that is used to project an image during normal projection unit use, and a second light source outputting light of a different intensity that is used to project an image outside of normal projection unit use. | 10-04-2012 |
20120249411 | SCREEN PROTECTION SYSTEM AND METHOD OF AN ELECTRONIC DEVICE - In a screen protection system and method, a first operation temperature of an electronic device is detected from each of one or more first temperature sensors, and a first ambient temperature is detected from a second temperature sensor, when a display screen is activated. Once there is no operation on the electronic device, a timer starts timing. Once the electronic device is operated again, a duration is temporarily stored and the timer is reset. If the duration is equal to a screensaver time, a second operation temperature of the electronic device is detected from each of the first temperature sensors, and a second ambient temperature is detected from the second temperature sensor. The method further determines whether the electronic device is currently being held by a hand of a user, according to the above-mentioned temperatures. If the electronic device is not being held by a hand of the user, the display screen is controlled to be in an inactive state. | 10-04-2012 |
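The temperature-based hold detection in the entry above can be illustrated with a minimal sketch. The decision rule, threshold value, and data shapes below are assumptions for illustration only, not the patented implementation:

```python
def is_held_by_hand(op_temps_before, ambient_before,
                    op_temps_after, ambient_after,
                    rise_threshold=2.0):
    """Heuristic: if the device's operation temperature rose noticeably more
    than the ambient temperature over the idle period, assume a hand is
    warming the casing. Temperatures in degrees Celsius; rise_threshold
    is an illustrative assumption."""
    device_rise = max(op_temps_after) - max(op_temps_before)
    ambient_rise = ambient_after - ambient_before
    return (device_rise - ambient_rise) > rise_threshold

# Once the screensaver time elapses and no hand is detected, the controller
# would switch the display screen to an inactive state.
if not is_held_by_hand([30.0, 31.0], 24.0, [30.2, 31.1], 24.1):
    pass  # deactivate the display here
```

A real device would read the first and second temperature sensors directly; here the readings are passed in as plain lists and floats to keep the sketch self-contained.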
20120249412 | SUSPENDED INPUT SYSTEM - Provided herein are input devices, systems, and methods. Some of the embodiments provided herein employ magnetic levitation of a controller of the input system so as to allow various benefits to a user's experience of the input system. | 10-04-2012 |
20120249413 | INPUT DEVICE AND IMAGE DISPLAY APPARATUS - An input device including: an operation device including: a flexible base member; a first detector configured to detect that the base member is being bent; and a second detector configured to detect that the base member is being nipped; and an output section connected to the first detector and the second detector, the output section being configured to output a first signal when the first detector has detected that the base member is being bent and configured to output a second signal when the second detector has detected that the base member is being nipped. | 10-04-2012 |
20120249414 | Marking one or more items in response to determining device transfer - A computationally implemented method includes, but is not limited to: determining that a computing device that was presenting one or more portions of one or more items and that was in possession of a first user has been transferred from the first user to a second user; and marking, in response to said determining, the one or more portions of the one or more items to facilitate the computing device in returning to the one or more portions upon the computing device being at least transferred back to the first user. In addition to the foregoing, other method aspects are described in the claims, drawings, and text forming a part of the present disclosure. | 10-04-2012 |
20120249415 | SERVER, TERMINAL DEVICE, AND GROUPING METHOD - A server, which is connected to a plurality of terminal devices, includes a device ID (identification) storage unit configured to store device ID items for identifying the terminal devices; an acquiring unit configured to acquire coordinate information that is input at the terminal devices identified by the device ID items and time information that is input; a positional relationship determining unit configured to determine positional relationships between the terminal devices according to a sequence of coordinates that is input in a manner to cross over the terminal devices, based on the coordinate information and the time information; and a group determining unit configured to extract, from determination results of the positional relationship determining unit, the device ID items identifying the terminal devices that are to be grouped together into a group. | 10-04-2012 |
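The positional-relationship determination in the entry above, where a single stroke crossing several terminals reveals their ordering, can be sketched as follows. The event format and the time-ordering heuristic are illustrative assumptions, not the patented method:

```python
def group_terminals(events):
    """events: list of (device_id, timestamp, (x, y)) tuples recorded as one
    stroke crosses several terminals. Returns device IDs ordered by the time
    the stroke first reached each device -- the inferred positional
    relationship of the terminals to be grouped together."""
    # Keep the earliest touch time per device, then sort devices by that time.
    first_touch = {}
    for device_id, t, _coord in events:
        if device_id not in first_touch or t < first_touch[device_id]:
            first_touch[device_id] = t
    return sorted(first_touch, key=first_touch.get)
```

For example, a stroke that touches terminal A at t=1.0, B at t=2.0, and C at t=3.0 yields the group ordering `["A", "B", "C"]`, regardless of the order in which the server received the events.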
20120249416 | MODULAR MOBILE CONNECTED PICO PROJECTORS FOR A LOCAL MULTI-USER COLLABORATION - The various embodiments include systems and methods for rendering images in a virtual or augmented reality system that may include capturing scene images of a scene in a vicinity of a first and a second projector, capturing spatial data with a sensor array in the vicinity of the first and second projectors, analyzing captured scene images to recognize body parts, and projecting images from each of the first and the second projectors with a shape and orientation determined based on the recognized body parts. Additional rendering operations may include tracking movements of the recognized body parts, applying a detection algorithm to the tracked movements to detect a predetermined gesture, applying a command corresponding to the detected predetermined gesture, and updating the projected images in response to the applied command. | 10-04-2012 |
20120249417 | INPUT APPARATUS - An input apparatus includes a gesture sensing unit for sensing a hand gesture of a user; and an input signal generation unit for generating an input signal for control of a target electronic device based on the sensed hand gesture. The hand gesture of the user includes at least one of a pinching gesture with two or more fingers, a pointing gesture with one finger, and a scratching gesture with one finger toward a camera. | 10-04-2012 |
20120249418 | OPTICAL POSITION DETECTION DEVICE, OPTICAL POSITION DETECTION SYSTEM, AND DISPLAY SYSTEM WITH INPUT FUNCTION - In an optical position detection device, the XY coordinates of a target object are detected based on received light intensity of a light receiving section by forming a light intensity distribution, in which the intensity changes in a radiation angle range of detection light, with first and second light source modules. The first and second light source modules are separated from each other in the Z-axis direction, and the position of the target object in the Z-axis direction is detected based on the received light intensity of the light receiving section when forming the light intensity distribution in which the intensity is fixed in the radiation angle range of detection light. | 10-04-2012 |
20120256820 | Methods and Systems for Ergonomic Feedback Using an Image Analysis Module - A display device can be used with an ergonomic sensor comprising an imaging device interfaced to processing hardware to obtain and analyze image data depicting a user of the display device. The ergonomic sensor can be preconfigured with data indicating ergonomic uses of the display device so that the image of the user can be analyzed with minimal or no user calibration or setup. Instead, the ergonomic sensor can analyze the image data to provide real-time feedback, such as warnings or suggestions when the user's behavior falls outside an ergonomic use range for the display device. In some implementations, the ergonomic sensor is integrated with the display device, though in other implementations a separate element or preexisting imaging device can be used. | 10-11-2012 |
20120256821 | USER INTERFACE DEVICES, APPARATUS, AND METHODS - User interface devices using magnetic sensing to provide output signals associated with motion and/or deformation of an actuator element of the interface devices are described. The output signals may be provided to an electronic computing system to provide commands, controls, and/or other data or information. In one embodiment, a user interface device may include a plurality of permanent magnets and a plurality of multi-axis magnetic sensors to generate motion and/or deformation signals to be provided to a processing element to generate the output signals. | 10-11-2012 |
20120256822 | LEARNER RESPONSE SYSTEM - There is provided a method, in a collaborative input system comprising a computer and a plurality of user terminals, in which system the user terminals are adapted to communicate with the computer system, the method comprising: defining one or more user groups; and allocating one or more of the plurality of user terminals to the one or more user groups. | 10-11-2012 |
20120256823 | TRANSPARENT DISPLAY APPARATUS AND METHOD FOR OPERATING THE SAME - A transparent display apparatus and a method for operating the same are disclosed. The method may include detecting an object located proximate to the transparent image display apparatus; determining a position of the detected object relative to the transparent image display apparatus; and selecting, from among multiple, different augmented object displays associated with the detected object, an augmented object display based on the determined position of the detected object relative to the transparent image display apparatus. A first augmented object display may be selected based on the determined position being a first position relative to the transparent image display apparatus, and a second augmented object display may be selected based on the determined position being a second position relative to the transparent image display apparatus. Display of the selected augmented object display is controlled on the transparent image display apparatus. | 10-11-2012 |
20120256824 | PROJECTION DEVICE, PROJECTION METHOD AND PROJECTION PROGRAM - According to an illustrative embodiment, an information processing apparatus is provided. The apparatus is used for processing a first image projected toward a target. The apparatus includes a processing unit for detecting that an object exists between a projector unit and the target, wherein when an object exists between the projector unit and the target, the apparatus determines an area of the object and generates a modified first image, based on the area of the object, for projection toward the target. | 10-11-2012 |
20120256825 | OPTICAL POSITION DETECTION DEVICE, LIGHT RECEIVING UNIT, AND DISPLAY SYSTEM WITH INPUT FUNCTION - An optical position detection device includes a light receiving section receiving detection light reflected from a target object located in a detectable space through which detection light is radially emitted along an XY plane. The light receiving section includes a light receiving element and a concave mirror. A first cross section (XY cross section) of the reflective surface of the concave mirror is an arc, and a second cross section (YZ cross section) perpendicular to the first cross section is a quadratic curve. Therefore, in the in-plane direction of the XY plane, even light incident from an oblique direction with respect to the light receiving section is reflected by the concave mirror to the light receiving element. In the in-plane direction of the YZ plane, however, the range where the light reaches the light receiving element is limited via the concave mirror. | 10-11-2012 |
20120256826 | PORTABLE TELEPHONE - In a portable telephone according to the present invention, a display displays a block indicative of an operator, predetermined information and a pointer; the operator can be operated in directions opposite to each other; and the controller controls the display so as to shift the pointer to a desirable position within the predetermined information on a screen of the display in accordance with an operation of the operator, and also to display, adjacent to the block along the shift direction of the operator, a mark indicating a direction in which the pointer can be shifted and in which the predetermined information exists. | 10-11-2012 |
20120262365 | OBJECT TRACKING WITH PROJECTED REFERENCE PATTERNS - Systems and methods for tracking an object's position and orientation within a room using patterns projected onto an interior surface of the room, such as the ceiling. Systems include at least one optical position sensitive device embedded in the object to detect relative changes in the projected pattern as the object's position and/or orientation is changed. In particular systems, the pattern includes a plurality of beacons projected by one or more steerable lasers. Projected beacons may be steered automatically to accommodate a variety of room topologies. Additional optical position sensitive devices may be disposed in known physical positions relative to the projected pattern to view either or both the projected pattern and the object. A subset of object positional data may be derived from a video camera viewing the object while another subset of object positional data is derived from the projected pattern. | 10-18-2012 |
20120262366 | ELECTRONIC SYSTEMS WITH TOUCH FREE INPUT DEVICES AND ASSOCIATED METHODS - Embodiments of electronic systems, devices, and associated methods of operation are described herein. In one embodiment, a computing system includes an input module configured to acquire images of an input device from a camera, the input device having a plurality of markers. The computing system also includes a sensing module configured to identify segments in the individual acquired images corresponding to the markers. The computing system further includes a calculation module configured to form a temporal trajectory of the input device based on the identified segments and an analysis module configured to correlate the formed temporal trajectory with a computing command. | 10-18-2012 |
20120262367 | FLEXIBLE ELECTRONIC DEVICE - A flexible electronic device can be operated in multiple operation modes. The flexible electronic device mainly includes a substrate, a flexible display module, at least one folding device, a button module, and a processing unit. The flexible display module is located above the substrate, and the at least one folding device is used for folding the display. When the display is unfolded, a main display is launched on the display. When the display is folded, the main display can be divided into a plurality of sub-display zones such that images or graphics can be displayed on one of the sub-display zones. The button module is located on one of the sub-display zones. The processing unit is electronically connected to the button module in order to operate the flexible electronic device. | 10-18-2012 |
20120262368 | INFORMATION PROCESSING APPARATUS, CONTROL METHOD OF INFORMATION PROCESSING APPARATUS, AND PROGRAM - It enables a user to easily confirm whether or not a process corresponding to a key was correctly performed. To do so, there is provided a control method for controlling an information processing apparatus, comprising: controlling, in a case where, after a key in a screen displayed on a display unit was depressed, the key comes to be not depressed inside a display area of the key, to perform a process corresponding to the key, and, in a case where, after the key in the screen displayed on the display unit was depressed, the key comes to be not depressed outside the display area of the key, to not perform the process corresponding to the key; and notifying, in the case where the key comes to be not depressed outside the display area of the key, the user that the process corresponding to the key is not performed. | 10-18-2012 |
20120268359 | CONTROL OF ELECTRONIC DEVICE USING NERVE ANALYSIS - An electronic device may be controlled using nerve analysis by measuring a nerve activity level for one or more body parts of a user of the device using one or more nerve sensors associated with the electronic device. A relationship can be determined between the user's one or more body parts and an intended interaction by the user with one or more components of the electronic device using each nerve activity level determined. A control input or reduced set of likely actions can be established for the electronic device based on the relationship determined. | 10-25-2012 |
20120268360 | User Identified to a Controller - Methods, systems, and computer programs for configuring a computer program based on a user are provided. One method includes an operation for detecting, by a controller, an object carried by a user, where the object includes a parameter value (e.g., from a radio-frequency identification (RFID) tag) that uniquely identifies the object from a plurality of objects. The parameter value is transmitted to a computing device executing the computer program, and the computer program determines if the computer program has user information associated with the transmitted parameter value. The computer program is configured utilizing the user information, when the computer program has the user information for the parameter value. | 10-25-2012 |
20120268361 | HAND-HELD ELECTRONIC DEVICE AND OPERATION METHOD APPLICABLE THERETO - An operation method, applicable to a hand-held electronic device having a display unit and a social networking share hardware button, includes: detecting whether the social networking share hardware button is triggered; and in response to the social networking share hardware button being triggered, posting user share content on a social network based on content displayed on the display unit. | 10-25-2012 |
20120268362 | LASER DIODE MODES - Laser diode mode techniques are described. In one or more implementations, one or more laser diodes of a computing device are caused to operate below a lasing threshold to illuminate at least part of a physical surroundings of the computing device. One or more images of the illuminated physical surroundings are captured by a camera of the computing device and one or more inputs are recognized from the captured one or more images for interaction with a user interface displayed by the computing device. | 10-25-2012 |
20120268363 | IMAGE PROCESSING APPARATUS, IMAGE PROCESSING SYSTEM, IMAGE PROCESSING METHOD, AND COMPUTER READABLE MEDIUM - An image processing apparatus includes an imaged image acquiring unit, an identifying unit, an acquiring unit, and a display control unit. The imaged image acquiring unit acquires an imaged image. The identifying unit identifies an object imaged in the imaged image. The acquiring unit acquires image information on the object stored in association with the object identified by the identifying unit. The display control unit displays at least part of an image area of an object image based on the image information on the object. | 10-25-2012 |
20120268364 | FAST FINGERTIP DETECTION FOR INITIALIZING A VISION-BASED HAND TRACKER - Systems and methods for initializing real-time, vision-based hand tracking systems are described. The systems and methods for initializing the vision-based hand tracking systems image a body and receive gesture data that is absolute three-space data of an instantaneous state of the body at a point in time and space, and at least one of determine an orientation of the body using an appendage of the body and track the body using at least one of the orientation and the gesture data. | 10-25-2012 |
20120268365 | METHOD, SYSTEM, AND APPARATUS FOR CONTROLLING LIGHT - Methods, systems and apparatuses for controlling light. A method of controlling light includes displaying on a display unit at least one light property representing a property of the light of a lighting apparatus; if a user command for selecting a property value of the light property is input, displaying a lighting state corresponding to the selected property value; and if a user command representing that selection is completed is input, determining the displayed lighting state as a lighting state of the lighting apparatus. | 10-25-2012 |
20120268366 | Method and Device for Visual Compensation | 10-25-2012 |
20120268367 | Method and Apparatus for Communication Between Humans and Devices - This invention relates to methods and apparatus for improving communications between humans and devices. The invention provides a method of modulating operation of a device, comprising: providing an attentive user interface for obtaining information about an attentive state of a user; and modulating operation of a device on the basis of the obtained information, wherein the operation that is modulated is initiated by the device. Preferably, the information about the user's attentive state is eye contact of the user with the device that is sensed by the attentive user interface. | 10-25-2012 |
20120268368 | Method, Apparatus, and Article for Force Feedback Based on Tension Control and Tracking Through Cables - A haptic interface system includes a cable based haptic interface device and a controller. The controller receives information related to movement of a grip in real-space and generates a stereoscopic output for a display device. The stereoscopic output includes images of a virtual reality tool whose motions mimic motions of the real-space grip. | 10-25-2012 |
20120274545 | PORTABLE ELECTRONIC DEVICE AND METHOD OF CONTROLLING SAME - A method includes displaying a projected image from a display of a portable electronic device, detecting an object near the portable electronic device, and when the object is associated with the projected image, actuating an actuator to provide tactile feedback. | 11-01-2012 |
20120274546 | Data input glove with instantaneous chord detection - The invention describes a data glove input device that relies on a novel chording mechanism. The device comprises one or two gloves with conductive elements covering the finger tips and additional conductive elements on the palm and the thumb. The conductive elements are divided into two groups of opposite polarity. A chord is formed when two or more conductive elements of the same polarity are held in contact with each other. The device generates an output when a conductive element or a chord of one polarity makes or breaks contact with a conductive element or chord of the opposite polarity. The innovation lies in the large number of key combinations supported using this chording mechanism in an easily accessible manner. | 11-01-2012 |
20120274547 | TECHNIQUES FOR CONTENT NAVIGATION USING PROXIMITY SENSING - Techniques for content navigation utilize proximity sensing so that user interaction with a graphical user interface is based at least in part on both contact with a surface and contactless interaction with the surface. A representation of an object detected as being proximate to and/or in contact with a surface appears on a display, which may be separate from the surface. The representation may appear at a location of the display that is determined according to a mapping of surface locations to display locations. The representation is updated based at least in part on movement of the object relative to the surface and one or more distances of the object from the surface. | 11-01-2012 |
20120274548 | PLIABLE FINGERTIP KEY DEPRESSOR FOR USE WITH SMALL KEYBOARDS - An apparatus for depressing keys on a small keyboard is disclosed. The apparatus includes a flexible substrate including a first surface, a second surface opposite the first surface, a first peripheral end portion, and a second peripheral end portion opposite the first end portion. An adhesive is coupled to the first surface of the flexible substrate such that the substrate may be attached to a digit on the user's hand, e.g., the user's thumb. When the apparatus is attached to a user's thumb, the first peripheral end portion does not touch the second peripheral end portion. Moreover, the apparatus includes a nodule that extends outwardly from the second surface of the flexible substrate such that when the substrate is attached to the user's thumb the nodule is outwardly positioned on the user's thumb, wherein the nodule is smaller than the tip of the user's thumb. | 11-01-2012 |
20120274549 | METHOD AND DEVICE FOR PROVIDING A USER INTERFACE IN A VEHICLE - In a method for providing a user interface in a vehicle, control objects and/or display objects can be displayed on a display surface ( | 11-01-2012 |
20120274550 | GESTURE MAPPING FOR DISPLAY DEVICE - Embodiments of the present invention disclose a gesture mapping method for a computer system including a display and a database coupled to a processor. According to one embodiment, the method includes storing a plurality of two-dimensional gestures for operating the computer system, and detecting the presence of an object within a field of view of at least two three-dimensional optical sensors. Positional information is associated with movement of the object, and this information is mapped to one of the plurality of gestures stored in the database. Furthermore, the processor is configured to determine a control operation for the mapped gesture based on the positional information and a location of the object with respect to the display. | 11-01-2012 |
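The gesture-mapping step described above, associating positional information of a tracked object with one of a set of stored two-dimensional gestures, can be sketched with a simple nearest-template match. The projection, the equal-length templates, and the distance metric are illustrative simplifications, not the patented method:

```python
import math

def match_gesture(trajectory_3d, gesture_db):
    """Project a 3D point trajectory onto the display plane (drop the depth
    axis) and match it against stored 2D gesture templates by average point
    distance. gesture_db maps gesture names to 2D point lists of the same
    length as the trajectory -- an illustrative stand-in for real
    trajectory resampling."""
    path_2d = [(x, y) for x, y, _z in trajectory_3d]

    def avg_distance(a, b):
        return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

    # Return the name of the stored gesture closest to the observed path.
    return min(gesture_db, key=lambda name: avg_distance(path_2d, gesture_db[name]))
```

In the patent's terms, the returned gesture name would then be combined with the object's location relative to the display to determine the control operation.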
20120274551 | ELECTRONIC DEVICE, SCREEN CONTROL METHOD, AND STORAGE MEDIUM STORING SCREEN CONTROL PROGRAM - According to an aspect, an electronic device includes a first display unit, a second display unit, a detecting unit, and a control unit. The first display unit displays a first object corresponding to a first function. The second display unit displays a second object corresponding to a second function. The detecting unit detects an operation. When the operation is detected by the detecting unit while the first object is displayed on the first display unit, the control unit dismisses the first object from the first display unit and displays information with respect to the first object on the second display unit. | 11-01-2012 |
20120274552 | INFORMATION DISPLAY SYSTEM, INFORMATION DISPLAY APPARATUS, INFORMATION DISPLAY METHOD, INFORMATION DISPLAY PROGRAM, INFORMATION PROVIDING APPARATUS, AND RECORDING MEDIUM - The present invention includes: receiving, from a terminal, display position information for specifying a display area | 11-01-2012 |
20120274553 | DISPLAY APPARATUS AND CONTROL METHOD THEREOF - A method of controlling a display apparatus includes: displaying a pointer corresponding to motion of a user, in which the user's motion is analyzed on the basis of images generated by sensing the user; and generating an input signal preset corresponding to the analyzed motion of the user. With this configuration, a user can input a command without using an additional physical device or being subject to spatial limitations. | 11-01-2012 |
20120274554 | BODY MOVEMENT DETECTION DEVICE AND DISPLAY CONTROL METHOD OF BODY MOVEMENT DETECTION DEVICE - A body movement detection device is provided with a main body unit, a display unit provided to the main body unit, a control unit, and a detection unit that detects acceleration of the main body unit, and the control unit includes a discriminating unit for discriminating a movement state of a user wearing the main body unit based on the acceleration detected by the detection unit, and a display control unit for switching a display state of the display unit based on discrimination of the movement state by the discriminating unit. Display can thereby be automatically switched to display appropriate to the state of physical activity. | 11-01-2012 |
20120274555 | TACTILE USER INTERFACE - An apparatus configured as a virtual user interface, including: a display screen; a masking element for concealing at least part of said screen and revealing at least one preselected display area; at least one user-actuated control element; and a controller responsive to said user-actuated control element and configured to display information on said display area operatively associated with said user-actuated control element. | 11-01-2012 |
20120274556 | DISPLAY APPARATUS AND DISPLAY METHOD - A display apparatus comprising a display panel including first and second display regions arranged in an alternating manner and respectively passing a first polarization component light and a second polarization component of incident light; a light source panel including first and second light sources arranged in an alternating manner and respectively emitting light of the first polarization component and light of the second polarization component to the back surface of the display panel; and a lenticular lens that is positioned between the display panel and the light source panel, refracts light from the first light sources in a direction of a first viewpoint to provide the first viewpoint with the light passed by the first display regions, and refracts light from the second light sources in a direction of a second viewpoint to provide the second viewpoint with the light passed by the second display regions. | 11-01-2012 |
20120274557 | Computer System With Digital Micromirror Device - Algorithms stored on one or more computer readable medium for interfacing an imaging display with an electronic-ink generating system are described. The algorithms include instructions for capturing an electronic-ink image, instructions for converting the electronic-ink image into control instructions for controlling an electromechanical aspect of the imaging display, and instructions for providing the control instructions to control circuitry of the imaging display. Interfacing projection systems and methods of interfacing electronic-ink images are also described. | 11-01-2012 |
20120280897 | Attribute State Classification - Attribute state classification techniques are described. In one or more implementations, one or more pixels of an image are classified by a computing device as having one or several states for one or more attributes that do not identify corresponding body parts of a user. A gesture is recognized by the computing device that is operable to initiate one or more operations of the computing device based at least in part on the state classifications of the one or more pixels of one or more attributes. | 11-08-2012 |
20120280898 | METHOD, APPARATUS AND COMPUTER PROGRAM PRODUCT FOR CONTROLLING INFORMATION DETAIL IN A MULTI-DEVICE ENVIRONMENT - A method is provided for controlling information detail in a multi-device environment. In particular, example methods may provide for operating a device in a multi-device environment, directing the presentation, on a display of the device, of a first image, detecting a motion of the device, directing a change of the image presented on the display of the device from the first image to a second image in response to detecting the motion of the device. The first image presented on the device is related to images displayed on other devices in the multi-device environment. The second image may be a scaled version of the first image and the second image may be scaled based on at least one property of the motion. Each device in the multi-device environment may be directed to display a portion of a complete image, where the first image is a portion of the complete image. | 11-08-2012 |
20120280899 | METHODS AND APPARATUSES FOR DEFINING THE ACTIVE CHANNEL IN A STEREOSCOPIC VIEW BY USING EYE TRACKING - Methods and apparatuses are provided for facilitating interaction with a three-dimensional user interface. A method may include receiving an indication of an eye movement input to an imaging sensor. The method may further include determining, by a processor, a relation of the eye movement to at least one displayed element of a three-dimensional user interface. The method may additionally include causing, based at least in part on the determined relation, a selection of at least one displayed element of the three-dimensional user interface. The method may also include causing a modification of a displayed element of the at least one selected displayed element of the three-dimensional image. Corresponding apparatuses are also provided. | 11-08-2012 |
20120280900 | GESTURE RECOGNITION USING PLURAL SENSORS - Apparatus comprises a processor; a user interface enabling user interaction with one or more software applications associated with the processor; first and second sensors configured to detect, and generate signals corresponding to, objects located within respective first and second sensing zones remote from the apparatus, wherein the sensors are configured such that their respective sensing zones overlap spatially to define a third, overlapping, zone in which both the first and second sensors are able to detect a common object; and a gesture recognition system for receiving signals from the sensors, the gesture recognition system being responsive to detecting an object inside the overlapping zone to control a first user interface function in accordance with signals received from both sensors. | 11-08-2012 |
20120280901 | ENVIRONMENT-DEPENDENT DYNAMIC RANGE CONTROL FOR GESTURE RECOGNITION - Technologies are generally described for environment-dependent dynamic range control for gesture recognition. In some examples, user environment including, but not limited to, location, device size, virtual or physical display size, is detected and gesture control range adjusted according to the detected environment. In other examples, a controller user interface or dynamic range status indicator may be adjusted based on the modified gesture recognition range control. | 11-08-2012 |
20120280902 | PROXIMITY SENSOR MESH FOR MOTION CAPTURE - Apparatuses for motion capture are disclosed that include a surface configured to support an object; and at least one sensor arranged with the surface, wherein the at least one sensor is configured to communicate with one or more remote sensors to obtain at least one of ranging or inertial information for use in a kinematic model of the object. A method for motion capture is also disclosed that includes providing a surface configured to support an object; and arranging at least one sensor with the surface, wherein the at least one sensor is configured to communicate with one or more remote sensors to obtain at least one of ranging or inertial information for use in a kinematic model of the object. | 11-08-2012 |
20120280903 | Motion Sensing Display Apparatuses - Motion-sensing display apparatuses supported near a user's eye including partially transparent screens at least partially disposed within the user's field of vision, image generators positioned to display an image on a first side of the screen, motion capture devices positioned near the screen and configured to capture a user gesture occurring beyond the screen in the user's field of vision, and processors in data communication with the image generator and the motion capture device, the processors configured to execute computer executable instructions in response to the user gesture. In some examples, motion-sensing display apparatuses include cameras. In some further examples, image generators display user interfaces on screens. | 11-08-2012 |
20120280904 | METHOD FOR DETECTING GESTURES USING A MULTI-SEGMENT PHOTODIODE AND ONE OR FEWER ILLUMINATION SOURCES - A gesture sensing device includes a multiple segmented photo sensor and a control circuit for processing sensed voltages output from the sensor. The control circuit processes the sensed voltage signals to determine target motion relative to the segmented photo sensor. The control circuit includes an algorithm configured to calculate one or more differential analog signals using the sensed voltage signals output from the segmented photo sensor. A vector is determined according to the calculated differential analog signals, and the vector is used to determine a direction and/or velocity of the target motion. | 11-08-2012 |
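The differential-signal approach in the abstract above can be illustrated with a short sketch. The patent publishes no source code, so the four-quadrant segment layout, the sign conventions, and the function names here are all illustrative assumptions:

```python
def gesture_vector(frames):
    """Estimate a 2-D motion vector from a time sequence of readings of a
    four-segment photodiode.  Each frame is (A, B, C, D), with A|B the top
    row and C|D the bottom row -- a hypothetical layout, since the patent
    abstract does not fix one."""
    # Differential analog signals, as the abstract describes:
    lr = [(a + c) - (b + d) for a, b, c, d in frames]  # left minus right
    tb = [(a + b) - (c + d) for a, b, c, d in frames]  # top minus bottom
    # A target sweeping rightward drives lr from positive to negative, so
    # the net fall of lr indicates rightward motion; likewise the net fall
    # of tb indicates downward motion.
    dx = lr[0] - lr[-1]
    dy = tb[0] - tb[-1]
    return dx, dy
```

With more frames, the spacing between the zero crossings of the two differentials would also give the velocity the abstract mentions.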
20120280905 | IDENTIFYING GESTURES USING MULTIPLE SENSORS - Systems and methods for recognizing human gestures are disclosed. In one embodiment, a method for recognizing a gesture made by an operator with a portable device, may comprise: obtaining a first sensor data profile associated with measurements made by the first sensor while the operator made a specific gesture involving the portable device; obtaining a second sensor data profile associated with measurements made by the second sensor while the operator made the specific gesture involving the portable device; and identifying the specific gesture by analyzing the first sensor data profile and the second sensor data profile. | 11-08-2012 |
20120280906 | Portable Information Processing Device - A portable information processing device ( | 11-08-2012 |
20120280907 | Portable Information Processing Device and Media Data Replay System - A portable information processing device ( | 11-08-2012 |
20120280908 | Smartphone-Based Methods and Systems - Arrangements involving portable devices (e.g., smartphones and tablet computers) are disclosed. One arrangement enables a content creator to select software with which that creator's content should be rendered—assuring continuity between artistic intention and delivery. Another utilizes a device camera to identify nearby subjects, and take actions based thereon. Others rely on near field chip (RFID) identification of objects, or on identification of audio streams (e.g., music, voice). Some technologies concern improvements to the user interfaces associated with such devices. Others involve use of these devices in connection with shopping, text entry, sign language interpretation, and vision-based discovery. Still other improvements are architectural in nature, e.g., relating to evidence-based state machines, and blackboard systems. Yet other technologies concern use of linked data in portable devices—some of which exploit GPU capabilities. Still other technologies concern computational photography. A great variety of other features and arrangements are also detailed. | 11-08-2012 |
20120287031 | PRESENCE SENSING - One embodiment may take the form of a method of operating a computing device to provide presence based functionality. The method may include operating the computing device in a reduced power state and collecting a first set of data from a first sensor. Based on the first set of data, the computing device determines if an object is within a threshold distance of the computing device and, if the object is within the threshold distance, the device activates a secondary sensor to collect a second set of data. Based on the second set of data, the device determines if the object is a person. If the object is a person, the device determines a position of the person relative to the computing device and executes a change of state in the computing device based on the position of the person relative to the computing device. If the object is not a person, the computing device remains in a reduced power state. | 11-15-2012 |
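The two-stage flow described in the presence-sensing abstract above — a cheap proximity check gating a more expensive person/position classification — can be sketched as a single decision step. The threshold value, state names, and the `classify` callback standing in for the secondary-sensor analysis are all assumptions, not taken from the patent:

```python
def presence_step(proximity_cm, classify, threshold_cm=100):
    """One decision step of the two-stage presence flow: a first sensor's
    proximity reading gates a secondary sensor, whose result (is_person,
    position) picks the device state.  Names are illustrative."""
    if proximity_cm > threshold_cm:
        return "reduced_power"            # nothing nearby: stay asleep
    is_person, position = classify()      # activate the secondary sensor
    if not is_person:
        return "reduced_power"            # e.g. a pet or a chair
    # A person was found: change state based on where they are.
    return "wake_front" if position == "front" else "wake_side"
```

The point of the gating is that the secondary sensor is only powered when the cheap first-stage reading justifies it, which is what keeps the device in a reduced power state most of the time.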
20120287032 | SLIM PROFILE MAGNETIC USER INTERFACE DEVICES - Slim profile magnetic user interface devices (slim UIDs) are disclosed. A slim UID may include a slim profile housing, a movable actuator assembly having user contact surfaces on opposite sides, along with a magnet, magnetic sensor, restoration element, and processing element. User mechanical interaction with the actuator element may be sensed by the magnetic sensor and processed to generate output signals usable by a coupled electronic computing system. | 11-15-2012 |
20120287033 | OPERATION INPUT DEVICE - An operation input device includes: an operation body having a handle portion, tilting around a rotation center point when a user tilts the operation axis line of the handle portion, and tilting in a predetermined number of tilting directions; multiple detection portions, the number of the detection portions being less than the predetermined number of tilting directions, each detection portion outputting a first output value when the operation body tilts in a direction corresponding to the detection portion and outputting a second output value when the operation body tilts in a direction not corresponding to the detection portion; and a determination device that determines a tilting direction of the operation body based on information on the number of first output values and information on a part of the detection portions that have outputted the first output values. | 11-15-2012 |
20120287034 | METHOD AND APPARATUS FOR SHARING DATA BETWEEN DIFFERENT NETWORK DEVICES - Disclosed are a user interface for a data sharing function according to network connection between network electronic devices and a user device for operating a data sharing function using same. The method for sharing data between network electronic devices includes: searching network electronic devices located at a periphery of a user device when an input for performing a data sharing function is sensed; classifying the searched network electronic devices into transmission side network electronic devices and reception side network electronic devices; allotting the searched network electronic devices to a first region for receiving data and a second region for transmitting the data, respectively; and configuring and displaying a user interface for a data sharing function based on the network electronic devices allotted to the first region and the second region. | 11-15-2012 |
20120287035 | Presence Sensing - One embodiment may take the form of a method of operating a computing device in a reduced power state and collecting a first set of data from at least one sensor. Based on the first set of data, the computing device determines a probability that an object is within a threshold distance of the computing device and, if so, the device activates at least one secondary sensor to collect a second set of data. Based on the second set of data, the device determines if the object is a person. If it is a person, a position of the person relative to the computing device is determined and the computing device changes its state based on the position of the person. If the object is not a person, the computing device remains in a reduced power state. | 11-15-2012 |
20120287036 | PORTABLE TERMINAL DEVICE HAVING AN ENLARGED-DISPLAY FUNCTION, METHOD FOR CONTROLLING ENLARGED DISPLAY, AND COMPUTER-READ-ENABLED RECORDING MEDIUM - The disclosed portable terminal device is provided with: a first display unit ( | 11-15-2012 |
20120287037 | LIGHT-EMITTING DEVICE, AND LIQUID CRYSTAL DISPLAY DEVICE AND IMAGE DISPLAY DEVICE THAT USE THE SAME - A light-emitting device used in a liquid crystal display device is provided with a planar illuminator that focuses emitted light on a predetermined light focus point, and an optical deflector that two dimensionally deflects the light from the planar illuminator. The planar illuminator switches an emission direction of the light to enable alternate switching between a first light focus state in which the predetermined light focus point is the position of a right eye of a viewer and a second light focus state in which the predetermined light focus point is the position of a left eye of the viewer. The optical deflector can modulate each of the predetermined light focus point in the first light focus state and the predetermined light focus point in the second light focus state according to movement of the viewer. | 11-15-2012 |
20120287038 | Body Scan - A depth image of a scene may be received, observed, or captured by a device. The depth image may then be analyzed to determine whether the depth image includes a human target. For example, the depth image may include one or more targets including a human target and non-human targets. Each of the targets may be flood filled and compared to a pattern to determine whether the target may be a human target. If one or more of the targets in the depth image includes a human target, the human target may be scanned. A skeletal model of the human target may then be generated based on the scan. | 11-15-2012 |
20120287039 | USER INTERFACE FOR APPLICATION SELECTION AND ACTION CONTROL - Example embodiments disclosed herein relate to a computing device including a processor and a machine-readable storage medium, which may include instructions for displaying a first interface area in a user interface, the first interface area including a plurality of application selection controls, each corresponding to an application accessible to the computing device. The storage medium may further include instructions for displaying a second interface area in the user interface, the second interface area including a plurality of action controls, wherein each action control is associated with a function of the application corresponding to a currently-selected application selection control. Finally, the storage medium may include instructions for displaying a third interface area in the user interface, the third interface area comprising an interface of the application corresponding to the currently-selected application selection control. Example methods and machine-readable storage media are also disclosed. | 11-15-2012 |
20120293402 | MONITORING INTERACTIONS BETWEEN TWO OR MORE OBJECTS WITHIN AN ENVIRONMENT - One or more techniques and/or systems are provided for monitoring interactions by an input object with an interactive interface projected onto an interface object. That is, an input object (e.g., a finger) and an interface object (e.g., a wall, a hand, a notepad, etc.) may be identified and tracked in real-time using depth data (e.g., depth data extracted from images captured by a depth camera). An interactive interface (e.g., a calculator, an email program, a keyboard, etc.) may be projected onto the interface object, such that the input object may be used to interact with the interactive interface. For example, the input object may be tracked to determine whether the input object is touching or hovering above the interface object and/or a projected portion of the interactive interface. If the input object is in a touch state, then a corresponding event associated with the interactive interface may be invoked. | 11-22-2012 |
20120293403 | Method for Controlling Display Device by Using Double Buttons - A method for controlling display device by using double buttons is provided. The method comprises the following steps: pressing a power button of a double buttons so as to turn on the display device when the display device is turned off; pressing a menu button of the double buttons so as to start an On Screen Display (OSD) menu on a screen of the display device when the display device is turned on; after entering the OSD menu, pressing the power button or the menu button and surfing the OSD menu; | 11-22-2012 |
20120293404 | Low Cost Embedded Touchless Gesture Sensor - An array of independently addressable optical emitters and an array of independently addressable detectors are energized according to an optimized sequence to sense a performed gesture, generating feature vector frames that are compressed by a projection matrix and processed by a trained model to perform touchless gesture recognition. | 11-22-2012 |
20120293405 | DISPLAY DEVICE AND CONTROLLING METHOD - An image display device and controlling method capable of optimizing a state of the image display device for a user at a desired position. The display device includes: an imaging section that takes a dynamic image of a predetermined range with respect to an image display direction; an image analyzing section that analyzes the dynamic image taken by the imaging section and calculates a position of a user; a system optimization processing section that calculates system control information for optimizing a system based on the position of the user calculated by the image analyzing section; and a system controlling section that optimizes the system based on the system control information calculated by the system optimization processing section. | 11-22-2012 |
20120293406 | METHOD AND APPARATUS FOR PROCESSING INPUT IN MOBILE TERMINAL - A method for processing an input in a mobile terminal that recognizes inputs such as a visual input and/or voice input is provided. In the method, a position on a screen corresponding to a viewing direction of a user's eyes is determined. An indicator corresponding to the determined position is displayed on the screen. When a relevant (i.e. actuating) signal is detected, an indicator is arranged on the desired object for selection. | 11-22-2012 |
20120293407 | HEAD MOUNTED DISPLAY DEVICE AND IMAGE DISPLAY CONTROL METHOD THEREFOR - A Head Mounted Display (HMD) device and an image display control method are disclosed. The device includes a display unit including a left display and a right display for a left eye and a right eye for displaying images for the left eye and the right eye, a vital reaction sensor unit including a first vital reaction sensor for the left eye and a second vital reaction sensor for the right eye, detecting vital reaction changes of a user viewing the left display and the right display, and generating, when a vital reaction change is detected, an interruption signal including coordinates of a position at which the vital reaction change is detected, and a control unit for outputting images for the left eye and the right eye to the display unit. | 11-22-2012 |
20120293408 | TRACKING BIMANUAL MOVEMENTS - Hands may be tracked before, during, and after occlusion, and a gesture may be recognized. Movement of two occluded hands may be tracked as a unit during an occlusion period. A type of synchronization characterizing the two occluded hands during the occlusion period may be determined based on the tracked movement of the occluded hands. Based on the determined type of synchronization, it may be determined whether directions of travel for each of the two occluded hands change during the occlusion period. Implementations may determine that a first hand and a second hand are occluded during an occlusion period, the first hand having come from a first direction and the second hand having come from a second direction. The first hand may be distinguished from the second hand after the occlusion period based on a determined type of synchronization characterizing the two hands, and a behavior of the two hands. | 11-22-2012 |
20120293409 | MOBILE DEVICE AND DISPLAY CONTROL METHOD - According to an aspect, a mobile device includes an input unit, a display unit, a storage unit, and a control unit. The input unit detects operation input. The display unit displays a standby screen. The storage unit stores a plurality of objects to be displayed on the standby screen. The objects are associated with at least one of shortcut information or character information and are allocated with group information for classifying the objects. The control unit sets, on the standby screen, a plurality of divided regions divided on a group-by-group basis and causes the plurality of objects to be displayed in the divided regions corresponding to the respective groups to which the objects belong. | 11-22-2012 |
20120299810 | Fingertip Input Device - A Fingertip Input Device, composed of a bead and a ring or thimble, is worn around a fingertip. It is used to interact with a touch screen device and mimic the bare fingertip's actions. | 11-29-2012 |
20120299811 | TRANSFERRING RUI FROM ONE DEVICE TO ANOTHER - A method of and system for transferring a remote user interface from one device to another device is described herein. A server stores state information and uses the information to transfer the RUI and/or other data from one device to the other device. This enables a user to transition from one device to another device seamlessly and without interruption of their user interface and/or programming. | 11-29-2012 |
20120299812 | APPARATUS AND METHOD FOR CONTROLLING DATA OF EXTERNAL DEVICE IN PORTABLE TERMINAL - An apparatus and a method operate in a portable terminal to control data of an external device, for example, remotely controlling data output from the external device connected to the portable terminal without directly manipulating the portable terminal. The apparatus includes a camera for receiving an image of a hand and a controller for controlling data being output from the external device according to a gesture of the hand determined to be equivalent to a hand gesture image received through the camera while the controller is connected to and outputs data to the external device. | 11-29-2012 |
20120299813 | METHOD AND APPARATUS FOR INPUTTING USER COMMANDS USING RELATIVE MOVEMENTS OF DEVICE PANELS - A method and apparatus for inputting various operation instructions to a device including two movable panels. The method includes determining whether a relative angle between the first panel and the second panel is within an effective angle range; determining whether the relative angle within the effective angle range is maintained during an effective time; and inputting an operation instruction to the device based on whether the relative angle between the first panel and the second panel is within the effective angle range and whether the relative angle within the effective angle range is maintained during the effective time. | 11-29-2012 |
20120299814 | MOBILE TERMINAL AND MODE CONTROLLING METHOD THEREIN - A mobile terminal including a communication unit; a memory configured to store at least first and second operating systems including at least first and second modes, respectively; and a controller configured to display, in a first display region of a display unit of the mobile terminal, a first application indicator corresponding to a first application executable in the first mode using the first operating system and that can be activated by selecting the first application indicator, to display, in a second display region, a second application indicator corresponding to a second application executable in the second mode using the second operating system and that can be activated by selecting the second application indicator. Further, the first and second application indicators indicate whether the applications are executable in the first mode or the second mode, or executable in both the first and second modes. | 11-29-2012 |
20120299815 | DISPLAY DEVICE AND METHOD FOR REMOTELY CONTROLLING DISPLAY DEVICE - A display device and a method for remotely controlling the display device are disclosed. A controller executes a first application and a second application. A display displays a main window, which displays an execution screen of the executed first application, and a first sub window which displays an execution screen of the executed second application, on a screen. A receiver receives a focus switching signal from a first remote controller and a coupling signal containing an identifier from a second remote controller. The controller switches a focused window based on the received focus switching signal and implements coupling between the focused window and the second remote controller in response to the received coupling signal. | 11-29-2012 |
20120299816 | HYBRID DISPLAY APPARATUS AND DISPLAY METHOD THEREOF - A hybrid display apparatus and a display method thereof are provided. In the apparatus, an emissive type display panel outputs an internal light to the outside and thereby displays data. A reflective type display panel passes the internal light to the outside, reflects an external light, and thereby displays the data. An intermediate film layer is interposed between the emissive type display panel and the reflective type display panel. The intermediate film layer passes the internal light from the emissive type display panel to the reflective type display panel, and also blocks the external light that passes through the reflective type display panel and is reflected at the emissive type display panel. | 11-29-2012 |
20120299817 | Systems and Methods of Image Processing that Adjust for Viewer Position, Screen Size and Viewing Distance - Several embodiments of image processing systems and methods are disclosed herein whereby at least one characteristic of an image displayed on a target display is changed according to information regarding the viewer's position — e.g. distance to the target display, or the visual angle the target display subtends in the viewer's field of view. In one embodiment, luminance and/or contrast may be changed depending on the information regarding the viewer's position relative to the target display. | 11-29-2012 |
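The visual angle mentioned in the abstract above follows from standard geometry given the screen width and viewing distance; the abstract does not publish a formula or an adjustment policy, so the mapping from angle to contrast below is a hypothetical illustration only:

```python
import math

def subtended_angle_deg(screen_width_m, distance_m):
    """Visual angle (degrees) that a screen of the given width subtends at
    the viewer's eye: theta = 2 * atan(width / (2 * distance)).  Standard
    geometry, used here to illustrate the viewer-position input the
    abstract describes."""
    return math.degrees(2 * math.atan(screen_width_m / (2 * distance_m)))

def contrast_scale(angle_deg, full_angle_deg=40.0):
    """Hypothetical policy: scale contrast down linearly as the screen
    fills less of the field of view.  The actual mapping used by the
    patented system is not published."""
    return min(1.0, angle_deg / full_angle_deg)
```

For example, a 1 m wide screen viewed from 0.5 m subtends 90 degrees, while the same screen viewed from 5 m subtends about 11.4 degrees — the kind of difference such a system would compensate for.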
20120299818 | MEDICAL INFORMATION DISPLAY APPARATUS, OPERATION METHOD OF THE SAME AND MEDICAL INFORMATION DISPLAY PROGRAM - One of hierarchical images constituted by an appearance image representing an appearance of a subject body and a plurality of gradually magnified anatomy images schematically representing anatomical structures present inside of the body is displayed on a display unit. If a structure present at a desired position in the displayed image on the display unit specified by the specification input corresponds to a target examined region, the displayed image is switched to the examination information associated with the target examined region to display the examination information. | 11-29-2012 |
20120299819 | SENSOR IMAGE DISPLAY DEVICE AND METHOD - This disclosure provides a sensor image display device, which includes a display unit configured to display a sensor image generated based on a signal acquired by a sensor, a selection user interface for allowing a user to select a partial area from an area where the sensor image is displayed, and a sensor image generator for generating the sensor image by performing different signal processing for the selected area and one or more areas other than the selected area. | 11-29-2012 |
20120299820 | TOUCHLESS INTERFACES - The shape or position of an object is estimated using a device comprising one or more transmitters and one or more receivers, forming a set of at least two transmitter-receiver combinations. Signals are transmitted from the transmitters, through air, to the object. They are reflected by the object and received by the receivers. A subset of the transmitter-receiver combinations which give rise to a received signal meeting a predetermined clarity criterion is determined. The positions of points on the object are estimated using substantially only signals from the subset of combinations. | 11-29-2012 |
20120299821 | FLEXIBLE FINGERPRINT SENSOR - A flexible pressure sensor has a first set of substantially parallel conductors in the x direction, a second set of substantially parallel conductors in the y direction, and a composite material disposed between the first set and second set of conductors. The composite material is capable of returning to substantially its original dimensions on release of pressure. The composite material includes conductive particles at least partially embedded in an elastomeric layer that have no relative orientation and are disposed within the elastomeric layer for electrically connecting the first set and second set of conductors in the z direction under application of sufficient pressure there between. | 11-29-2012 |
20120299822 | Communication and Device Control System Based on Multi-Frequency, Multi-Phase Encoded Visual Evoked Brain Waves - A driving control system actuated by visual evoked brain waves which are induced by a multi-frequency and multi-phase encoder; the driving control system includes an optical flash generating device, a brain wave signal measurement device, a signal processing and analyzing device and a control device. The brain wave signal measurement device is configured for measuring a steady-state visual evoked response (SSVER) signal induced by a user gazing at the flash light source generated by the optical flash generating device. The signal processing and analyzing device is configured for calculating the frequency parameter and the phase parameter of the SSVER signal by a mathematical method, and analyzing whether those parameters are the same as the optical flash generating device's parameters so as to generate a judgment result. The control device generates a control command according to the judgment result for controlling at least one peripheral equipment. | 11-29-2012 |
20120299823 | PROJECTION CONTROLLING APPARATUS AND PROJECTION CONTROLLING METHOD - According to an aspect, a projection controlling apparatus for causing a projector to project an image includes a detector, an extractor, and a projection unit. The detector detects a point directed to a first image projected by the projector. The extractor extracts a first object contained in the first image based on the point. The projection unit causes the projector to project a second image that is an image with a pointing index added to the first object in the first image. | 11-29-2012 |
20120299824 | INFORMATION PROCESSING DEVICE, PORTABLE DEVICE AND INFORMATION PROCESSING SYSTEM - To take security into account and increase user friendliness, an information processing device includes: an input unit to which information is input; an extracting unit extracting predetermined words from the information input to the input unit; a classifying unit classifying the words extracted by the extracting unit into first words and second words; and a converting unit converting the first words by a first conversion method and converting the second words by a second conversion method, the second conversion method being different from the first conversion method. | 11-29-2012 |
20120299825 | MOBILE DEVICE AND DISPLAY CONTROL METHOD - According to an aspect, a mobile device includes a housing, a display unit, a storage unit, and a control unit. The housing is configured to be switched to an opened state or a closed state. The display unit displays a standby screen. The display unit is exposed in any state of the opened state and the closed state of the housing. The storage unit stores a first object and a second object to be displayed on the standby screen. The control unit causes the first object to be displayed in a first state where the standby screen is displayed in the closed state, and the second object to be displayed in a second state where the standby screen is displayed in the opened state. | 11-29-2012 |
20120306734 | Gesture Recognition Techniques - In one or more implementations, a static geometry model is generated, from one or more images of a physical environment captured using a camera, using one or more static objects to model corresponding one or more objects in the physical environment. Interaction of a dynamic object with at least one of the static objects is identified by analyzing at least one image and a gesture is recognized from the identified interaction of the dynamic object with the at least one of the static objects to initiate an operation of the computing device. | 12-06-2012 |
20120306735 | THREE-DIMENSIONAL FOREGROUND SELECTION FOR VISION SYSTEM - A method for controlling a computer system includes acquiring video of a subject, and obtaining from the video a time-resolved sequence of depth maps. An area targeting motion is selected from each depth map in the sequence. Then, a section of the depth map bounded by the area and lying in front of a plane is selected. This section of the depth map is used for fitting a geometric model of the subject. | 12-06-2012 |
20120306736 | SYSTEM AND METHOD TO CONTROL SURVEILLANCE CAMERAS VIA A FOOTPRINT - A system includes a video sensing device, a computer processor coupled to the video sensing device, and a display unit coupled to the computer processor. The system is configured to display on the display unit a footprint of the video sensing device in an environment, receive input from a user that directly alters the footprint of the video sensing device, calculate a change in one or more of a pan, a tilt, and a zoom of the video sensing device as a function of the direct alteration of the footprint, alter one or more of the pan, the tilt, and the zoom of the video sensing device as a function of the calculations, and display a field of view of the video sensing device on the display unit as a function of the altered pan, tilt, and zoom of the video sensing device. | 12-06-2012 |
20120306737 | GESTURE-BASED PRIORITIZATION OF GRAPHICAL OUTPUT ON REMOTE DISPLAYS - The disclosed embodiments provide a system that drives a remote display from an electronic device. The electronic device may be a mobile phone, a tablet computer, a personal digital assistant (PDA), and/or a portable media player. During operation, the system uses the electronic device to obtain user input associated with a transition in graphical output on the electronic device and the remote display. Next, the system identifies a region of interest in the remote display based on the user input and a usage context associated with the graphical output. Finally, the system facilitates viewing of the transition on the remote display by prioritizing transmission of the graphical output from the electronic device to the remote display based on the region of interest. | 12-06-2012 |
20120306738 | IMAGE PROCESSING APPARATUS CAPABLE OF DISPLAYING OPERATION ITEM, METHOD OF CONTROLLING THE SAME, IMAGE PICKUP APPARATUS, AND STORAGE MEDIUM - An image processing apparatus which is capable of preventing the display of an operation item, such as an icon, from hindering the user's viewing of an image which the user desires to view. A distance and position-detecting section and a system controller detect a distance between a screen and an operation element. The system controller displays at least one operation candidate item on the screen when the detected distance becomes not larger than a first threshold distance. Further, when the detected distance becomes not larger than the first predetermined threshold distance, the system controller changes a display form of the operation candidate item depending on the detected distance and displays the operation candidate item as an operable item. | 12-06-2012 |
20120306739 | INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING APPARATUS, STORAGE MEDIUM HAVING INFORMATION PROCESSING PROGRAM STORED THEREON, AND CONTENT PLAYBACK CONTROL METHOD - An example information processing system includes a stationary display device, and a portable display device on which a predetermined input can be made by a user. A content item is played and displayed on the stationary display device, and while the content item is being played, the playback image of the content item and a user interface image used for specifying a content item to be played are selectively displayed on the portable display device. | 12-06-2012 |
20120306740 | INFORMATION INPUT DEVICE USING VIRTUAL ITEM, CONTROL METHOD THEREFOR, AND STORAGE MEDIUM STORING CONTROL PROGRAM THEREFOR - An information input device that enables a user to input information easily by a single hand. The information input device inputs information using a virtual item displayed on a display unit. An image pickup unit shoots an indicator that operates the virtual item continuously to obtain indicator image data. A display control unit displays an indicator image corresponding to the indicator image data on the display unit. A setting unit sets, when detecting an action of the indicator to an element included in the virtual item displayed on the display unit, information corresponding to the element concerned as input information. | 12-06-2012 |
20120306741 | System and Method for Enhancing Locative Response Abilities of Autonomous and Semi-Autonomous Agents - A computer system and method according to the present invention can receive multi-modal inputs such as natural language, gesture, text, sketch and other inputs in order to simplify and improve locative question answering in virtual worlds, among other tasks. The components of an agent as provided in accordance with one embodiment of the present invention can include one or more sensors, actuators, and cognition elements, such as interpreters, executive function elements, working memory, long term memory and reasoners for responses to locative queries, for example. Further, the present invention provides, in part, a locative question answering algorithm, along with the command structure, vocabulary, and the dialog that an agent is designed to support in accordance with various embodiments of the present invention. | 12-06-2012 |
20120306742 | ELECTRONIC DEVICE, IMAGE ACQUISITION EQUIPMENT AND IMAGE ACQUISITION CONTROL METHOD - Embodiments of the present invention provide an electronic device, image acquisition equipment, and an image acquisition control method. The electronic device includes a main board, and image acquisition equipment and a processor which are connected with the main board, wherein the image acquisition equipment has a first mode and a second mode, the image acquisition equipment including: a first imaging unit array for image acquisition in the first mode; and a second imaging unit array for image acquisition in the second mode; the first imaging unit array and the second imaging unit array are different; an image acquisition control module is set in the processor, and is used for controlling the image acquisition equipment to switch to the first mode or the second mode according to a mode switching instruction. The embodiments of the present invention can realize optimization of all applications, and provide optimal images for every application with a lower computational complexity, a lower power consumption caused by computation and a higher processing speed. | 12-06-2012 |
20120306743 | DYNAMIC THEME COLOR PALETTE GENERATION - There is provided a method of changing a theme for a user interface of a computer system comprising receiving an identification of an image with which to define a color palette of a theme for rendering elements of a user interface on a color display of the computer system; analysing the image to determine at least one predominant color; and defining the color palette in response to the analysis. The image may comprise a background image selected by a user for display by the computer system. Dynamic generation of the color palette matches the user interface to colors to provide flexible and appealing themes. A computer readable memory having recorded thereon instructions to carry out this method is also provided, as well as a device comprising such memory. | 12-06-2012 |
20120313847 | METHOD AND APPARATUS FOR CONTEXTUAL GESTURE RECOGNITION - Methods, apparatuses and computer program products are provided for facilitating interaction via motion gestures. A method may include receiving an indication of at least one motion gesture made with a device. The method may further include determining a contextual state of a device. The method may additionally include determining, by a processor, a relationship between the at least one motion gesture and each of a plurality of predefined motion gestures and causing, based at least in part on the determined relationship, the device to perform an action associated with a respective predefined gesture. Corresponding apparatuses and computer program products are also provided. | 12-13-2012 |
20120313848 | Three Dimensional User Interface Session Control - A method, including receiving, by a computer executing a non-tactile three dimensional (3D) user interface, a set of multiple 3D coordinates representing a gesture by a hand positioned within a field of view of a sensing device coupled to the computer, the gesture including a first motion in a first direction along a selected axis in space, followed by a second motion in a second direction, opposite to the first direction, along the selected axis. Upon detecting completion of the gesture, the non-tactile 3D user interface is transitioned from a first state to a second state. | 12-13-2012 |
20120313849 | DISPLAY APPARATUS AND METHOD FOR EXECUTING LINK AND METHOD FOR RECOGNIZING VOICE THEREOF - A display apparatus and a method for executing a link and a method for recognizing a voice thereof are provided. The method for executing a link of the display apparatus includes displaying a user interface, determining a text included in a link included in the user interface, displaying the text determined in the link to be distinguished from other texts, recognizing a voice input from a user, and if the voice uttered by the user matches the text determined in the link, executing the link associated with the matching text. Accordingly, possibility of misrecognition of a voice input by the user is reduced, and the user controls a display apparatus using more exact voice recognition. | 12-13-2012 |
20120313850 | DISPLAY APPARATUS - A display apparatus includes: a light emitting unit which emits light; a light scanning unit which includes a light reflector and scans the light in first and second directions; an amplitude changing unit which changes an amplitude of the swing of the light reflector; and a light emitting control unit which adjusts an amount of light of the light emitting unit, wherein the amplitude changing unit allows switching between a first state where the light is scanned in a first region of the display surface and a second state where the light is scanned on the first region and a second region of the display surface, and wherein the light emitting control unit allows an amount of light per unit area of the first region in the first state and an amount of light per unit area of the first region in the second state to be equalized. | 12-13-2012 |
20120313851 | INFORMATION PROCESSING APPARATUS AND PROGRAM - An information processing apparatus includes an imaging unit, a display, a detection unit, and an image generation unit. The imaging unit is configured to capture an image to acquire a captured image. The display has a display surface that faces in the same direction as an imaging direction of the imaging unit. The detection unit is configured to perform imaging processing on the captured image to detect a face region that is a region of a face of a user in the captured image. The image generation unit is configured to generate a display image displayed on the display based on a result of the detection by the detection unit. | 12-13-2012 |
20120319937 | PRESSURE DETECTING USER INPUT DEVICE - A pressure sensitive device and operating method thereof that detects a location and magnitude of an applied pressure. The device has a base and a deformable surface that enclose a deformable material that has an electrical impedance. Impedance measurements are made at multiple points along the base and the deformable surface. A contour of the deformable surface, caused by the applied pressure, is estimated. The position of the pressure relative to the center of the device is estimated based on the contour and is interpreted as a user input specifying a direction. The amount of applied pressure is also estimated based on the contour and is interpreted as a user input specifying a magnitude. The user input specifying direction and magnitude is provided to processing for any application, such as controlling a user interface. | 12-20-2012 |
20120319938 | HAPTIC THEME FRAMEWORK - A haptic theme system is provided that can create a haptic theme, where a haptic theme is an installable package that includes one or more haptic effects, and a mapping of the one or more haptic effects to one or more user interface (“UI”) events of a device. The haptic theme can be installed on the device, and the device can then dynamically load and play a haptic theme in real-time. The haptic theme system can display one or more haptic themes within a user interface. Upon receiving a selection, the haptic theme system can generate haptic feedback based on the haptic effect that is mapped to a received user interface event within the mapping. | 12-20-2012 |
20120319939 | PARAMETER INPUT APPARATUS AND IMAGE FORMING APPARATUS - A parameter input apparatus for inputting a parameter, including: a control portion configured to display, in a display portion, an operational screen for inputting the parameter; and an operating portion for operating the operational screen, wherein the control portion is configured to display, in the display portion, the operational screen that includes a target area which is a target of an operation by the operating portion and is configured such that a view of the target area and a view of the operating portion are made uniform when the operational screen is displayed. | 12-20-2012 |
20120319940 | Wearable Digital Input Device for Multipoint Free Space Data Collection and Analysis - A new wearable computer input device, referred to as Imagine, may be used to control electronic devices in a natural, intuitive, convenient and comfortable manner, having a form factor which does not impede normal daily or business activities. For example, an Imagine device may serve as an alternative to input devices such as a mouse, keyboard, or game controller. An Imagine device is able to recognize complex gestures, such as a person signing American Sign Language. An Imagine device may include a plurality of motion sensors affixed to a user's fingers and a plurality of motion sensors affixed to a user's wrists, a processing component and a communication component designed to communicate with a second electronic device. | 12-20-2012 |
20120319941 | INTERACTIVE INPUT SYSTEM AND METHOD OF OPERATING THE SAME - A method of operating an interactive input system comprises capturing images of a region of interest at a first frame rate; processing a first pixel subset of images captured at the first frame rate to detect the presence of an object; and if an object is detected, capturing images of the region of interest at a second frame rate. | 12-20-2012 |
20120319942 | DISPLAY APPARATUS FOR SETTING REMOTE CONTROLLER DEVICE AND DISPLAYING METHOD THEREOF - A display apparatus, a displaying method thereof, and a remote controller device are provided. The displaying method, for a display apparatus which is connected to at least one external device and which communicates with a remote controller device, includes: acquiring selection conditions for the at least one external device connected to the display apparatus; and if an external device from among the at least one external device is selected according to the selection conditions, displaying a first user interface (UI) comprising a setting code for setting the selected external device as a control object of the remote controller device, wherein the setting code comprises code values corresponding to interface types of the display apparatus and the selected external device. | 12-20-2012 |
20120319943 | INFORMATION PROCESSING DEVICE, DISPLAY CONTROL METHOD, PROGRAM AND RECORDING MEDIUM - An aim of the present invention is to provide an information processing apparatus, a display control method, a program, and a computer-readable recording medium that improve operability in accordance with the open/close angle of the device. | 12-20-2012 |
20120319944 | CONTROL SYSTEM EQUIPPED WITH PROGRAMMABLE DISPLAY, PROGRAMMABLE DISPLAY, AND DRAWING DATA GENERATION MEANS - A control system including a programmable display and an external device that is connected to the programmable display, wherein the external device stores specific display control information into a first device for each display designation information in correspondence with the display designation information; and wherein the programmable display includes a first memory block that stores the display designation information and all display-specific communication setting information, and a control block that makes an access to the first device of the external device according to the display designation information and the all display-specific communication setting information, to thus acquire the specific display control information corresponding to the display designation information, that stores the acquired specific display control information into a second memory block, and that controls the programmable display according to the specific display control information stored in the second memory block. | 12-20-2012 |
20120319945 | SYSTEM AND METHOD FOR REPORTING DATA IN A COMPUTER VISION SYSTEM - Embodiments of the present invention disclose a system and method for reporting data in a computer vision system. According to one embodiment, the presence of an object is detected within a display area of a display panel via at least one three-dimensional optical sensor. Measurement data associated with the object is received, and a processor extracts at least one set of at least seven three-dimensional target coordinates from the measurement data. Furthermore, a control operation for the computer vision system is determined based on the at least one set of target coordinates. | 12-20-2012 |
20120319946 | METHOD AND SYSTEM FOR THREE DIMENSIONAL INTERACTION OF A SUBJECT - Method, computer program and system for tracking movement of a subject. The method includes receiving data from a distributed network of camera sensors employing one or more emitted light sources associated with one or more of the one or more camera sensors to generate a volumetric three-dimensional representation of the subject, identifying a plurality of clusters within the volumetric three-dimensional representation that correspond to motion features indicative of movement of the motion features of the subject, presenting one or more objects on one or more three dimensional display screens, and using the plurality of fixed position sensors to track motion of the motion features of the subject and track manipulation of the motion features of the volumetric three-dimensional representation to determine interaction of one or more of the motion features of the subject and one or more of the one or more objects on the three dimensional display. | 12-20-2012 |
20120326958 | DISPLAY AND USER INTERFACE - A touchless method for registering commands from a display (e.g. reconfigurable display) may include any of various components. The method may use a light sensor in front of or behind the display to detect light reflected by a user's finger approaching a control option displayed on the display. Light used to display images may be provided at a frequency and/or time that can be identified by a processor connected to the light sensor, or can possess some other unique property (e.g. color) which may be distinguished by the processor. | 12-27-2012 |
20120326959 | REGION OF INTEREST SEGMENTATION - A sensor manager provides dynamic input fusion using thermal imaging to identify and segment a region of interest. Thermal overlay is used to focus heterogeneous sensors on regions of interest according to optimal sensor ranges and to reduce ambiguity of objects of interest. In one implementation, a thermal imaging sensor locates a region of interest that includes an object of interest within predetermined wavelengths. Based on the thermal imaging sensor input, the regions on which each of the plurality of sensors is focused, and the parameters each sensor employs to capture data from a region of interest, are dynamically adjusted. The thermal imaging sensor input may be used during data pre-processing to dynamically eliminate or reduce unnecessary data and to dynamically focus data processing on sensor input corresponding to a region of interest. | 12-27-2012 |
20120326960 | SCANNING TECHNOLOGY - Scanning technology, in which one or more electronic communications that designate a scan area of an object are received from an input apparatus. A preview scan area is displayed based on the received one or more electronic communications that designate the scan area of the object. | 12-27-2012 |
20120326961 | GESTURE BASED USER INTERFACE FOR AUGMENTED REALITY - Technologies are generally described for systems and methods effective to provide a gesture keyboard that can be utilized with a virtual display. In an example, the method includes receiving sensory information associated with an object in proximity to, or in contact with, an input device including receiving at least one level of interaction differentiation detected from at least three levels of interaction differentiation, interpreting a command from the sensory information as a function of the at least one level of interaction differentiation, and outputting an action indication based on the command. | 12-27-2012 |
20120326962 | Data Processing Device - When a first device and a first attribute are specified, a data processing device registers the first device in association with the first attribute and transmits an instruction to perform a process relating to the first attribute to the first device. If the first device possesses a second attribute, the data processing device registers the first device in association with the second attribute. The data processing device displays, on a display unit, a first image representing identification data of the first device when the second attribute is specified and the first device is registered in association with the second attribute. When the first image is selected, the data processing device transmits an instruction to perform a process relating to the second attribute to the first device corresponding to the selected first image. | 12-27-2012 |
20120326963 | FAST FINGERTIP DETECTION FOR INITIALIZING A VISION-BASED HAND TRACKER - Systems and methods for initializing real-time, vision-based hand tracking systems are described. The systems and methods for initializing the vision-based hand tracking systems image a body and receive gesture data that is absolute three-space data of an instantaneous state of the body at a point in time and space, and at least one of determine an orientation of the body using an appendage of the body and track the body using at least one of the orientation and the gesture data. | 12-27-2012 |
20120326964 | INPUT DEVICE AND COMPUTER-READABLE RECORDING MEDIUM CONTAINING PROGRAM EXECUTED BY THE INPUT DEVICE - A character input device, including: a display control section to display, in a first region, an operational-element group composed of operational elements corresponding to characters and to display, in a second region, another operational-element group composed of operational elements corresponding to characters, the characters corresponding to the respective operational-element groups displayed in the first and the second regions being different in type; a first input processing section to perform, upon detection of an operation on the first region, input processing of a character specified by the operation, among the characters to which the operational elements of the operational-element group displayed in the first region correspond; and a second input processing section to perform, upon detection of an operation on the second region, input processing of a character specified by the operation, among the characters to which the operational elements of the operational-element group displayed in the second region correspond. | 12-27-2012 |
20120326965 | METHODS AND APPARATUS FOR PROCESSING COMBINATIONS OF KINEMATICAL INPUTS - Methods and apparatus for processing combinations of force and velocity data generated over a given period of time. In one embodiment, an input device comprising one or more force sensors and one or more motion sensors is manipulated relative to a surface. A receiving system is adapted to receive input sequences or “gestures” which are triggered upon the occurrence of one or more conditions detected by the input device. An application executing in the receiving system may be implemented such that the system responds differently to each specific gesture provided by the user. | 12-27-2012 |
20120326966 | GESTURE-CONTROLLED TECHNIQUE TO EXPAND INTERACTION RADIUS IN COMPUTER VISION APPLICATIONS - The invention describes a method and apparatus to expand the radius of interaction with the real world in computer vision applications within the field of view of a display unit of a device. The radius of interaction is expanded using a gesture in front of an input sensory unit, such as a camera, that signals the device to allow a user the capability of extending and interacting further into the real and augmented world with finer granularity. In one embodiment, the device electronically detects the gesture generated by the user's extremity as obtained by the camera coupled to the device. In response to detecting the gesture, the device changes the shape of a visual cue on the display unit coupled to the device, and updates the visual cue displayed on the display unit. | 12-27-2012 |
20120326967 | VEHICULAR GLANCE LIGHTING APPARATUS AND A METHOD FOR CONTROLLING THE SAME - The present invention provides a vehicular glance lighting apparatus which provides a driver with vehicle-running information as light of a background screen and includes a driver attention function through prompt information delivery, such that the driver visually recognizes, as a background screen, the vehicle-running information needed essentially or minimally for running the vehicle without requiring the driver's attention, thereby avoiding dissipation of the driver's sight and preventing the driver's driving attention from being diverted, and a method for controlling the same. The vehicle-running information related to the running of a vehicle is outputted in the form of light as a background screen, so that the driver can perceive the vehicle-running information without diverting or obstructing the driver's driving attention, thereby enhancing the driver's ability to cope with a vehicle travel risk during the traveling of the vehicle. | 12-27-2012 |
20120326968 | CONTROL APPARATUS AND METHOD, RECORDING MEDIUM AND PROGRAM - The present invention relates to a control apparatus and a method, a recording medium and a program, which enable at least one first device to be controlled more efficiently and quickly through the use of a second device. The control apparatus detects the at least one first device, requests at least one piece of first operation panel information corresponding to the at least one first device from the second device, displays the at least one first operation panel, and controls the at least one first device. | 12-27-2012 |
20120326969 | IMAGE SLIDESHOW BASED ON GAZE OF A USER - Provided is a method to enable an image slideshow based on gaze of a user. The method displays an image to a user for viewing, wherein, invisible to the user, each image is divided into a grid of tiles. While the user is viewing the image, the method detects the gaze of the user to identify regions of the image that are of interest to the user. The identified regions of interest are mapped to the grid of tiles on the image, to recognize tiles of interest to the user. The tiles of interest for the image are calculated and another image is presented to the user for viewing when the number of tiles of interest exceeds a threshold value previously computed for the user. | 12-27-2012 |
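The gaze-driven slideshow entry above maps fixations onto an invisible grid of tiles and advances when the count of tiles of interest exceeds a per-user threshold. A minimal sketch of that mapping, with an assumed 4x4 grid and illustrative function names:

```python
def tiles_of_interest(gaze_points, img_w, img_h, grid=(4, 4)):
    """Map gaze fixation points (x, y) onto a grid of tiles and return
    the set of (row, col) tiles that received at least one fixation."""
    rows, cols = grid
    tiles = set()
    for x, y in gaze_points:
        col = min(int(x * cols / img_w), cols - 1)
        row = min(int(y * rows / img_h), rows - 1)
        tiles.add((row, col))
    return tiles

def should_advance(gaze_points, img_w, img_h, threshold, grid=(4, 4)):
    """Advance the slideshow once the number of distinct tiles of
    interest exceeds the per-user threshold."""
    return len(tiles_of_interest(gaze_points, img_w, img_h, grid)) > threshold
```

With this scheme, a viewer whose gaze has covered enough distinct tiles is assumed to have finished examining the image, and the next image is presented.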
20120326970 | ELECTRONIC DEVICE AND METHOD FOR CONTROLLING DISPLAY OF ELECTRONIC FILES - An electronic device displays electronic files on a display device. When a user views the electronic device, a video camera captures a real-time video consisting of a plurality of frames of the user. The electronic device recognizes a face region in each frame and a lip outline in the face region of the frame, and generates a lip shape variation video of a lip of the user according to the lip outline in each frame and a capturing time of each frame. Furthermore, the electronic device searches for a preset lip-language video that is pre-stored in a storage device and matches the lip shape variation video, and controls display of the electronic files by executing a voice command associated with the matched preset lip-language video. | 12-27-2012 |
20120326971 | PORTABLE INFORMATION TERMINAL AND SCREEN DISPLAY CONTROL METHOD - A portable information terminal includes: a first housing including a display section; a second housing connected to the first housing, an open angle that is an angle formed by the second housing and a plane of the display section being changeable; an angle detection section whose output value changes according to an operation in which the first housing and the second housing are opened or closed; and a control section that causes the display section to execute a first display process if the output value of the angle detection section is equal to or greater than a threshold and to execute a second display process if the output value is less than the threshold. One of the first display process and the second display process includes a process that causes information displayed on the display section to become illegible compared to information displayed on the display section that executes the other display process. | 12-27-2012 |
20120326972 | MULTI-TASK INTERACTIVE WIRELESS TELECOMMUNICATIONS DEVICE - A portable wireless telecommunications device has an electronic computer visual display that includes at least one central display screen and at least two additional screens respectively disposed foldably to the right and left of said central screen, for displaying simultaneous, multiple images to a user in a super video graphics array (SVGA). The foldable display has at least two mutually connected foldable sub-display units, and it includes a user attachable and detachable connector, for user-assembling of the sub-display units into mutual connection with each other. Each of the two sub-display units has user-deployable supports (e.g. a rigid angular support member) for maintaining the sub-display units in an upwardly projecting disposition during use. | 12-27-2012 |
20120326973 | DISPLAY DEVICE - A portable phone ( | 12-27-2012 |
20120326974 | TERMINAL DEVICE, METHOD FOR SETTING SAME, AND COMMUNICATION SYSTEM - A terminal device includes reading means and setting means. The reading means reads setup information added to image data from the image data. The setting means implements a setting on the basis of the setup information read by the reading means. | 12-27-2012 |
20120326975 | INPUT DEVICE AND INPUT METHOD - The present invention discloses an input device and an input method. The input device comprises: an image acquiring device for acquiring a sequence of images as an object moves within the field of view of the image acquiring device; and a processor for generating a rotation signal according to the rotation status of the object. | 12-27-2012 |
20120326976 | Directed Performance In Motion Capture System - Techniques for enhancing the use of a motion capture system are provided. A motion capture system tracks movement and audio inputs from a person in a physical space, and provides the inputs to an application, which displays a virtual space on a display. Bodily movements can be used to define traits of an avatar in the virtual space. The person can be directed to perform the movements by a coaching avatar, or visual or audio cues in the virtual space. The application can respond to the detected movements and voice commands or voice volume of the person to define avatar traits and initiate pre-scripted audio-visual events in the virtual space to provide an entertaining experience. A performance in the virtual space can be captured and played back with automatic modifications, such as alterations to the avatar's voice or appearance, or modifications made by another person. | 12-27-2012 |
20120326977 | Hybrid Control Of Haptic Feedback For Host Computer And Interface Device - A hybrid haptic feedback system in which a host computer and haptic feedback device share processing loads to various degrees in the output of haptic sensations, and features for efficient output of haptic sensations in such a system. A haptic feedback interface device in communication with a host computer includes a device microcontroller outputting force values to the actuator to control output forces. In various embodiments, the microcontroller can determine force values for one type of force effect while receiving force values computed by the host computer for a different type of force effect. For example, the microcontroller can determine closed loop effect values and receive computed open loop effect values from the host; or the microcontroller can determine high frequency open loop effect values and receive low frequency open loop effect values from the host. Various features allow the host to efficiently stream computed force values to the device. | 12-27-2012 |
20130002531 | Portable Electronic Device Having Interchangeable User Interfaces and Method Thereof - A portable electronic device comprising a housing, and a first user interface, a second user interface and one or more sensors supported by the housing, and a method thereof. The first user interface has an active state, and the second user interface has an inactive state while the first user interface is in the active state. The sensor or sensors detect an environmental condition. The second user interface changes from the inactive state to the active state and the first user interface changes from the active state to the inactive state in response to one or more sensors detecting the environmental condition. For another embodiment, the device may detect an energy level of a power source of the portable electronic device. The second user interface of the portable electronic device is then activated and the first user interface of the portable electronic device is deactivated in response to detecting the energy level of the power source. | 01-03-2013 |
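The interchange described above — one interface active at a time, swapped on a sensed environmental condition or on a detected power-source energy level — can be sketched as a small state holder. The class name, method names, and the low-energy threshold are hypothetical, illustrating only the swap logic the abstract describes.

```python
class InterchangeableUI:
    """Two user interfaces; exactly one is active at a time."""

    def __init__(self):
        self.active = "first"  # the first interface starts active

    def on_environment_detected(self):
        # A sensor detected the environmental condition: swap the
        # active and inactive interfaces.
        self._swap()

    def on_energy_level(self, level, low_threshold=0.2):
        # Low energy detected: activate the second interface and
        # deactivate the first (assumed to be the higher-power one).
        if level < low_threshold and self.active == "first":
            self._swap()

    def _swap(self):
        self.active = "second" if self.active == "first" else "first"
```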
20130002532 | METHOD, APPARATUS, AND COMPUTER PROGRAM PRODUCT FOR SHARED SYNCHRONOUS VIEWING OF CONTENT - Provided herein is a technique by which content may be shared with a remote user. An example method may include providing for display of content on a first device, synchronizing content between the first device and a second device, providing for display of an image captured by the second device on the first device, and providing for presentation of audio captured by the second device by the first device. The content may include an image of a page of a book. Synchronizing content between the first device and the second device may include directing advancing of a page on the second device in response to receiving an input directing the advancing of a page on the first device. Providing for display of an image captured by the second device on the first device may include providing for display of a video captured by the second device on the first device. | 01-03-2013 |
20130002533 | USER EXPERIENCE - Example embodiments relate to systems, methods, apparatuses, and computer-readable media for a user interface that may, for example, receive and/or process physical activity data and allow interaction with the received information in novel implementations. | 01-03-2013 |
20130002534 | Systems and Methods for Controlling a Cursor on a Display Using a Trackpad Input Device - Systems and methods for controlling a cursor on a display using a trackpad input device are disclosed. The systems and methods may be directed to controlling the cursor on a display separate from the trackpad input device, based on information identified about a motion of a trackpad input device or a computing device. A conversion factor may be determined to relate input to the trackpad input device with control of the cursor on the display in response to the input. The conversion factor can be adjusted when the motion information indicates that the trackpad input device or computing device is in motion. An input signal from an input to the trackpad input device may be smoothed by filtering out a mechanical vibration signal within the input signal. The input signal may also be smoothed by subtracting the absolute motion of the trackpad input device from the input signal. | 01-03-2013 |
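The two ideas in the abstract above — scaling trackpad input by a conversion factor that is adjusted while the device is in motion, and smoothing the input by subtracting the trackpad's own absolute motion — can be sketched for one axis. The function name, parameters, and default factors are hypothetical.

```python
def cursor_delta(touch_dx, device_dx, device_in_motion,
                 base_factor=2.0, motion_factor=1.0):
    """Map a raw trackpad input delta to a cursor delta.

    The absolute motion of the trackpad itself is subtracted from the
    raw input (so device shake is not read as finger travel), and a
    smaller conversion factor is applied while the device is moving.
    """
    intended = touch_dx - device_dx            # remove device self-motion
    factor = motion_factor if device_in_motion else base_factor
    return intended * factor
```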
20130002535 | OPTICAL POSITION DETECTION DEVICE AND DISPLAY SYSTEM WITH INPUT FUNCTION - In an optical position detection device, when emitting the detection light from the light source section, the light receiving section receives the reflected light from the target object with the first light receiving element and the second light receiving element. The first light receiving element and the second light receiving element are arranged to have the intersection angle greater than 90° and smaller than 180°, the intersection angle being formed between the normal direction with respect to the light receiving surface of the first light receiving element and the normal direction with respect to the light receiving surface of the second light receiving element. | 01-03-2013 |
20130002536 | INDICATION MEMBER, OPTICAL POSITION DETECTION DEVICE, AND DISPLAY SYSTEM WITH INPUT FUNCTION - A detectable indication member for an optical position detection device includes a round bar shaped shaft and a spherical body provided at the distal end of the shaft. The outer peripheral surface of the spherical body and the outer peripheral surface of an end portion of the shaft portion connected to the spherical body form a retroreflective portion. A portion of the shaft adjacent the base end of the end portion absorbs infrared light. | 01-03-2013 |
20130002537 | IMAGING APPARATUS, IMAGING APPARATUS CONTROL METHOD, AND COMPUTER PROGRAM - An imaging apparatus includes: a control unit configured to move a focusing lens, and detect a focus position; wherein the control unit executes auto-focus (AF) scan processing in which only a part of a range of movement of the focusing lens is set as a scan range, as first scan processing, and executes auto-focus (AF) scan processing in which a region including a region differing from the scan region of the first scan processing is set as a scan range, as second scan processing, in the event that a focus point is not detected in the first scan processing. | 01-03-2013 |
20130002538 | GESTURE-BASED USER INTERFACE FOR A WEARABLE PORTABLE DEVICE - A gesture-based user interface comprises a wearable portable device storing a gesture profile for each of a plurality of different applications on the wearable portable device to define different gestures for the different applications, wherein each of the gesture profiles includes at least one gesture and a predetermined function associated with the at least one gesture; and a profile web service, wherein the portable device is in communication with the profile web service and is configured to download from the profile web service to the wearable portable device a customizable gesture profile for a particular one of the applications, wherein the customized gesture profile modifies at least one of the different gestures, the customized gesture profile comprising personal preferences of the user regarding the modified gesture, including physical attributes of the user. | 01-03-2013 |
20130002539 | SYSTEM AND METHOD FOR INTERACTING WITH A DISPLAY - A system and method of interacting with a display. The method comprises recognizing a disturbance in a display zone of a projected image and displaying a selected state in response to the recognized disturbance. The method further includes recognizing a gesture which interrupts a light source and is associated with an action to be taken on or associated with the displayed selected state. An action is executed in response to the recognized gesture. The system includes a server having a database containing data associated with at least one or more predefined gestures, and at least one of a hardware and software component for executing an action based on the at least one or more predefined gestures. | 01-03-2013 |
20130002540 | OPERATION INFORMATION GENERATION DEVICE - According to an embodiment, an operation information generation device includes a receiver configured to receive content from a server connected to a first network; a processor configured to decode the content; a display control unit configured to display the decoded content on a displaying unit; a reception unit configured to receive an operation from a user; and an operation information generation unit configured to execute an operation information generation process of generating operation information indicative of substance of the operation received for the content displayed on the displaying unit. | 01-03-2013 |
20130002541 | IMAGE PROCESSING DEVICE, IMAGE PROCESSING METHOD AND PROGRAM - Embodiments of the technology involve apparatus and methods for control of displaying of images. In an example, an apparatus may include an image display, a sensor to detect posture of the image display and a processor to control sequentially displaying images of a group of images on the image display based on changes in the detected posture. The processor may control a display of a posture indicator on the image display such that the indicator may represent a relation between a change in the detected posture and an image of the group of images. Optionally, the indicator may be represented by a tilt meter. Moreover, in some embodiments, the sensor may be implemented with a gyroscopic sensor. | 01-03-2013 |
20130002542 | COORDINATE INPUT DEVICE AND PROGRAM - In a coordinate designation device capable of detecting simultaneous operational input at a plurality of coordinates, occurrences of erroneous operations during textual input, etc., are suppressed. An elapsed time from a previous up operation to a new down operation is calculated, and, if the calculated elapsed time is equal to or less than a threshold, an event for the input information is issued at the detected coordinates of the down operation. In addition, a traveled distance from the coordinates at the time of starting input to the current coordinates is calculated, and, if the calculated traveled distance is equal to or greater than a threshold, an event for the input start point is issued at the coordinates at the time of starting input. | 01-03-2013 |
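The two threshold rules above can be sketched as one decision function. Combining them in a single routine, and the specific threshold values, are my assumptions; the abstract states the rules independently.

```python
import math

def event_coordinates(down_xy, start_xy, prev_up_time, down_time,
                      time_threshold=0.25, dist_threshold=30.0):
    """Decide where the event for a new down operation is issued.

    If the elapsed time since the previous up operation is within the
    time threshold, the event is issued at the detected down
    coordinates. Otherwise, if the travel from the input start point
    meets the distance threshold, the event is issued at the input
    start coordinates instead.
    """
    elapsed = down_time - prev_up_time
    if elapsed <= time_threshold:
        return down_xy
    if math.dist(start_xy, down_xy) >= dist_threshold:
        return start_xy
    return down_xy
```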
20130002543 | INTERACTIVE PRESENTATION SYSTEM OF ELECTRONIC READING DEVICE WITH ON-PAPER WINDOW SYSTEM - The invention relates to an electronic reading device that can be used in combination with print media, and more particularly to an interactive presentation system for an electronic reading device having an on-paper window system. The interactive presentation system is composed of a plurality of electronic reading devices. One of the electronic reading devices is set as a leading device, and the others are set as auxiliary devices. The interactive presentation system can read various data from media having information messages and an electronic coordinate circuit, respectively. The data can be transmitted to a circuit substrate for a central processor, which is provided with a storage, and then processed into combined data of electronic files. Thereby, associated information can be readily searched for and read using a connected electronic reader, or a projector for projecting it onto a projecting board. | 01-03-2013 |
20130002544 | INPUT DEVICE, METHOD AND MEDIUM - An input device comprises: a detection unit that detects as detection data an oscillation generated by tapping a body of a user and transmitted via the body; and an input information identification unit that refers to the detection data, identifies a tap site based on a fact that the detection data varies depending on a physical property of a body tissue associated with the tap site, and outputs an operation command allocated to the identified tap site. | 01-03-2013 |
20130009858 | SYSTEMS AND METHODS FOR LOCKING AN ELECTRONIC DEVICE - Systems and methods for locking an input device of an electronic device are described herein. An example method includes detecting a moving action of a housing of the electronic device from an open position to a closed position. The method includes detecting at least a first condition of the electronic device after detection of the moving action from the open position to the closed position and locking the input device upon detection of the first condition within a first time interval based on the electronic device being moved to the closed position. | 01-10-2013 |
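The lock sequence described above — detect the housing closing, then lock only if a further condition occurs within a time interval counted from the close — can be sketched as a small state machine. The class and method names and the default window length are hypothetical.

```python
class LidLock:
    """Lock an input device when a condition is detected within a
    time window that starts when the housing is moved from the open
    position to the closed position."""

    def __init__(self, window=5.0):
        self.window = window      # first time interval, in seconds
        self.closed_at = None     # timestamp of the close event
        self.locked = False

    def on_close(self, t):
        # The moving action from open to closed was detected.
        self.closed_at = t

    def on_condition(self, t):
        # A first condition was detected; lock only if it falls
        # within the window opened by the close event.
        if self.closed_at is not None and t - self.closed_at <= self.window:
            self.locked = True
```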
20130009859 | STEREOSCOPIC IMAGE DISPLAY DEVICE AND DRIVING METHOD THEREOF - A stereoscopic image display device, and a driving method thereof, which correct the viewing position of a viewer in initial driving of a 3D display mode, are discussed. The stereoscopic image display device includes a display module, a barrier module, and a position detector. The display module separates a left-eye image and a right-eye image to display a stereoscopic image. The barrier module is disposed in correspondence with the display module, and forms a light transmitting area for transmitting the left-eye and right-eye images and a light blocking area for blocking them. The position detector detects position information on a viewer viewing the stereoscopic image displayed on the display module, and corrects the positions of the light transmitting area and light blocking area on the basis of the viewing position information on the viewer. | 01-10-2013 |
20130009860 | INFORMATION DISPLAY APPARATUS - An information display apparatus includes: an outputting device capable of (i) displaying a first image in which data values corresponding to at least a first portion out of a plurality of data values included in one dataset of a plurality of datasets are indicated by characters and in which at least one portion of the data values can be changed and (ii) displaying a second image in which data values corresponding to at least a second portion out of the plurality of data values included in the one dataset are graphically indicated and in which at least one portion of the data values can be changed; and a controlling device for changing at least one portion of the data values included in the one dataset in accordance with a received change instruction if the instruction for one of the first image and second image is received by an inputting device. | 01-10-2013 |
20130009861 | METHODS AND SYSTEMS FOR CONTROLLING DEVICES USING GESTURES AND RELATED 3D SENSOR - Provided are computer-implemented methods and systems for controlling devices using gestures, and a 3D sensor that enables implementing the above. In one embodiment, the method proposed herein may be based on defining at least one sensor area within the space surrounding a user of a controlled device; associating this sensor area with at least one user gesture; associating the combination of the user gesture and sensor area with an actionable command; identifying the direction of the line of sight of the user and a focal point of the line of sight of the user; and, if the line of sight of the user is directed at a sensor area, issuing the actionable command corresponding to the combination of the sensor area and the gesture that the user makes while looking at this sensor area. | 01-10-2013 |
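The gaze-gated dispatch above can be sketched as a lookup keyed by the (sensor area, gesture) pair, issued only when the line of sight rests on that area. The function name and the binding table are hypothetical illustrations.

```python
def command_for(gazed_area, gesture, bindings):
    """Return the actionable command bound to (sensor area, gesture),
    but only when the user's line of sight is directed at that area.

    `gazed_area` is the sensor area the line of sight currently hits,
    or None when the user is looking elsewhere.
    """
    if gazed_area is None:
        return None  # no sensor area in the line of sight: no command
    return bindings.get((gazed_area, gesture))
```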
20130009862 | DISPLAY APPARATUS - A display apparatus including an image generator, a projection lens set, a depth detecting module detecting the position of the user, and a control unit is provided, wherein the control unit is electrically connected to the image generator, the projection lens set, and the depth detecting module. An image displayed by the image generator is projected through the projection lens set and generates a floating real image between the projection lens set and the user. Each beam forming the floating real image has a light-cone angle θ. The image generator and the projection lens set adjust the position of the floating real image according to the position of the user. The size of the floating real image is L, the distance between the two eyes of the user is W, the distance between the user and the floating real image is D, and the light-cone angle θ satisfies the formula of | 01-10-2013 |
20130009863 | DISPLAY CONTROL APPARATUS, DISPLAY CONTROL METHOD, AND PROGRAM - A display control apparatus controls display of a transparent display which includes a screen configured to transmit light arriving from an object located on a side of a second surface so that the object is viewable from a viewpoint located on a side of a first surface which is an opposite surface to the second surface. The display control apparatus includes: an acquisition unit that acquires position information indicating relative positional relations between the transparent display and the viewpoint and between the transparent display and the object; and a display control unit that controls the display of the transparent display based on the position information. | 01-10-2013 |
20130009864 | METHOD AND APPARATUS FOR INTERFACING BETWEEN EXTERNAL DEVICE AND MOBILE DEVICE - An apparatus for interfacing between a mobile device and an external device is provided. The apparatus includes a mobile device which includes image data for the external device, and which transmits to the external device an image signal corresponding to the image data according to a selection, and an external device which displays the image data corresponding to the image signal received from the mobile device, which generates an input signal according to a user's input, and which transmits the generated input signal to the mobile device, wherein the image signal of the mobile device and the input signal of the external device are transmitted and received through a single interface used for a connection between the mobile device and the external device. | 01-10-2013 |
20130009865 | USER-CENTRIC THREE-DIMENSIONAL INTERACTIVE CONTROL ENVIRONMENT - A computer-implemented method and system for controlling various electronic devices by recognition of gestures made by a user within a particular space defined in front of the user are provided. An example method may comprise generating a depth map of a physical scene, determining that a head of the user is directed towards a predetermined direction, establishing a virtual sensing zone defined between the user and a predetermined location, identifying a particular gesture made by the user within the virtual sensing zone, and selectively providing to the electronic device a control command associated with the particular gesture. The particular gesture may be performed by one or more characteristic forms provided by the user within the virtual sensing zone being in an active state. The characteristic forms are forms reliably distinguishable from casual forms by means of computer vision and having certain attributes, which can reliably reflect user intent. | 01-10-2013 |
20130009866 | INPUT PROCESSING APPARATUS - In a keyboard input device, a first input device and a second input device, each including a stick pointer, are arranged. Input control of gesture functions such as zoom-in, zoom-out, right rotation, left rotation, forward tracking, backward tracking, left tracking, and right tracking may be made possible by a combination of operational directions of the operation bodies of the stick pointer (SP | 01-10-2013 |
20130009867 | METHOD AND APPARATUS FOR DISPLAYING VIEW MODE USING FACE RECOGNITION - A method for displaying screen data according to determination of a view mode in a portable terminal, and an apparatus thereof, are provided. The method includes detecting an orientation change event of the portable terminal in a displayed state of the screen data, turning on a camera module when the orientation change event is detected, determining an orientation of the eyes of a user through face detection from an image captured by the camera module, determining a view mode of the portable terminal according to an orientation of the portable terminal and the orientation of the eyes of the user, and displaying screen data according to the determined view mode. | 01-10-2013 |
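The view-mode decision above combines the terminal's own orientation with the eye axis found by face detection, so the screen stays upright relative to the user's face (e.g. when the user is lying down). A minimal sketch of that combination, with hypothetical angle conventions and a 45° decision band of my own choosing:

```python
def view_mode(device_rotation_deg, eye_axis_deg):
    """Choose portrait or landscape from the terminal's rotation and
    the orientation of the user's eyes (both in degrees).

    Only the relative angle between the two matters: when the screen's
    long axis is roughly perpendicular to the line through the eyes,
    the content reads as portrait with respect to the face.
    """
    relative = (device_rotation_deg - eye_axis_deg) % 180
    return "portrait" if relative < 45 or relative >= 135 else "landscape"
```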
20130009868 | DISPLAY DEVICE AND DISPLAY METHOD - The present invention provides a display apparatus and a display method for controlling display operations in a way that precisely reflects the user's status, i.e., the user's intentions, visual state, and physical condition. Worn as an eyeglass-like or head-mounted wearable unit, for example, the display apparatus of the present invention enables the user to visibly recognize various images on the display unit positioned in front of the user's eyes, thereby providing picked-up images, reproduced images, and received images. As control for various display operations, such as switching between the display state and the see-through state, selecting the display operation mode, and selecting sources, the display apparatus acquires information about either the behavior or the physical status of the user, determines either the intention or the status of the user in accordance with the acquired information, and thereby controls the display operation appropriately on the basis of the determination result. | 01-10-2013 |
20130009869 | System and Method for Image Processing using Multi-touch Gestures - Various embodiments of a system and methods for processing digital images using multi-touch gestures are described. A multi-touch gestural input set which comprises a plurality of touch gestures may be applied to a display of an image. The gestural input set may include different gesture types, such as mobile and stationary gestures. Each gesture type may indicate a different image processing constraint that may be applied to modify the digital image. Stationary gestures may indicate constrained regions of the image that are not subject to modification. Mobile gestures may indicate regions of the image which may be subject to modification. Characteristics of the mobile gestures, such as velocity and/or pressure, may also indicate an amount by which an image may be modified over the region indicated by the mobile gesture. Image masks, which separate foreground and background regions of an image, may also be specified by the gestural input set. | 01-10-2013 |
20130009870 | INPUT DEVICE, INPUT METHOD, AND PROGRAM - An input device comprises: a detection unit that detects a body movement generated by tapping a user body as detection data; and an input information determination unit that refers to the detection data, determines a tap position based on a fact that the detection data varies depending on the tap position, and outputs an operation command associated with the determined tap position. | 01-10-2013 |
20130016037 | REDUCING OR ELIMINATING THE BLACK MASK IN AN OPTICAL STACK (inventor: Hung-Jen Wang, Longtan Township, TW) - A reflective subpixel array may be formed in which an absorption layer is formed on a back substrate, which may obviate the need for a black mask on a front substrate upon which the reflective subpixel array is formed. In some implementations, the black mask layer may be formed only in post areas on the front substrate. The absorption layer may absorb light that enters between subpixel rows and/or columns. The absorption layer may include at least one highly conductive layer that can form part of the signal routing for the display. Conductive spacers may be formed to connect the conductive absorption layer to a conductive layer of the subpixel array. | 01-17-2013 |
20130016038 | MOTION DETECTION METHOD AND DISPLAY DEVICE (inventors: Shu-Han Yu, Taoyuan County, TW; Chia-Ho Lin, Hsinchu County, TW) - A motion detection method for a display device includes the steps of capturing images from a position on the display device to generate a plurality of capture images, calculating a moving status of the display device according to the plurality of capture images, and adjusting a display range of the display device relative to an image data according to the moving status. | 01-17-2013 |
20130021232 | APPARATUS, SYSTEM, AND METHOD FOR PROVIDING FEEDBACK SENSATIONS OF TEMPERATURE, TEXTURE, AND HARDNESS-SOFTNESS TO A CONTROLLER - Described herein are a hand-held controller, system, and method for providing real-time programmable sensations of texture, hardness-softness, and temperature (thermal) to the hand-held controller. The hand-held controller comprises a first region configured to be touched by a user and to provide a real-time programmable hardness-softness sensation to the user in response to a first trigger signal generated by an interactive program; a second region to be touched by the user and to provide real-time programmable hardness-softness sensations to the user in response to a second trigger signal generated by the interactive program; and a third region to be touched by the user and to provide real-time programmable thermal sensations to the user in response to a third trigger signal generated by the interactive program. | 01-24-2013 |
20130021233 | APPARATUS, SYSTEM, AND METHOD FOR PROVIDING FEEDBACK SENSATIONS OF TEMPERATURE AND HARDNESS-SOFTNESS TO A CONTROLLER - Described herein are a hand-held controller, system, and method for providing real-time sensations of temperature and/or hardness-softness to the hand-held controller. The hand-held controller comprises a first region configured to be touched by a user and to provide a real-time computer-programmable hardness-softness sensation to the user in response to a first trigger signal generated by an interactive program; and a first mechanism, coupled to the first region, to cause the first region to harden relative to a first state, and to cause the first region to soften relative to a second state. | 01-24-2013 |
20130021234 | APPARATUS, SYSTEM, AND METHOD FOR PROVIDING FEEDBACK SENSATIONS OF TEMPERATURE AND TEXTURE TO A CONTROLLER - Described herein are a hand-held controller, system, and method for providing real-time sensations of temperature and texture to a user of the hand-held controller. The hand-held controller comprises a first region to be touched by a user and to provide a real-time computer-programmable texture sensation to the user in response to a first trigger signal generated by an interactive program; and a first mechanism, coupled to the first region, to cause the first region to roughen relative to a first state, and to cause the first region to smooth relative to a second state. | 01-24-2013 |
20130021235 | APPARATUS, SYSTEM, AND METHOD FOR PROVIDING FEEDBACK SENSATIONS OF TEXTURE AND HARDNESS-SOFTNESS TO A CONTROLLER - Described herein are a hand-held controller, system, and method for providing real-time sensations of texture and hardness-softness to a user of the hand-held controller. The hand-held controller comprises a first region to be touched by a user and to provide real-time computer-programmable texture sensations to the user in response to a first trigger signal generated by an interactive program; and a second region to be touched by the user and to provide real-time computer-programmable hardness-softness sensations to the user in response to a second trigger signal generated by the interactive program. | 01-24-2013 |
20130021236 | Orientation Based Application Launch System - An electronic device may include multiple faces, an application launch input element, a memory that stores multiple applications, and a processor that accesses the memory. In response to a detected trigger of the application launch input element, the processor determines the orientation of the device. For example, the processor may determine which face of the electronic device is pointed in a predetermined direction. Based on the determined orientation of the device, the processor selects and activates a specific application from the multiple available applications. | 01-24-2013 |
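The face-up determination and app selection described above can be sketched from a 3-axis accelerometer reading, where gravity dominates one axis. The function names, the face labels (signed axis strings), and the app map are hypothetical.

```python
def facing_up(ax, ay, az):
    """Determine which face of the device points up, labeled by the
    signed accelerometer axis with the largest magnitude (gravity)."""
    axis, value = max(zip("xyz", (ax, ay, az)), key=lambda p: abs(p[1]))
    return ("+" if value > 0 else "-") + axis

def launch_app(ax, ay, az, app_map):
    """On the application-launch trigger, select the application
    mapped to whichever face currently points in the predetermined
    (here: upward) direction."""
    return app_map.get(facing_up(ax, ay, az))
```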
20130021237 | OPTICAL REMOTE CONTROL SYSTEM - An optical remote control system includes a home appliance and a remote controller. The home appliance operates according to a user command and its housing includes an opening. A status indicator light and a reference light are disposed within the opening. The status indicator light includes a visible light source, and the reference light includes a plurality of infrared light sources. The visible light source and the infrared light sources are disposed in a predetermined pattern. The remote controller includes an optical sensor configured to detect optical signals from the infrared light sources, thereby generating the user command accordingly. | 01-24-2013 |
20130021238 | SYSTEMS AND METHODS FOR ELECTRONIC DISCOVERY - A system for electronic discovery may comprise an electronic discovery platform; a two-handed controller configured to produce a plurality of signals; and a memory configured to store at least one set of controller signal relationships, the controller signal relationships being associated with the two-handed controller. A system for electronic discovery may further comprise an interface application communicatively coupled to the memory, wherein the interface application uses the at least one set of controller signal relationships to associate at least one of the plurality of signals from the two-handed controller with at least one of a plurality of discovery commands associated with the discovery of electronic data, and wherein the interface application communicates a determined discovery command such that the discovery command is executed by the electronic discovery platform. | 01-24-2013 |
20130021239 | DISPLAY DEVICE - A display device includes a display panel including a plurality of pixels, a shutter panel including a driver circuit, a liquid crystal, and light-transmitting electrodes provided in a striped manner, and a positional data detector configured to detect a positional data of a viewer. The shutter panel is provided over a display surface side of the display panel, a width of one of the light-transmitting electrodes in the shutter panel is smaller than that of one of the plurality of pixels, and the driver circuit in the shutter panel is configured to selectively output signals for forming a parallax barrier to the light-transmitting electrodes. The parallax barrier is capable of changing its shape in accordance with the detected positional data. | 01-24-2013 |
20130021240 | METHOD AND DEVICE FOR CONTROLLING AN APPARATUS AS A FUNCTION OF DETECTING PERSONS IN THE VICINITY OF THE APPARATUS - A method for controlling an electronic apparatus, includes steps of: acquiring an image of the environment of the apparatus, detecting the presence of human faces in the image acquired, estimating a respective position of each face detected in relation to the apparatus, and sending a signal to the apparatus to enable a function of the apparatus if a condition is met relating to a number of faces detected in the image and/or the estimated position of each detected face. | 01-24-2013 |
20130021241 | Portable Information Display Terminal - Disclosed is a portable information display terminal which can start an application program corresponding to its use state, even when it is assumed to be used in a placed state. | 01-24-2013 |
20130027289 | ELECTRONIC DEVICE - A mobile terminal is provided comprising a communication unit configured to form a network with first and second electronic devices, and a controller configured to control the first electronic device so that the first electronic device plays first content while simultaneously controlling playback of second content when receiving a connection request relating to playback of the second content from the second electronic device while controlling the first electronic device so that the first electronic device plays the first content. | 01-31-2013 |
20130027290 | Input Mode of a Device - A device to detect a directional hand gesture, identify an input mode of the device associated with the directional hand gesture to launch the input mode, modify a user interface rendered on a display component of the device based on the input mode, and modify a setting of a sensor of the device based on whether the input mode includes a virtual keyboard. | 01-31-2013 |
20130027291 | TOUCH PANEL AND OPERATION METHOD THEREOF - The present invention relates to a touch panel. The touch panel includes a substrate, a plurality of first traces, a plurality of second traces, and a plurality of sensing pads. The first traces are disposed on the substrate and are parallel to each other along a first direction. The second traces are disposed on the substrate and are parallel to each other along a second direction. The first traces and the second traces interlace with each other to form a plurality of sensing pad areas. Each sensing pad is disposed in each sensing pad area, and is electrically connected to each first trace and each second trace. The present invention further provides a method of operating the same. In the present invention, the thickness of the touch panel is decreased, and the material for the electrodes is economized. Consequently, the costs of the touch panel can be reduced. | 01-31-2013 |
20130027292 | CONTENT DISPLAY DEVICE - An anchor operation can be facilitated by ten keys using location information on a screen of items to which predetermined tags are attached. Display location information is extracted from the screen of a plurality of items to which predetermined tags described in the content data are attached, the screen on which the content is displayed is partitioned based on the display location information so that the plurality of items exist in one region, and available buttons of the operation device are allocated to the plurality of items of the region and simultaneously an item corresponding to an operated button is selected from the plurality of items. | 01-31-2013 |
20130027293 | STABILISATION METHOD AND COMPUTER SYSTEM - The present invention relates to a method for stabilising a series of measurements of a physical variable captured by a digital sensor. This method comprises the steps of: capturing at least a first measurement, a second measurement, and a third measurement of said physical variable and storing each measurement in a digital memory. The first and second measurements are compared and, if a difference between the first measurement and the second measurement is below a predetermined threshold, the second measurement is replaced in the memory by a corrected second measurement where the difference with respect to said first measurement has been reduced using a first filtering strength. The corrected second measurement and the third measurement are compared and, if a difference between the filtered value of the corrected second measurement and said third measurement is also below the threshold, the third measurement is replaced by a corrected third measurement where a difference with respect to said corrected second measurement has been reduced using a second filtering strength that is lower than the first filtering strength. This method has the advantage of filtering noise whilst still allowing slow but relevant variations in the series of measurements. | 01-31-2013 |
20130027294 | INPUT APPARATUS, INPUT METHOD, AND CONTROL SYSTEM - There is provided an apparatus including an input apparatus including an input apparatus main body with which input manipulation is performed to manipulate a manipulation target object, a first manipulation detection unit that detects a first manipulation on the input apparatus main body, a second manipulation detection unit that detects a second manipulation on the input apparatus main body after the first manipulation is detected, and a first processing unit that performs first processing for manipulation on the manipulation target object or a first response of the input apparatus, based on a movement detection value corresponding to movement of the input apparatus main body according to the first manipulation or a detection value of the first manipulation. | 01-31-2013 |
20130027295 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM - There is provided an information processing apparatus including an acquisition unit acquiring data on at least one of an acceleration or an angular velocity of a controller operated by a user, and a determination unit determining at least one of a velocity of the controller or a trajectory of the controller based on the acquired data on the at least one of the acceleration or the angular velocity. | 01-31-2013 |
20130027296 | COMPOUND GESTURE-SPEECH COMMANDS - A multimedia entertainment system combines both gestures and voice commands to provide an enhanced control scheme. A user's body position or motion may be recognized as a gesture, and may be used to provide context to recognize user generated sounds, such as speech input. Likewise, speech input may be recognized as a voice command, and may be used to provide context to recognize a body position or motion as a gesture. Weights may be assigned to the inputs to facilitate processing. When a gesture is recognized, a limited set of voice commands associated with the recognized gesture are loaded for use. Further, additional sets of voice commands may be structured in a hierarchical manner such that speaking a voice command from one set of voice commands leads to the system loading a next set of voice commands. | 01-31-2013 |
20130027297 | 3D REMOTE CONTROL SYSTEM EMPLOYING ABSOLUTE AND RELATIVE POSITION DETECTION - The present invention can include three-dimensional remote control systems that can detect an absolute location to which a remote control is pointing in first and second orthogonal axes and an absolute position of the remote control in a third orthogonal axis. Remote control systems of the present invention can employ absolute position detection with relative position detection. Absolute position detection can indicate an initial absolute position of the remote control and relative position detection can indicate changes in the position of the remote control. By combining absolute and relative position detection, remote control systems of the present invention can track remote controls more precisely than systems that only employ absolute position detection. The present invention also can include methods and apparatus for zooming in and out of an image shown on a display based on the absolute position of the remote control in the third axis. | 01-31-2013 |
20130033418 | GESTURE DETECTION USING PROXIMITY OR LIGHT SENSORS - Example methods, apparatuses, or articles of manufacture are disclosed that may be utilized, in whole or in part, to facilitate or support one or more operations or techniques for gesture detection using, at least in part, output or measurement signals from one or more ambient environment sensors, such as, for example, a proximity sensor or ambient light sensor. | 02-07-2013 |
20130033419 | METHOD AND SYSTEMS FOR THREE-DIMENSIONAL IMAGE SEGMENTATION - A method for error correction for segmented three dimensional image data. The method includes receiving segmented three dimensional image data, the segmented three dimensional image data being divided into a plurality of slices; correcting at least one contour of the segmented three dimensional image data on at least one slice according to a command from a user to form a corrected contour; and automatically interpolating a correction represented by the corrected contour to a plurality of slices of the segmented three dimensional image data. | 02-07-2013 |
20130033420 | SLEEVE AND CONTROL DEVICE WITH SUCH SLEEVE - A sleeve for a control device is provided, the control device controlling cursor motion of an electronic device. A touch-feel enhancing mechanism is formed on an outer surface of the sleeve. The touch-feel enhancing mechanism is unrelated to the control device's function of detecting the rotating or moving actions of the user; it serves only to enhance the comfort and touch feel of operating the control device. | 02-07-2013 |
20130033421 | CHANGEABLE INTERACTIVE SURFACE WITH RESIZING, ZOOMING, MOVING, ROTATING AND OTHER MANIPULATION OPTIONS/CONTROLS - Systems and methods to allow a user of an electronic device to change the size, rotation, zoom and/or position of a drawing surface used with a graphics-based user interface, using characteristic changing controls positioned within the drawing surface, adjacent to the drawing surface or both within and adjacent to the drawing surface. | 02-07-2013 |
20130033422 | ELECTRONIC APPARATUS USING MOTION RECOGNITION AND METHOD FOR CONTROLLING ELECTRONIC APPARATUS THEREOF - An electronic apparatus and a controlling method thereof are disclosed. The method for controlling the electronic apparatus includes photographing an object using motion recognition, and changing and displaying a screen based on a movement direction of the object when it is determined that the photographed object has moved while maintaining a first shape. By this method, the user is able to perform zoom-in and zoom-out operations more easily and intuitively using motion recognition. | 02-07-2013 |
20130038519 | GRAPHICAL INTERACTIVE VISUAL RESPONSE SYSTEM AND METHOD - A graphical interactive visual response system and method is provided in which a graphical user interface provides such interactivity and visual response. A user can initiate contact with a representative using an application residing on a personal device, such as a mobile telephone or computer. The application (graphical interface) allows the user to interactively select options on a displayed menu, arrive at the appropriate service and initiate a connection with the representative. The connection is established when the user and representative are available, thereby avoiding hold and wait times typically associated with conventional interactive voice response system. | 02-14-2013 |
20130038520 | AUTOMATIC SHUTDOWN OF 3D BASED ON GLASSES ORIENTATION - Devices, systems, and methods are presented for shutting down the 3D effect of active shutter 3D glasses by synchronizing the transparency of the lenses with respect to each other when the 3D glasses have been rotated beyond a threshold angle. The threshold angle can be pre-set through a user-selectable switch. Transitioning from an alternating shutter mode to a synchronized shutter mode can include a fade in which the duty cycles of the lenses are adjusted. Direct measurement techniques for measuring the differential roll angle between the lenses and left and right eye images on a display are disclosed. | 02-14-2013 |
20130038521 | SYSTEMS AND METHODS OF CAMERA-BASED FINGERTIP TRACKING - Systems and methods for camera-based fingertip tracking are disclosed. One such method includes identifying at least one location of a fingertip in at least one of the video frames, and mapping the location to a user input based on the location of the fingertip relative to a virtual user input device. | 02-14-2013 |
20130038522 | DISPLAY APPARATUS, DISPLAY METHOD, AND STORAGE MEDIUM - A display apparatus includes an image pickup section, a display section, an instruction section, a position specification section, a direction specification section, a specification section, and a reporting section. The image pickup section sequentially picks up an image. The display section displays the picked up image. The instruction section generates an instruction signal for marking an object included in the image. The position specification section specifies a position where the display apparatus exists. The direction specification section specifies a pickup direction by the image pickup section. The specification section specifies a position of the object relative to the position of the display apparatus based on the position of the display apparatus and the pickup direction in response to the instruction signal. The reporting section reports the position of the object. | 02-14-2013 |
20130038523 | Character Input Device, Character Input Device Control Method, And Information Storage Medium - Methods and apparatus provide for displaying a plurality of character key images in a first display area on the computing device, each character key image representing and being associated with at least one of the character groups, each character group including a plurality of characters; and displaying, in a case where one of the character key images is selected by a user of the computing device, a plurality of input candidate characters in a second display area on the computing device, the plurality of input candidate characters including at least some of the plurality of characters in the character group associated with the selected one of the character key images, the second display area being set in a partially overlapping manner over the first display area. | 02-14-2013 |
20130038524 | IMAGE PICKUP DEVICE AND PROJECTOR - An image pickup device includes an image pickup element which captures an image of a projection surface, an optical filter which has higher transmissivity for infrared light than for visible light, a visible light transmitting member whose transmissivity for visible light is higher than the corresponding transmissivity of the optical filter, and a switching unit which switches between a first condition where the optical filter is disposed on an optical path of light entering the image pickup element, and a second condition where the visible light transmitting member is disposed on the optical path. | 02-14-2013 |
20130038525 | VEHICULAR DISPLAY SYSTEM AND A METHOD FOR CONTROLLING THE DISPLAY SYSTEM - A vehicular control system including a plurality of displays, each including a physical display unit and a memory module containing information related to descriptions of a plurality of displayable entities and first configuration data, associated with a first control module. The first configuration data includes information on the configuration of the displayable entities, based on references to the descriptions of the plurality of displayable entities. The memory module of each display comprises a copy of the first configuration data. The first control module is arranged to transmit data based on point-to-multipoint communication to each of the plurality of displays. Each of the plurality of displays includes a processor operatively coupled to the memory module. The processor is arranged to process the entities based on received data from the first control module and to present the result of the processing on the physical display unit of each display. | 02-14-2013 |
20130038526 | DOMESTIC APPLIANCE DEVICE - A household appliance apparatus includes a display screen and an input unit which has at least one substantially transparent input device. In order to achieve advantageous ease of use with little design expenditure, the input device covers only a partial region of the display screen. The input device may hereby be configured as a conductive layer. | 02-14-2013 |
20130038527 | VIDEO DATA PROCESSING APPARATUS AND METHOD - An information processing apparatus is provided. | 02-14-2013 |
20130038528 | Method and Apparatus for User Interface Communication with an Image Manipulator - A system, and method for use thereof, for image manipulation. The system may generate an original image in a three dimensional coordinate system. A sensing system may sense a user interaction with the image. The sensed user interaction may be correlated with the three dimensional coordinate system. The correlated user interaction may be used to project an updated image, where the updated image may be a distorted version of the original image. The image distortion may be in the form of a twisting, bending, cutting, displacement, or squeezing. The system may be used for interconnecting or communicating between two or more components connected to an interconnection medium (e.g., a bus) within a single computer or digital data processing system. | 02-14-2013 |
20130044049 | ELECTROACTIVE POLYMER TRANSDUCERS FOR TACTILE FEEDBACK DEVICES - Electroactive transducers as well as methods of producing a haptic effect in a user interface device simultaneously with a sound generated by a separately generated audio signal and electroactive polymer transducers for sensory feedback applications in user interface devices are disclosed. | 02-21-2013 |
20130044050 | Causing Display of Comments Associated with an Object - Apparatus is configured to cause to be displayed, in a first area of a display, an object and to cause to be displayed, in a second area of the display, a first comment associated with the object, wherein the second area is in a fixed location relative to the first area. The apparatus is responsive to a first dynamic tactile user input within the second area of the display to cause the first comment to be at least partially hidden and to cause to be displayed, in the second area of the display, a second comment that was not visible prior to the first dynamic tactile user input without moving the object on the display. | 02-21-2013 |
20130044051 | IMAGE DISPLAY DEVICE AND METHOD FOR OPERATING THE SAME - An image display device and a method for operating the same are provided. In the method for operating the image display device which can perform near field communication with a mobile terminal, an image is displayed on a display, device information including motion information or position information of the mobile terminal is received based on the near field communication, and a corresponding menu is displayed on the display or a corresponding operation is performed according to the received motion information or position information. This method can improve user convenience when the image display device is used. | 02-21-2013 |
20130044052 | APPARATUS TO RECOGNIZE A STRAIN IN A FLEXIBLE DISPLAY - An apparatus to recognize a strain in a flexible display includes a recognition unit to include a first panel and a second panel that are formed of an Indium Tin Oxide (ITO) film, which is a transparent conductive film having a uniform electric constant, and an adhesion layer disposed between the first panel and the second panel, in which the recognition unit is connected to the flexible display and outputs an electric potential value according to the strain in the flexible display; a memory to store operation pattern information that corresponds to a state of the strain of the flexible display; and a control unit to determine the state of the strain according to the electric potential value, and to execute an operation corresponding to the operation pattern information. | 02-21-2013 |
20130050071 | THREE-DIMENSIONAL IMAGE PROCESSING APPARATUS AND THREE-DIMENSIONAL IMAGE PROCESSING METHOD - In one embodiment, a three-dimensional image processing apparatus includes: an imaging module configured to image a field including a front of a display, the display displaying a three-dimensional image; and a controller configured to control the display to display an image imaged by the imaging module and a field where the three-dimensional image is recognizable as a three-dimensional body. | 02-28-2013 |
20130050072 | DISPLAY DEVICE, CONTROL METHOD OF DISPLAY DEVICE AND PROGRAM - A display device includes a display unit to display a video screen on a display surface, an information detection unit to detect position information of a plurality of detection points arranged on a stationery tool imitating the shape of a stationery item, and identification information added to the plurality of detection points respectively and different among the plurality of detection points, and an angle determination unit to determine a rotation angle of the stationery tool on the display surface based on the position information of the plurality of detection points. | 02-28-2013 |
20130057465 | IMAGE DISPLAY APPARATUS, REMOTE CONTROLLER, AND METHOD FOR OPERATING THE SAME - A method for operating an image display apparatus is discussed. The method includes displaying an InfraRed (IR) blaster menu screen, receiving a selection input for selecting one of electronic devices included in the IR blaster menu screen, and transmitting IR format key information about the selected electronic device or device information about the selected electronic device to a remote controller according to the selection input. | 03-07-2013 |
20130057466 | PROJECTION SYSTEM, PROJECTION APPARATUS, SENSOR DEVICE, POWER GENERATION CONTROL METHOD, AND COMPUTER PROGRAM PRODUCT - A projection system includes a projector that projects an image onto a projection plane of a screen, and sensor units each including a photoelectric power generating unit that is installed at a predetermined position in an area in which an image is projected on the projection plane and that generates power corresponding to an intensity of projection light projected by the projector. The projector may include an image processing circuit that converts at least image data projected at the installation positions of the sensor units in image data projected on the projection plane into white image data or converts whole image data into white image data. | 03-07-2013 |
20130057467 | COMMUNICATIONS WITH A HAPTIC INTERFACE DEVICE FROM A HOST COMPUTER - The present invention comprises methods and apparatuses that can provide reliable communications between a computer and a haptic interface device. The methods and apparatuses can provide communication that is more secure against errors, failures, or tampering than previous approaches. Haptic devices allow a user to communicate with computer applications using the user's sense of touch, for example by applying and sensing forces with the haptic device. The host computer must be able to communicate with the haptic device in a robust and safe manner. The present invention includes a novel method of accomplishing such communication; a computer-readable medium that, when applied to a computer, causes the computer to communicate according to such a method; and a computer system having a host computer and a haptic device communicating according to such a method. | 03-07-2013 |
20130057468 | DATA OUTPUT DEVICE, DISPLAY DEVICE, DISPLAY METHOD AND REMOTE CONTROL DEVICE - Unit data, which constitutes digital data, is extracted as 8-bit units of parallel data by a control section and outputted to a buffer. Thereafter, in a process where the unit data is transmitted from a serial interface to a display unit, the unit data is converted to parallel data in a format required by the display unit. Thereby, it becomes unnecessary for the control section to perform processing of converting the digital data to a format required by the display unit. Consequently, the load on the control section is reduced. | 03-07-2013 |
20130057469 | GESTURE RECOGNITION DEVICE, METHOD, PROGRAM, AND COMPUTER-READABLE MEDIUM UPON WHICH PROGRAM IS STORED - The present invention provides a gesture recognition device which can accurately recognize a user's gesture in a free space with a simple configuration, and which is mounted on a processing unit and which causes the processing unit to execute an operation corresponding to the recognized gesture. | 03-07-2013 |
20130063336 | VEHICLE USER INTERFACE SYSTEM - A driver can point at a vehicle display using a hand on the steering wheel. The vehicle display may be located in the dashboard behind the steering wheel. The location on the display at which the driver is pointing is determined using sensors, and a cursor is displayed at this location. Finger movement is detected by the sensors and a user interface function is performed in response. The performed user interface functions may include the movement of the displayed cursor on the vehicle display, the display of additional vehicle information, the launching of an application, the interaction with an application, and the scrolling of displayed information. | 03-14-2013 |
20130063337 | APPARATUS AND METHOD FOR PROJECTOR NAVIGATION IN A HANDHELD PROJECTOR - A method and apparatus for navigating a projected image. The method includes projecting, by a projector, an image on a surface, the projected image comprising a first portion of a virtual image. The method also includes determining a movement of the projector, and in response to the movement of the projector, changing the projected image by projecting a second portion of the virtual image different from the first portion. | 03-14-2013 |
20130063338 | DISPLAY WITH SCREEN CAPTURE FUNCTION - A display includes an input interface, a circuit board, a display panel, and an external memory. The input interface is electrically connected to a computer, and receives an image signal from the computer. The image signal includes a number of frames of image data. The circuit board includes a main control chip, an image processing module, and a recall button. The main control chip drives the display panel to display the frames of image data sequentially. If the computer freezes, the main control chip will control the display panel to repeatedly display a final frame of image data. When the recall button is pressed, the image processing module codes/decodes a frame of image data currently displayed on the display panel to obtain image data in a predetermined format which is stored in the external memory. | 03-14-2013 |
20130063339 | APPARATUS FOR SELECTING MULTIMEDIA INFORMATION - An apparatus for selecting multimedia information containing an operating unit that comprises an actuation device movable back and forth into and out of a neutral position, producing an output signal depending on the shift position of the actuation device. The output signal causes display unit elements to move in a scrolling manner, where the elements are represented one after the other on a display unit in such a manner that each forward element covers the subsequent elements only partially, and the respective rearward elements are smaller than the respective forward elements. | 03-14-2013 |
20130063340 | EYE TRACKING CONTROL OF VEHICLE ENTERTAINMENT SYSTEMS - An in-flight entertainment system includes a video display unit facing a user seat. The video display unit includes a display surface that displays images to a user who is seated on the user seat. A light emitter illuminates eyes of the user. A camera outputs a video signal containing reflections from the illuminated eyes. A processor processes the video signal to determine a viewing location on the display surface at which the eyes are directed, and controls at least one function for how images are displayed on the display surface responsive to the determined viewing location. | 03-14-2013 |
20130063341 | DISPLAY DEVICE ADJUSTMENT SYSTEM, AND ADJUSTMENT METHOD - A system controller comprises a normal mode and a maintenance mode as its control modes, and switches the mode so that electronic control buttons are assigned the functions of calling a crew member, onboard communications, switching seat lighting on or off at each seat, etc., in normal mode, and the picture quality and brightness of a large monitor are adjusted in maintenance mode. | 03-14-2013 |
20130063342 | HUMAN INTERFACE INPUT ACCELERATION SYSTEM - A method and system for transmitting data to and from a hand-held host device are disclosed. An accessory device for interfacing with a host device includes a communication channel designed to establish a bidirectional data link between the accessory device and the host device. The accessory device also includes a storage unit communicatively coupled to the communication channel. The storage unit is designed to store various data. In addition, at least a first data is selectively transmitted from the stored data of the accessory device to the host device through the established bidirectional data link. | 03-14-2013 |
20130063343 | SENSOR MAPPING - Techniques, systems and computer program products are disclosed for providing sensor mapping. In one aspect, a method includes receiving input from a user. The received input includes at least one of motion, force and contact. In addition, a sensor signal is generated based on the received input. From a choice of data structures a data structure associated with a selected application having one or more functions is identified. The data structure indicates a relationship between the generated sensor signal and the one or more functions of the selected application. The generated sensor signal is selectively mapped into a control signal for controlling the one or more functions of the selected application by using the identified data structure. | 03-14-2013 |
20130063344 | METHOD AND DEVICE FOR THE REMOTE CONTROL OF TERMINAL UNITS - For the remote control of terminal units of consumer electronics and of computers by way of a remote control with integrated motion sensors, it is suggested to process the detected motion sequences in the remote control itself and interpret them as gestures to the effect that certain gestures correspond to certain commands which are transmitted from the remote control directly to the corresponding terminal unit or to a computer. | 03-14-2013 |
20130063345 | GESTURE INPUT DEVICE AND GESTURE INPUT METHOD - A gesture input device includes: a coordinate input detecting unit which sequentially detects coordinate set sequences of a user hand position; a gesture start detecting unit which detects a component indicating a first hand movement for starting a gesture, from a detected first coordinate sequence; a guide image generating unit which generates a gesture guide image for guiding the user to make a gesture including a second hand movement, when the first hand movement component is detected; an intended action component detecting unit which detects a second hand movement component as an intended action component, from a second coordinate sequence detected after the gesture guide image is displayed on the display screen; and a control signal generating unit which detects a component indicating a hand movement corresponding to the gesture from the second coordinate sequence when the intended action component is detected, and generate a control signal according to the detection result. | 03-14-2013 |
20130069858 | Adaptive communications system - This invention allows a system to monitor how quickly and accurately the user is responding via the input device. The input device can be a mouse, a keyboard, their voice, a touch-screen, a tablet PC writing instrument, a light pen or any other commercially available device used to input information from the user to the PBCD. Information is displayed on the PBCD screen based on how quickly and accurately the user is navigating with the input device. | 03-21-2013 |
20130069859 | ELASTIC CONTROL DEVICE AND APPARATUS - An apparatus including at least one processor, and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to receive deformation information from an elastic control device operated by a user. The apparatus is further configured to determine a control signal for the apparatus based on the deformation information, and to perform a function associated with the control signal. | 03-21-2013 |
20130069860 | Organizational Tools on a Multi-touch Display Device - A process for enabling objects displayed on a multi-input display device to be grouped together is disclosed that includes defining a target element that enables objects displayed on a multi-input display device to be grouped together through interaction with the target element. Engagement of an input mechanism with one of the target element and a particular one of the objects displayed on the multi-input display device is detected. Movement of the input mechanism is monitored while the input mechanism remains engaged with whichever one of the target element and the particular displayed object the input mechanism engaged. A determination is made that at least a portion of a particular displayed object is overlapping at least a portion of a target element on the multi-input display device upon detecting disengagement of the input mechanism. As a consequence of disengagement and the overlap, processes are invoked that establish a relationship between the particular displayed object and a position on the target element and that cause transformations applied to the target element also to be applied to the particular displayed object while maintaining the relationship between the particular displayed object and the position on the target element. | 03-21-2013 |
20130069861 | INTERFACE CONTROLLING APPARATUS AND METHOD USING FORCE - An interface controlling apparatus and method using a force may generate content control information by analyzing force input information received from at least one force sensor, and may control content based on the content control information. | 03-21-2013 |
20130069862 | REMOTE MOVEMENT GUIDANCE - In one example of a user-guidance system, a remote control may be configured to guide a user's physical movements by transmitting movement signals that are to be translated into haptic instructions, and cooperative actuators may be configured to be worn by the user to translate the movement signals received from the remote control into the haptic instructions. The movement signals may be translated into the haptic instructions for physical movements of any limb or extremity of the user in either of a vertical direction or a horizontal direction; after the first movement signal, the movement signals may be transmitted prior to completion of an immediately previous physical movement; and the movement signals may include horizontal and vertical directional signal components that indicate the horizontal and vertical direction for the user's next physical movement. The haptic instructions that are translated from the horizontal and vertical directional signal components may differ in either duration or magnitude. The movement signals may include horizontal directional signal components that indicate the horizontal direction for the user's next physical movement, and the haptic instructions that are translated from the horizontal directional signal components may differ in either duration or magnitude. | 03-21-2013 |
20130069863 | TACTILE FEEDBACK APPARATUS, SYSTEM, AND METHOD OF OPERATING TACTILE FEEDBACK APPARATUS - A tactile feedback apparatus, system, and a method of operating the tactile feedback apparatus. The tactile feedback apparatus may detect a finger of a user touching a disk unit, determine a height at which the disk unit is supported, based on a signal generated by a sensor, and support a lower portion of the disk unit by controlling N driving units to be set at the determined height, thereby providing power sensed by the sensor to the finger of the user touching the disk unit. | 03-21-2013 |
20130069864 | DISPLAY APPARATUS, DISPLAY METHOD, AND PROGRAM - There is provided a display apparatus including an observation position detection unit for detecting an observation position of an observer, a generation phase determination portion for determining a generation phase for each viewpoint of a multi-viewpoint image for a plurality of viewpoints depending on the detected observation position, a multi-viewpoint image generation unit for generating a viewpoint image for each viewpoint from a predetermined image depending on the determined generation phase, a display device configured to include a display area having a plurality of pixels arranged thereon, for displaying the viewpoint image for each viewpoint such that the viewpoint image can be observed in each of a plurality of observation areas, and a light beam controller configured to be disposed in front of or behind the display device, for restricting a direction of light beams emitted from the display device or incident on the display device. | 03-21-2013 |
20130069865 | REMOTE DISPLAY - A remote display system including a portable display that wirelessly receives data and power from a primary station. The primary station, which is remote from and without a tangible connection with the portable display, includes a data transmitting element and a power transmitting element. The portable display includes a power receiving element that receives power wirelessly from the power transmitting element and a data receiving element operable to receive data from the data transmitting element. | 03-21-2013 |
20130069866 | SELECTABLE COMMUNICATION INTERFACE CONFIGURATIONS FOR MOTION SENSING DEVICE - Selectable communication interface configurations for motion sensing devices. In one aspect, a module for a motion sensing device includes a motion processor connected to a device component and a first motion sensor, and a multiplexer having first and second positions. Only one of the multiplexer positions is selectable at a time, where the first position selectively couples the first motion sensor and the device component using a first bus, and the second position selectively couples the first motion sensor and the motion processor using a second bus, wherein communication of information over the second bus does not influence a communication bandwidth of the first bus. | 03-21-2013 |
20130069867 | INFORMATION PROCESSING APPARATUS AND METHOD AND PROGRAM - An apparatus and method provide logic for providing gestural control. In one implementation, an apparatus includes a receiving unit configured to receive a first spatial position associated with a first portion of a human body, and a second spatial position associated with a second portion of the human body. An identification unit is configured to identify a group of objects based on at least the first spatial position, and a selection unit is configured to select an object of the identified group based on the second spatial position. | 03-21-2013 |
20130076612 | ELECTRONIC DEVICE WITH WRAP AROUND DISPLAY - A consumer electronic product includes at least a transparent housing and a flexible display assembly enclosed within the transparent housing. In the described embodiment, the flexible display assembly is configured to present visual content at any portion of the transparent housing. | 03-28-2013 |
20130076613 | POWERED MARKING APPARATUS FOR POINTING CONTROL - Certain aspects relate to a cordless powered light marking apparatus that can be used with a pointing device and information processor to enable a program to be executed by the information processor. In certain embodiments, the cordless light marking apparatuses are “solar-powered.” In some cases, the cordless light marking apparatus may be self-powered in a different manner, such as via rechargeable batteries (e.g., lithium ion), alkaline batteries, etc. Other aspects relate to applications of the cordless light marking apparatuses. For example, multiple display devices connected directly or indirectly to one information processor may each have a light marking apparatus, enabling multiple players to interact substantially simultaneously with the same information processor through different display devices and different pointing devices. Further, the information processor may be accessed at different times through different display devices, without needing to move the processor or light marking apparatus. | 03-28-2013 |
20130076614 | ACCESSORY DEVICE - Accurate and reliable techniques for determining information of an accessory device in relation to an electronic device are described. | 03-28-2013 |
20130076615 | INTERFACE METHOD AND APPARATUS FOR INPUTTING INFORMATION WITH AIR FINGER GESTURE - An interface apparatus and method are disclosed for inputting information with a user's finger gesture while the user is driving with his two hands on a steering wheel. In one aspect, the interface apparatus includes a gesture sensor, a gesture processor, and a head-up display (HUD), where the gesture sensor and the gesture processor recognize and interpret the information input by the user's finger gesture, and such information is displayed on the HUD. In one embodiment, a plurality of point of interest (POI) icons are displayed on the HUD after the user inputs at least one letter into the system, and the user can select the POI by his/her finger gesture. In another embodiment, the gesture sensor can recognize the user's finger gesture in non-alphabet characters. | 03-28-2013 |
20130076616 | ADAPTIVE TRACKING SYSTEM FOR SPATIAL INPUT DEVICES - An adaptive tracking system for spatial input devices provides real-time tracking of spatial input devices for human-computer interaction in a Spatial Operating Environment (SOE). The components of an SOE include gestural input/output; network-based data representation, transit, and interchange; and spatially conformed display mesh. The SOE comprises a workspace occupied by one or more users, a set of screens which provide the users with visual feedback, and a gestural control system which translates user motions into command inputs. Users perform gestures with body parts and/or physical pointing devices, and the system translates those gestures into actions such as pointing, dragging, selecting, or other direct manipulations. The tracking system provides the requisite data for creating an immersive environment by maintaining a model of the spatial relationships between users, screens, pointing devices, and other physical objects within the workspace. | 03-28-2013 |
20130076617 | ADAPTIVE TRACKING SYSTEM FOR SPATIAL INPUT DEVICES - An adaptive tracking system for spatial input devices provides real-time tracking of spatial input devices for human-computer interaction in a Spatial Operating Environment (SOE). The components of an SOE include gestural input/output; network-based data representation, transit, and interchange; and spatially conformed display mesh. The SOE comprises a workspace occupied by one or more users, a set of screens which provide the users with visual feedback, and a gestural control system which translates user motions into command inputs. Users perform gestures with body parts and/or physical pointing devices, and the system translates those gestures into actions such as pointing, dragging, selecting, or other direct manipulations. The tracking system provides the requisite data for creating an immersive environment by maintaining a model of the spatial relationships between users, screens, pointing devices, and other physical objects within the workspace. | 03-28-2013 |
20130076618 | COMPUTER-READABLE STORAGE MEDIUM HAVING STORED THEREIN DISPLAY CONTROL PROGRAM, DISPLAY CONTROL SYSTEM, DISPLAY CONTROL APPARATUS, AND DISPLAY CONTROL METHOD - An exemplary information processing apparatus selectively switches between: first control where control is performed such that, in a virtual space, a position of producing no parallax on a screen of a stereoscopic display is a first position near a predetermined object; and second control where control is performed such that the position of producing no parallax is closer to a viewpoint position of virtual cameras than the first position is. | 03-28-2013 |
20130076619 | Methods and Apparatus for Freeform Deformation of 3-D Models - Methods and apparatus for interactive curve-based freeform deformation of three-dimensional (3-D) models may provide a user interface that allows a user to interactively deform 3-D models based on simple and intuitive manipulations of a curve drawn on the model (i.e., freeform deformation). The user may apply freeform deformations using touch and/or multitouch gestures to specify and manipulate a deformation curve. The deformations may be applied by deforming the space around a curve/sweep path and deforming the 3-D model accordingly. The freeform deformation methods are not dependent on manipulation of a fixed set of parameters to perform deformations, and may provide for both local and global deformation. One or more weights and user interface elements for controlling those weights may be provided that allow the user to control the extent (region of influence) of the freeform deformations along the curve and/or perpendicular to the curve. | 03-28-2013 |
20130076620 | PROJECTION APPARATUS, PROJECTION CONTROL METHOD AND STORAGE MEDIUM STORING PROGRAM - A projector apparatus includes a light source, a projection unit to form an optical image by using the light source and to project the formed image, an input unit to input a video signal representing the image, a person detection unit to detect a person existing in a projection area, a first projection control unit to set the projector apparatus to a projection state using a preset illumination pattern, when a person has been detected in the projection area, a motion analyzing unit to analyze a person's motion in the projection area, and a second projection control unit to activate the motion analyzing unit, after setting the projector apparatus to the projection state using the preset illumination pattern. | 03-28-2013 |
20130076621 | DISPLAY APPARATUS AND CONTROL METHOD THEREOF - A display apparatus includes: an image receiving unit which receives a content; a display unit which displays the received content; an image pickup unit which captures images of a user; a storage unit which stores the content and at least one of the captured images of the user; and a control unit which displays a portion of the content with the at least one captured image from among the captured images of the user. | 03-28-2013 |
20130076622 | METHOD AND APPARATUS FOR DETERMINING INPUT - An apparatus is disclosed, comprising a processor and a memory including computer program code, the memory and the computer program code configured to, working with the processor, cause the apparatus to perform at least the following: receiving a first image, recognizing at least part of the first image as a command receiver, recognizing at least part of the first image as an input article, determining that at least part of the input article is associated with at least part of the command receiver, and causing display of a guidance associated with the command receiver. | 03-28-2013 |
20130082916 | METHODS, APPARATUSES, AND COMPUTER PROGRAM PRODUCTS FOR IMPROVING DEVICE BEHAVIOR BASED ON USER INTERACTION - Methods, apparatuses, and computer program products are herein provided for improving operation of a device based upon user interaction. A method may include receiving user input. The method may further include determining a user state value based at least in part on the received user input, wherein the user state value corresponds to a patience level of the user with the current rate of operation of the device. The method may further include causing modification in the operation of the device based at least in part on comparison of the user state value to a threshold user state value. Corresponding apparatuses and computer program products are also provided. | 04-04-2013 |
20130082917 | APPARATUS CONFIGURED TO HAVE MULTIPLE USER INTERFACES AND METHOD - An apparatus comprising: a controller; a first part; a second part; an arrangement configured to add a third part between the first part and the second part to change a configuration of the apparatus from a first configuration in which the first part and the second part are adjacent and there is no third part between the first part and the second part, to a second configuration in which the first part and the second part are separated by the third part and the first part and the third part are adjacent, wherein the controller is configured to pair the first part and the second part as a user interface when the apparatus is in the first configuration, and is configured to pair the first part and the third part as a user interface when the apparatus is in the second configuration. | 04-04-2013 |
20130082918 | METHOD AND APPARATUS PERTAINING TO RESPONSIVELY CHANGING APPLICATION FUNCTIONALITY OF AN ELECTRONIC DEVICE - Detection of a change in physical configuration of a discrete device with respect to an electronic device leads to responsively changing application functionality of the electronic device as a function, at least in part, of information provided by the discrete device. By one approach the detected change can pertain to movement of the discrete device with respect to the electronic device or orientation of a coupling of the discrete device to the electronic device. This information can comprise at least a unique identification code. In such a case these teachings will accommodate using the unique identification code to access a corresponding profile for the discrete device where the profile specifies at least one application to be presently made available using the discrete device. | 04-04-2013 |
20130082919 | METHOD AND APPARATUS PERTAINING TO AUTOMATED FUNCTIONALITY BASED UPON DETECTED INTERACTION BETWEEN DEVICES - Detection of a physical interaction between a first device that is logically coupled to a second device, wherein the physical interaction comprises one of a plurality of physical interactions that involve movement of one of the first device and the second device, results in automatically performing a function that corresponds to the physical interaction. The detected physical interaction can comprise one or more of a physical reorientation of the first device, a pivoting movement between the first device and the second device, a sliding movement between the first device and the second device, and a momentary change in physical proximity of the first device with respect to the second device. This activity can also include determining whether the physical interaction occurs within a predetermined period of time, and when the physical interaction does not occur within the predetermined period of time, prohibiting the automatic performing of the function. | 04-04-2013 |
20130082920 | CONTENT-DRIVEN INPUT APPARATUS AND METHOD FOR CONTROLLING ELECTRONIC DEVICES - A content-driven apparatus for controlling electronic devices integrates all control command functions for at least one controlled electronic device. Content information transmitted from one of the controlled electronic devices is received by a communication module and is passed to a processing element for parsing the content information, including the type of the content information, the desired command actions to be performed, the controlled electronic devices required to cooperate, and how the user is to operate. The processing element decides a user interface and an operation method for the user after the parsing, and issues corresponding control messages to the controlled electronic devices required to cooperate after the user uses the operation method to select specific control commands. | 04-04-2013 |
20130082921 | BARRIER PANEL, AND 3D IMAGE DISPLAY DEVICE AND METHOD USING THE SAME - A barrier panel, and a 3D image display device and method are provided. The barrier panel is disposed corresponding to an image panel including a plurality of pixels and is configured such that M×N barrier regions of the barrier panel are formed at each pixel of the image panel. The brightness of each barrier region is adjusted to display a 3D image from the image panel. | 04-04-2013 |
20130082922 | TACTILE GLOVE FOR HUMAN-COMPUTER INTERACTION - One embodiment is directed to a system for human-computer interface, comprising an input device configured to provide two or more dimensions of operational input to a processor based at least in part upon a rubbing contact pattern between two or more digits of the same human hand that is interpreted by the input device. The input device may be configured to provide two orthogonal dimensions of operational input pertinent to a three-dimensional virtual environment presented, at least in part, by the processor. The input device may be configured to detect the rubbing of a specific digit in a pen-like function against one or more other digits. The input device further may be configured to detect the rubbing between the specific digit in a pen-like function and one or more other digits in a receiving panel function. | 04-04-2013 |
20130088419 | DEVICE AND CONTROL METHOD THEREOF - A device and a control method for the device are disclosed. A device and a control method for the device according to the present invention comprise a sensing unit and a controller which, if at least one second control command is received through the sensing unit while a control operation is carried out based on at least one first control command received through the sensing unit, generates a display signal based on a control command selected according to a predetermined criterion from among the received first and second control commands. According to the present invention, a control command for generating a display signal can be effectively selected in the case that another control command is received while a particular control command is being carried out. | 04-11-2013 |
20130088420 | METHOD AND APPARATUS FOR DISPLAYING IMAGE BASED ON USER LOCATION - A method and an apparatus for displaying an image based on a user location are provided, allowing a user to view a displayed image similar to the original image even when viewing it from any location. The method includes receiving a location value indicating a location of the user while displaying an original image, transforming the original image based on the received location value, and displaying the transformed image. | 04-11-2013 |
20130088421 | DISPLAY CONTROL DEVICE - Provided is a display control device including a communication unit that performs communication with an external device, a detecting unit that detects a direction of an operation device based on detection information representing motion of the operation device, and a display control unit that generates an imaging control command used to control an imaging operation based on the detected direction of the operation device, and controls display of an image obtained through an imaging operation by causing the communication unit to transmit the imaging control command to the external device. | 04-11-2013 |
20130088422 | INPUT APPARATUS AND INPUT RECOGNITION METHOD - Provided is an input apparatus, including: an infrared camera; an image capture unit configured to sequentially capture a plurality of temperature distribution images photographed at predetermined time intervals by the infrared camera; and an input recognition unit configured to detect, from among the plurality of temperature distribution images captured by the image capture unit, pairs of skin temperature image portions each corresponding to a temperature of skin of a person, recognize, from among the pairs of skin temperature image portions thus detected, pairs of skin temperature image portions as pairs of detection target images, from among which motions are observed, and recognize an operation input based on states of the motions of the pairs of detection target images. | 04-11-2013 |
20130088423 | KEY INPUT APPARATUS FOR PORTABLE TERMINAL - A key input apparatus and a portable terminal having the key input apparatus are provided. The key input apparatus includes a main board, a display having flexibility and elasticity and at least one pressing switch placed between the main board and the display. | 04-11-2013 |
20130088424 | DEVICE AND METHOD FOR PROCESSING VIRTUAL WORLDS - A device and method for processing virtual worlds. According to embodiments of the present disclosure, information which is measured from the real world using characteristics of a sensor is transferred to a virtual world, to thereby implement an interaction between the real world and the virtual world. The disclosed device and method for processing virtual worlds involve selectively transferring information, from among the measured information, which is different from previously measured information. The disclosed device and method for processing virtual worlds involve transferring the entire measured information in the event that the measured information is significantly different from the previously measured information, and selectively transferring information, from among the measured information, which is different from the previously measured information in the event that the difference is not significant. | 04-11-2013 |
20130088425 | APPARATUS AND METHOD OF DETECTING AN INPUT POSITION WITH DISPLAY PATTERN RECOGNITION - An apparatus and method are provided for detecting an input position using display pattern recognition. The apparatus includes an effective pattern area extractor for receiving an image of a display screen captured by a camera and extracting an effective pattern area for pattern recognition from the captured image of the display screen; a pattern recognizer for detecting subpixels included in the effective pattern area and identifying a plurality of holes included in each of the subpixels; and a display coordinate calculator for detecting an input position based on points at which the plurality of holes included in the each of the subpixels are formed. | 04-11-2013 |
20130088426 | GESTURE RECOGNITION DEVICE, GESTURE RECOGNITION METHOD, AND PROGRAM - There is provided a gesture recognition device that recognizes a gesture of shielding the front side of the imaging sensor, the gesture recognition device including a first detection unit that detects a change in a captured image between a state in which a front side of an imaging sensor is not shielded and a state in which the front side of the imaging sensor is shielded, and a second detection unit that detects a region in which a gradient of a luminance value of the captured image is less than a threshold value in the captured image in the state in which the front side of the imaging sensor is shielded. | 04-11-2013 |
20130093659 | AUTOMATIC ADJUSTMENT LOGICAL POSITIONS OF MULTIPLE SCREEN - A method and a computer system are provided for automatically setting the logical positions of multiple screen displays. A computer system may comprise a plurality of display devices, at least one image capturing device, and a controller. The controller may be coupled to the display devices and the image capturing devices. An adjustment module may be adapted to adjust the plurality of display settings. | 04-18-2013 |
20130093660 | METHOD AND SYSTEM TO CONTROL A PROCESS WITH BEND MOVEMENTS - Various embodiments include devices, methods, circuits, data structures, and software that allow for control of a process through detecting of movements in bends in a flexible display device. In an example, a method for controlling a process through bend movements can include detecting movement of a bend in a deformable display panel and modifying a process running on a computing device in response to detecting the movement of the bend. | 04-18-2013 |
20130093661 | METHODS AND APPARATUS FOR FACILITATING USER INTERACTION WITH A SEE-THROUGH DISPLAY - Methods and apparatus are provided in order to facilitate user interaction with an electronic device, such as a see-through display. In the context of a method, a reference plane may be determined based upon a fiducial marker presented upon a display of a mobile terminal. The method may also cause interaction information to be displayed in relation to the reference plane such that the interaction information appears to be presented upon the display of the mobile terminal and such that the interaction information at least partially occludes a user's view of the fiducial marker presented upon the display of the mobile terminal. The method may also include receiving input responsive to user input to the mobile terminal in relation to the interaction information and causing performance of an operation in response to the input based upon the interaction information. | 04-18-2013 |
20130093662 | SYSTEM AND METHOD OF MODE-SWITCHING FOR A COMPUTING DEVICE - A first device such as a portable computing device can be configured to act as a text-entry device (in a text-entry mode) and a cursor control device (in a cursor control mode) for a second device. The first device can include a touch-sensitive display capable of receiving text inputs and cursor inputs for controlling the display of a second device which is communicatively coupled to the first device. The first device can be configured such that selection of certain items displayed by the second device can cause the first device to switch from a text-entry mode to a cursor control mode. The first device can be configured such that rotation of the device between a landscape orientation and a portrait orientation causes the device to switch between modes. The first device can be configured such that sideways movement of the device causes the device to switch between modes. | 04-18-2013 |
20130093663 | LIGHT DEFLECTOR AND LIQUID CRYSTAL DISPLAY DEVICE USING THE SAME - A light deflector capable of deflecting light in a predetermined deflection direction and modulating the angle of deflection of light includes a plurality of liquid crystal deflection elements arranged in the predetermined deflection direction. In at least one pair of adjacent liquid crystal deflection elements, the dimension of one of the liquid crystal deflection elements in the predetermined deflection direction is different from the dimension of the other liquid crystal deflection element in the predetermined deflection direction. | 04-18-2013 |
20130093664 | DRAWING DEVICE, DRAWING CONTROL METHOD, AND DRAWING CONTROL PROGRAM FOR DRAWING GRAPHICS IN ACCORDANCE WITH INPUT THROUGH INPUT DEVICE THAT ALLOWS FOR INPUT AT MULTIPLE POINTS - A game device includes: an input position acquiring unit for acquiring the position of an input entry to a touch panel capable of concurrently detecting input entries at a plurality of points; a drawing start determination unit for determining, when the respective positions of concurrent input entries to the touch panel at two points are acquired, to start drawing a graphic if a distance between the two points is within a first range; a drawing unit for drawing, while the input entries at the two points continue to be input after drawing a graphic is started, a graphic calculated based on the respective movement trajectories of the two points and displaying the graphic on a display device; and a drawing end determination unit for determining to end drawing the graphic when the input of the input entries at the two points ends. | 04-18-2013 |
20130093665 | METHOD AND APPARATUS FOR PROCESSING VIRTUAL WORLD - A virtual world processing apparatus and method. An angle value is obtained by measuring an angle of a body part of a user in the real world using sensor capability, which is information on the capability of a bending sensor, and is transmitted to a virtual world, thereby achieving interaction between the real world and the virtual world. In addition, based on the sensor capability and the angle value denoting the angle of the body part, control information is generated to control a part of an avatar of the virtual world corresponding to the body part, and then transmitted to the virtual world. Accordingly, interaction between the real world and the virtual world is achieved. | 04-18-2013 |
20130093666 | PROJECTOR AND IMAGE DRAWING METHOD - A projector includes a recognition section adapted to recognize an image to be drawn and a drawing area in which the image is drawn based on a trajectory of a symbol written to a projection surface, an acquisition section adapted to acquire image data representing the image recognized by the recognition section, and a drawing control section adapted to perform control so that the image represented by the image data acquired by the acquisition section is drawn in the drawing area of the projection surface recognized by the recognition section. | 04-18-2013 |
20130093667 | METHODS AND DEVICES FOR MANAGING VIEWS DISPLAYED ON AN ELECTRONIC DEVICE - A mobile device is disclosed. The mobile device may include a display device configured to generate a first graphical configuration at a first orientation of the mobile device and a second graphical configuration at a second orientation of the mobile device, at least one input device, and an input linking module configured to selectively link the at least one input device to either the first or second graphical configuration based on an orientation of the mobile device. | 04-18-2013 |
20130093668 | METHODS AND APPARATUS FOR TRANSMITTING/RECEIVING CALLIGRAPHED WRITING MESSAGE - Methods and apparatus are provided for transmitting and receiving a calligraphed writing message. Writing data is received as input at a transmitting apparatus. The writing data is sampled to generate character frame data having a plurality of point data. Calligraphy control point data is generated using the character frame data. The calligraphed writing message including the calligraphy control point data is generated and transmitted to a receiving apparatus. A calligraphy outline is generated at the receiving apparatus for generation of a calligraphed writing image using the calligraphy control point data from the calligraphed writing message. Graphic processing is performed on the calligraphy outline. The calligraphed writing image is generated and displayed. | 04-18-2013 |
20130093669 | APPARATUS AND METHOD FOR CONTROLLING SCREEN IN WIRELESS TERMINAL - An apparatus and method for controlling a screen in a wireless terminal including a bendable display unit for easily controlling the screen, the apparatus including a bendable display unit, a bending detector for detecting bending of the bendable display unit, and a controller for controlling the bendable display unit to up-scale and display data in a non-bent screen region on the bendable display unit when the bendable display unit is bent. | 04-18-2013 |
20130093670 | IMAGE GENERATION DEVICE, METHOD, AND INTEGRATED CIRCUIT - When a television screen is split into sub-screens and the sub-screens are allocated to a plurality of operators, a television appropriately controls display positions and sizes of the sub-screens according to a position relationship between the operators, distances between the operators and the sub-screens, and rearrangement of the positions of the operators. Specifically, the television includes an external information obtaining unit that obtains position information items of the gesture operations and a generation unit that generates an image in a layout set based on a relative position relationship between the position information items. | 04-18-2013 |
20130100008 | Haptic Response Module - Embodiments provide an apparatus that includes a tracking sensor to track movement of a hand behind a display, such that a virtual object may be output via a display, and a haptic response module to output a stream of gas based on a determination that the virtual object has interacted with a portion of the image. | 04-25-2013 |
20130100009 | MULTI-USER INTERACTION WITH HANDHELD PROJECTORS - A device-mounted infrared (IR) camera and a hybrid visible/IR light projector may be used to project visible animation sequences along with invisible (to the naked eye) tracking signals (e.g., a near-IR fiducial marker). A software (or firmware) platform (either on the handheld device or on a system with which the device communicates) may track multiple, independent projected images in relation to one another using the tracking signals when projected in the near-IR spectrum. The resulting system allows a broad range of new interaction scenarios for multiple handheld projectors being used in any environment with adequate lighting conditions without requiring the environment to be instrumented in any way. | 04-25-2013 |
20130100010 | Control Method and System of Brain Computer Interface with Stepping Delay Flickering Sequence - A control method of a brain computer interface (BCI) with a stepping delay flickering sequence is provided. First, a plurality of different flickering sequence signals are generated by encoding a static flashing segment and a plurality of stepping delay flashing segments divided in different time sequences. Then, a plurality of target images corresponding to the flickering sequence signals are displayed. Thereafter, a response signal generated by an organism evoked by the target images is acquired. Then, signal processing is performed on the response signal by using a mathematic method to distinguish which one of the target images is being gazed at by the organism. Thereafter, a controlling command corresponding to one of the target images is generated. A control system of a BCI with a stepping delay flickering sequence is also provided herein. | 04-25-2013 |
20130100011 | HUMAN-MACHINE INTERFACE DEVICE - A human-machine interface device suitable for being electrically connected to an electronic device. The human-machine interface device includes a flexible carrier having at least one flexible portion, a bending sensor, and a control module. The bending sensor is disposed on the flexible portion of the flexible carrier. The control module is disposed on the carrier, connected to the bending sensor, and electrically connected to the electronic device. A first operation signal from the bending sensor is transmitted to the electronic device through the control module so that the electronic device performs according to the first operation signal. | 04-25-2013 |
20130100012 | DISPLAY WITH DYNAMICALLY ADJUSTABLE DISPLAY MODE - A display mode, including an update rate and/or resolution of an electronic display device, is dynamically adjusted by computing a change rate metric (CRM) representative of an image change rate, and adjusting, based on the CRM, at least one of the update rate and the image resolution. The display device may be switched between at least two display modes, based on the CRM, which may be computed by comparing each pixel row in a first frame with a corresponding pixel row in a comparison frame, and determining a number of pixel rows that have changed. A cyclic redundancy check (CRC) value for each pixel row may be computed, and the CRM may then be computed by determining a number of pixel rows that have a changed CRC value with respect to a corresponding pixel row of the comparison frame. | 04-25-2013 |
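The change rate metric (CRM) in the abstract above lends itself to a short illustration: compute a CRC per pixel row, count how many rows changed versus the comparison frame, and pick a display mode from that count. This is only a minimal sketch, not the patented implementation; the function names, the row-of-bytes frame layout, and the mode-switch threshold are all hypothetical, with only the per-row CRC comparison and the changed-row count taken from the abstract.

```python
import zlib

def change_rate_metric(frame, comparison_frame):
    """Count pixel rows whose CRC differs from the comparison frame.

    Each frame is modeled as a list of rows, each row a bytes object
    of raw pixel data (a hypothetical layout for illustration).
    """
    changed = 0
    for row, prev_row in zip(frame, comparison_frame):
        if zlib.crc32(row) != zlib.crc32(prev_row):
            changed += 1
    return changed

def select_display_mode(crm, threshold=2):
    """Pick between two display modes based on the CRM.

    The mode names and threshold value are illustrative placeholders.
    """
    return "fast_update_mode" if crm >= threshold else "full_resolution_mode"
```

With a three-row frame in which two rows change, `change_rate_metric` returns 2, which the threshold above maps to the fast-update mode.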
20130100013 | PORTABLE TERMINAL AND METHOD OF SHARING A COMPONENT THEREOF - A portable terminal capable of sharing at least a portion of a component, and a method therefor, are provided. The portable terminal includes an interface unit configured to connect with another portable terminal and a controller configured to activate a sharing mode in which at least a portion of a component is shared with the other portable terminal when connected with the other portable terminal through the interface unit. | 04-25-2013 |
20130100014 | DEVICE FOR HANDLING BANKNOTES - The invention relates to a device ( | 04-25-2013 |
20130100015 | OPTICAL INPUT DEVICES - A system may be provided that includes computing equipment and an optical input accessory. The computing equipment may include an imaging system, storage and processing circuitry, a display, communications circuitry, and input-output devices. The optical input accessory may include one or more light sources and optical markers representing instrument components for multiple instruments. The optical markers may be printed, molded or otherwise formed on a surface of a housing structure. The light sources may each emit a particular type of light. The computing equipment may use the imaging system to track the relative locations of the light sources and to continuously capture images of the optical markers and of user input objects. The computing equipment may generate audio, video or other output based on user input data generated in response to detected changes in position of the user input object with respect to the optical markers in the images. | 04-25-2013 |
20130106682 | Context-sensitive query enrichment | 05-02-2013 |
20130106683 | Context-sensitive query enrichment | 05-02-2013 |
20130106684 | Wearable Device Assembly Having Athletic Functionality | 05-02-2013 |
20130106685 | Context-sensitive query enrichment | 05-02-2013 |
20130106686 | GESTURE PROCESSING FRAMEWORK | 05-02-2013 |
20130106687 | ELECTRONIC COMMUNICATION BETWEEN A GAMEPAD AND AN ELECTRONIC DEVICE | 05-02-2013 |
20130106688 | IMAGE INFORMATION PROCESSING APPARATUS | 05-02-2013 |
20130106689 | METHODS OF OPERATING SYSTEMS HAVING OPTICAL INPUT DEVICES | 05-02-2013 |
20130106690 | Dispensing System and User Interface | 05-02-2013 |
20130106691 | HAPTIC FEEDBACK SENSATIONS BASED ON AUDIO OUTPUT FROM COMPUTER DEVICES | 05-02-2013 |
20130106692 | Adaptive Projector | 05-02-2013 |
20130113696 | USER INTERFACE DISPLAYING FILTERED INFORMATION - A trigger event is set, based on information in an information flow. One or more actions are set to occur in response to occurrence of the trigger event. The information flow is received. The set one or more actions are performed upon occurrence of the trigger event, and a user interface is displayed based on the information flow. The actions include, but are not limited to, filtering display of information from the information flow in response to occurrence of the trigger event. | 05-09-2013 |
20130113697 | DISPLAY CONTROL DEVICE - A display control device is disclosed for at least a computer system to control one or more display devices. The display control device includes a processing unit, a memory unit, a connection module, and a video control unit. The memory unit records the display control data generated by the processing unit. The connection module is for connecting the display devices, and for transmitting the video data generated by the computer system when reading the display control data to the display devices. The video control unit is coupled to the computer system, the connection module, and the processing unit. The video control unit is controlled by the processing unit for selecting the display device connected with the connection module, and outputs the video data generated by the computer system to the selected display device through the connection module. Therefore, the computer system may communicate with display devices having different standards. | 05-09-2013 |
20130113698 | STORAGE MEDIUM, INPUT TERMINAL DEVICE, CONTROL SYSTEM, AND CONTROL METHOD - An input terminal device of one example includes a CPU, and the CPU transmits operation data to a game apparatus executing game processing in response to an operation by a user. Furthermore, when a TV remote control button provided to the input terminal device is operated, the CPU sets a TV control mode capable of operating a television, to thereby display a TV remote controller image on an LCD of the input terminal device. By using this image, the user inputs operation data of the television. | 05-09-2013 |
20130113699 | APPARATUS AND METHOD FOR CONTROLLING CONTROLLABLE DEVICE IN PORTABLE TERMINAL - An apparatus and method control a controllable device in a portable terminal. The apparatus includes a device register unit, an input unit, a memory unit, a gesture detector, a controller, and a contents transmitter. The device register unit registers controllable devices to partial regions of an output screen. The input unit generates input data for controllable device registration or user's input data for controllable device selection. The memory unit stores information of the controllable devices registered by the device register unit. The gesture detector senses a user's input capable of determining direction. The controller detects a controllable device corresponding to the user's input. The contents transmitter transmits contents to the controllable device detected by the controller. | 05-09-2013 |
20130113700 | OPERATING AND MONITORING SCREEN DISPLAYING APPARATUS AND METHOD - An operating and monitoring screen displaying apparatus according to the present disclosure selects a screen among a plurality of split display screens displaying information about a part of a process in a plant and displays the selected screen as an operating and monitoring screen. The apparatus includes a receipt unit which receives a selection of a screen to be displayed among a plurality of split display screens; a split screen display unit which displays the screen selected through the receipt unit; and a map display unit which displays a map in which process elements constituting the process are associated with display areas, and indicates, on the map, a display area associated with a process element included in the screen being displayed by the split screen display unit. | 05-09-2013 |
20130113701 | IMAGE GENERATION DEVICE - An image generation device | 05-09-2013 |
20130113702 | MOBILE TERMINAL AND OPERATION CONTROL METHOD - There is provided a mobile terminal making it possible to normally use applications developed for other mobile terminals performing different screen display mode switching control. In application section | 05-09-2013 |
20130120237 | Systems and Methods for Synthesizing High Fidelity Stroke Data for Lower Dimension Input Strokes - Systems and methods for synthesizing paintbrush strokes may use high fidelity pose data of reference strokes to supplement lower dimension stroke data. For example, 6DOF data representing reference strokes created by skilled artists may be captured and stored in a library of reference strokes. A query stroke made by a less skilled artist/user and/or made using input devices that do not provide 6DOF data may be processed by a stroke synthesis engine to produce an output stroke that follows the trajectory of the query stroke and includes pose data from one or more reference strokes. The stroke synthesis engine may construct feature vectors for samples of reference strokes and query strokes, select the best neighbor feature vector from the library for each query stroke sample, assign the pose data of the best neighbor to the query sample, and smooth the sequence of assigned poses to produce the synthesized output stroke. | 05-16-2013 |
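The stroke-synthesis pipeline in the abstract above (feature vectors, best-neighbor selection, pose assignment, smoothing) can be sketched compactly. This is an assumption-laden simplification, not the patented method: Euclidean nearest neighbor stands in for whatever matching the system uses, a scalar moving average stands in for smoothing the full 6DOF pose sequence, and all names are hypothetical.

```python
import math

def nearest_pose(query_feature, reference_library):
    """Return the pose of the reference sample whose feature vector is
    closest (Euclidean distance) to the query feature vector.

    reference_library is a list of (feature_vector, pose) pairs; the
    scalar pose stands in for the high-fidelity 6DOF data described.
    """
    best = min(reference_library,
               key=lambda entry: math.dist(query_feature, entry[0]))
    return best[1]

def smooth_poses(poses, window=3):
    """Moving-average smoothing of a sequence of scalar poses
    (a stand-in for smoothing each pose dimension independently)."""
    out = []
    for i in range(len(poses)):
        lo = max(0, i - window // 2)
        hi = min(len(poses), i + window // 2 + 1)
        out.append(sum(poses[lo:hi]) / (hi - lo))
    return out
```

Each query-stroke sample would be assigned its best neighbor's pose, after which the assigned sequence is smoothed before rendering the output stroke.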
20130120238 | LIGHT CONTROL METHOD AND LIGHTING DEVICE USING THE SAME - There is herein described a light output control method for controlling a lighting device by a motion of an object in an environment, the lighting device comprising a video sensor and a light-emitting unit, the light output control method comprising steps of emitting an infrared light onto at least a part of the object and at least a part of the environment, collecting the infrared light reflected by at least the part of the object and at least the part of the environment as a two-dimensional depth data sequence of the video sensor, computing the motion of the object by utilizing the two-dimensional depth data sequence, and controlling the light-emitting unit to change an attribute of the output light if the motion of the object complies with a predetermined condition. | 05-16-2013 |
20130120239 | INFORMATION PROCESSING DEVICE, CONTROL METHOD, AND PROGRAM - There is provided an information processing device including a display screen having flexibility; a deflection detection unit configured to detect deflection of the display screen; and a control unit configured to recognize a change in the deflection detected by the deflection detection unit as an on operation input and output a corresponding process command. | 05-16-2013 |
20130120240 | APPARATUS AND METHOD FOR CONTROLLING IMAGE DISPLAY DEPENDING ON MOVEMENT OF TERMINAL - A method for controlling image display based on movement of a terminal is provided. The method includes displaying any one image in an image list, when an image view feature runs; detecting movement of the terminal; and sequentially replacing and outputting images in the image list based on at least one of a moving direction and a moving angle of the terminal, upon detecting the movement of the terminal. | 05-16-2013 |
20130120241 | CONTROL DEVICE FOR PROVIDING A RECONFIGURABLE OPERATOR INTERFACE - A control device for providing a reconfigurable operator interface for a medical apparatus is provided. The control device comprises a control surface comprising at least one control actuator arranged on the control surface. The control device further comprises a plurality of modules arranged side-by-side and attached to each other via adjacent abutting side surfaces in a liquid-tight manner, wherein the plurality of modules comprise a first terminal module having a closing end surface and an abutting side surface opposite the closing end surface, and a second terminal module having an abutting side surface facing towards the abutting side surface of the first terminal module, and an opposite closing end surface. Operation of the control device is enabled when all the abutting side surfaces of the plurality of modules are in an attached state. | 05-16-2013 |
20130120242 | OPTICAL POINTING DEVICE AND ELECTRONIC DEVICE INCLUDING SAME - An optical pointing device is configured so that: an optical unit and a lens unit are integrally molded and form a portion of an optical cover; each of a light transmitting resin layer and the optical cover is made of a resin containing thermosetting resin as a main component; and a light shielding resin layer is made of a resin containing, as a main component, thermosetting resin and/or thermoplastic resin that has heat resistance. Thereby, the present invention provides an optical pointing device whose optical characteristics, reliability and heat resistance are excellent and whose number of component members is reduced. | 05-16-2013 |
20130120243 | DISPLAY APPARATUS AND CONTROL METHOD THEREOF - Disclosed are a display apparatus and a control method thereof, the display apparatus including: an image acquirer which acquires an image of a plurality of users; a display which displays the image acquired by the image acquirer; and a controller which selects a user making a predetermined gesture among the plurality of users in the image and controls the display apparatus to perform an operation corresponding to the selected user out of operations which are capable of being performed by the display apparatus when the image of the plurality of users is acquired through the image acquirer and the predetermined gesture is recognized from the acquired image. | 05-16-2013 |
20130120244 | Hand-Location Post-Process Refinement In A Tracking System - A tracking system having a depth camera tracks a user's body in a physical space and derives a model of the body, including an initial estimate of a hand position. Temporal smoothing is performed in which some latency is imposed when the initial estimate moves by less than a threshold level from frame to frame, while little or no latency is imposed when the movement is more than the threshold. The smoothed estimate is used to define a local volume for searching for a hand extremity to define a new hand position. Another process generates stabilized upper body points that can be used as reliable reference positions, such as by detecting and accounting for occlusions. The upper body points and a prior estimated hand position are used to define an arm vector. A search is made along the vector to detect a hand extremity to define a new hand position. | 05-16-2013 |
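The temporal smoothing described in the abstract above — latency imposed for small frame-to-frame moves, little or none for large ones — amounts to a threshold-gated blend. The sketch below is a one-dimensional illustration under assumed parameter values; the threshold, blend factor, and function name are hypothetical, and the real system operates on 3D hand positions from a depth camera.

```python
def smooth_hand_position(prev_smoothed, raw, threshold=0.05, latency=0.5):
    """Blend slowly toward the raw estimate for small frame-to-frame
    movements (likely noise) and snap to it for large movements
    (likely real motion). Parameter values are illustrative only."""
    delta = abs(raw - prev_smoothed)
    if delta < threshold:
        # Small movement: impose latency by blending only part way.
        return prev_smoothed + latency * (raw - prev_smoothed)
    # Large movement: follow the new estimate with little or no latency.
    return raw
```

A jitter of 0.04 units is halved by the blend, while a 0.2-unit jump passes through unchanged, which is the hysteresis behavior the abstract describes.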
20130120245 | User Interface Devices - A method and apparatus of user interface (“UI”) having multiple motion dots capable of detecting user inputs are disclosed. In one embodiment, a digital processing system includes a first motion dot, a second motion dot, and a device. The first motion dot can be attached to a first location of a user's body and the second motion dot may be attached to the second location of the user's body. The first motion dot, for example, includes accelerometers able to identify a physical location of the first motion dot and the second motion dot also includes accelerometers capable of detecting an input generated based on relative physical position between the first motion dot and the second motion dot. The device, which is logically coupled to the second motion dot via a wireless connection, is configured to store the input in a local storage. | 05-16-2013 |
20130120246 | METHOD AND APPARATUS FOR USING BIOPOTENTIALS FOR SIMULTANEOUS MULTIPLE CONTROL FUNCTIONS IN COMPUTER SYSTEMS - A biosignal-computer-interface apparatus and method. The apparatus includes one or more devices for generating biosignals based on at least one physiological parameter of an individual, and a computer-interface device capable of performing multiple tasks, including converting the biosignals into at least one input signal, establishing a scale encompassing different levels of the input signal, multiplying the input signal into parallel control channels, dividing the scale into multiple zones for each of the parallel control channels, assigning computer commands to each individual zone of the multiple zones, and generating the computer command assigned to at least one of the individual zones if the level of the input signal is within the at least one individual zone. The individual zones can be the same or different among the parallel control channels. | 05-16-2013 |
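The scale-and-zones scheme in the abstract above — establish a scale over the input signal's levels, divide it into zones per control channel, and assign a command to each zone — can be illustrated with a small sketch. The command names, equal-width zones, and function signature are all assumptions for illustration, not details from the patent.

```python
def signal_to_command(level, scale_min, scale_max, commands):
    """Divide the scale [scale_min, scale_max) into len(commands) equal
    zones and return the command assigned to the zone containing level.

    Command names and equal zone widths are illustrative placeholders.
    """
    if not scale_min <= level < scale_max:
        return None  # level falls outside the established scale
    zone_width = (scale_max - scale_min) / len(commands)
    zone = int((level - scale_min) / zone_width)
    return commands[zone]
```

Per the abstract, the input signal would be multiplied into parallel control channels, each of which could carry its own zone-to-command mapping like the one above.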
20130120247 | THREE DIMENSIONAL DISPLAY DEVICE AND THREE DIMENSIONAL DISPLAY METHOD - Provided is a three dimensional display device that includes: three dimensional image display unit ( | 05-16-2013 |
20130127703 | Methods and Apparatus for Modifying Typographic Attributes - Methods and apparatus for various embodiments of a text adjustment tool provide direct modification of typographic attributes of displayed text by translating indications of movement across a display, received via user input, into typographic attribute modifications. The text adjustment tool allows a user an intuitive method for modifying typographic attributes of displayed text without a user needing to access any menu windows or without even knowing which typographic attribute is being modified. The intuitive nature of the text adjustment tool is reflected by the fact that at no point during a modification of a typographic attribute of targeted text do a user's eyes ever need to leave the text being modified. | 05-23-2013 |
20130127704 | SPATIAL TOUCH APPARATUS USING SINGLE INFRARED CAMERA - The present disclosure relates to a spatial touch apparatus using a single infrared camera. More particularly, it relates to a spatial touch apparatus to implement a virtual touch screen in a free space by using an infrared light emitting diode (LED) array and a single infrared camera and to calculate X-axis and Z-axis coordinates of the infrared screen touched by a user indication object. Therefore, the present invention will provide tangible and interactive user interfaces to users and can implement more various user interfaces (UI) than a conventional 2D touch apparatus. | 05-23-2013 |
20130127705 | APPARATUS FOR TOUCHING PROJECTION OF 3D IMAGES ON INFRARED SCREEN USING SINGLE-INFRARED CAMERA - The present invention relates to an apparatus for touching a projection of a 3D image on an infrared screen using a single-infrared camera, and more specifically to an apparatus for touching a projection of a 3D image, which projects an image in a free space; recognizes a position touched by a user on the projected image and thus can process an order from a user on the basis of the recognized touched position. The present invention can provide tangible and interactive user interfaces to users. In particular, it is possible to implement various UIs (User Interface), in comparison to an apparatus for touching a projection of a 2D image of the related art, by using the Z-axial coordinate on the infrared screen as the information on depth. | 05-23-2013 |
20130127706 | METHOD FOR UNLOCKING SCREEN - A method for unlocking a screen, operable in an electronic apparatus having a screen, is provided. The method includes, in a state in which the display function of the screen is turned off, determining whether a trigger instruction received by the electronic apparatus corresponds to a trigger condition. If the trigger instruction corresponds to the trigger condition, still in the state in which the display function of the screen is turned off, it is determined whether an input instruction received by the electronic apparatus corresponds to an unlocking condition. If the input instruction corresponds to the unlocking condition, the screen is controlled to turn on its display function. | 05-23-2013 |
20130127707 | Method, an Apparatus and a Computer Program for Controlling an Output from a Display of an Apparatus - A method including: displaying information corresponding to a first output state, temporarily displaying information corresponding to a second output state while a user actuation is occurring; and displaying information corresponding to the first output state when the user actuation is no longer occurring. | 05-23-2013 |
20130127708 | CELL-PHONE BASED WIRELESS AND MOBILE BRAIN-MACHINE INTERFACE - Techniques and systems are disclosed for implementing a brain-computer interface. In one aspect, a system for implementing a brain-computer interface includes a stimulator to provide at least one stimulus to a user to elicit at least one electroencephalogram (EEG) signal from the user. An EEG acquisition unit is in communication with the user to receive and record the at least one EEG signal elicited from the user. Additionally, a data processing unit is in wireless communication with the EEG acquisition unit to receive and process the recorded at least one EEG signal to perform at least one of: sending a feedback signal to the user, or executing an operation on the data processing unit. | 05-23-2013 |
20130135188 | GESTURE-RESPONSIVE USER INTERFACE FOR AN ELECTRONIC DEVICE - This disclosure provides systems, methods and apparatus, including computer programs encoded on computer storage media, for providing a gesture-responsive user interface for an electronic device. In one aspect, an apparatus or electronic device has an interactive display that provides an input/output (I/O) interface to a user of the apparatus. The apparatus includes a processor, a light emitting source and at least two light sensors. A secondary optical lens structures emitted light from the light emitting source into at least one lobe. Each light sensor outputs, to the processor, a signal representative of a characteristic of received light, where the received light results from scattering of the structured emitted light by an object. The processor effectuates the I/O interface by recognizing, from the output of the light sensors, an instance of a user gesture, and to control the interactive display and/or the apparatus in response to the user gesture. | 05-30-2013 |
20130135189 | GESTURE-RESPONSIVE USER INTERFACE FOR AN ELECTRONIC DEVICE HAVING A COLOR CODED 3D SPACE - This disclosure provides systems, methods and apparatus, including computer programs encoded on computer storage media, for three dimensional position determination of an object. In one aspect, a first electromagnetic (EM) radiation and a second EM radiation is emitted, toward a position-sensing volume, the first and second EM radiation each having a respective, different, wavelength. Scattered radiation, resulting from interaction of the emitted first and second EM radiation with an object located in the position-sensing volume, is detected. Characteristics of the detected scattered radiation have a correlation with a position of the object in the position-sensing volume. A three dimensional position of the object is determined, from the correlation. The position-sensing volume may be proximate to, and extend above, an external surface of a display screen. | 05-30-2013 |
20130135190 | ELECTRONIC DEVICE, STORAGE MEDIUM, AND METHOD FOR EXECUTING MULTIPLE FUNCTIONS OF PHYSICAL BUTTON OF THE ELECTRONIC DEVICE - In a method for executing multiple functions of a physical button of an electronic device, multiple functions of the physical button are predefined, and a relationship between each of the multiple functions and each placement state of the electronic device is predefined. One of the placement states of the electronic device is detected when the physical button is pressed. A function of the physical button corresponding to the detected placement state of the electronic device is determined according to the predefined relationship. The electronic device is controlled to execute the determined function of the physical button. | 05-30-2013 |
20130135191 | INPUT DISPLAY APPARATUS, TANGIBLE COMPUTER-READABLE RECORDING MEDIUM AND INPUT DISPLAY METHOD - Disclosed is an input display apparatus including: a handwriting input unit to receive a handwriting input; a particle migration type of display unit to enable display contents to be partially rewritten; and a control unit to control a display operation of the display unit, for displaying each stroke which is input via the handwriting input unit; wherein the control unit controls the display unit so as to display a currently input stroke which is currently input via the handwriting input unit, in a simple display in which a delay required to display the currently input stroke is short as compared with a normal display. | 05-30-2013 |
20130135192 | GESTURE RECOGNITION APPARATUS, METHOD THEREOF AND PROGRAM THEREFOR - A gesture recognition apparatus detects a locus of a fingertip position of a user from an acquired moving image; sets an effective range for detecting the locus of a flap action of the user's fingertip from the moving image; determines whether or not the locus of the fingertip position is of the flap action when the locus of the fingertip position is included in the effective range; and recognizes a gesture of the user from the flap action when the locus of the fingertip position is of the flap action. | 05-30-2013 |
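The two-stage check in the abstract above — first confirm the fingertip locus lies inside an effective range, then decide whether the locus is a flap action — can be sketched as follows. The flap criterion used here (a predominantly horizontal sweep) is purely an assumption for illustration, as are the rectangular range and all names; the patent does not specify the actual determination.

```python
def recognize_flap(trajectory, effective_range):
    """Two-stage gesture check on a fingertip trajectory.

    trajectory: list of (x, y) fingertip positions over time.
    effective_range: (x_min, y_min, x_max, y_max) rectangle; a
    hypothetical stand-in for the patent's effective range.
    """
    x_min, y_min, x_max, y_max = effective_range
    # Stage 1: the whole locus must fall inside the effective range.
    if not all(x_min <= x <= x_max and y_min <= y <= y_max
               for x, y in trajectory):
        return False
    # Stage 2: an assumed flap criterion -- horizontal movement dominates.
    dx = trajectory[-1][0] - trajectory[0][0]
    dy = trajectory[-1][1] - trajectory[0][1]
    return abs(dx) > 2 * abs(dy)
```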
20130135193 | ELECTROMECHANICAL SYSTEMS DISPLAY APPARATUS INCORPORATING CHARGE DISSIPATION SURFACES - This disclosure provides systems, methods and apparatus for dissipating charge buildup within a display element with a conductive layer. The conductive layer is maintained in electrical contact with a fluid within the display element. The fluid, in turn, remains in contact with light modulators within the display elements. Any charge buildup that may be caused by the filling of the fluid during fabrication of the display device, or during operation of the light modulators can be dissipated by the conductive layer. Thus, by dissipating the charge buildup, the conductive layer reduces or eliminates electrostatic forces due to the charge buildup that may affect the operability of the light modulators. The display can include conductive spacers in an active display region of the display and a spacer-free region that allows the substrates to deform while retaining an electrical connection between the conductive layer and the spacers in the active display region. | 05-30-2013 |
20130135194 | METHODS FOR CONTROLLING AN ELECTRIC DEVICE USING A CONTROL APPARATUS - An electrical switch apparatus including a movement sensitive form is disclosed. The apparatus includes a housing, a motion sensor and a processing unit, where motion on, near or about the motion sensor is translated into output commands adapted for list scrolling, where the list can be arranged in a hierarchy such as menus or for changing a value of an attribute of an electrical device under the control of the switch. | 05-30-2013 |
20130135195 | SYSTEMS FOR CONTROLLING AN ELECTRIC DEVICE USING A CONTROL APPARATUS - An electrical switch apparatus including a movement sensitive form is disclosed. The apparatus includes a housing, a motion sensor and a processing unit, where motion on, near or about the motion sensor is translated into output commands adapted for list scrolling, where the list can be arranged in a hierarchy such as menus or for changing a value of an attribute of an electrical device under the control of the switch. | 05-30-2013 |
20130135196 | METHOD FOR OPERATING USER FUNCTIONS BASED ON EYE TRACKING AND MOBILE DEVICE ADAPTED THERETO - An eye tracking based user function controlling method and a mobile device adapted thereto are provided. A camera unit of a mobile device is activated while a specific user function is executed. A gaze angle of a user's eye is acquired from an image obtained via the camera unit. An eye tracking function is executed whose execution state is controlled according to the gaze angle. | 05-30-2013 |
20130135197 | TRANSMISSIVE DISPLAY APPARATUS AND OPERATION INPUT METHOD - A transmissive display apparatus includes an operation section that detects an operational input issued through an operation surface, an image output section for the left eye and an image output section for the right eye that output predetermined image light, an image pickup section that visually presents the predetermined image light in an image pickup area that transmits externally incident light, a sensor that outputs a signal according to a positional relationship between the operation surface and the image pickup area, and a determination section that detects overlap between an optical image of the operation surface that has passed through the image pickup area and the predetermined image in the image pickup area based on the signal according to the positional relationship and receives the operational input when the optical image of the operation surface overlaps with the predetermined image. | 05-30-2013 |
20130135198 | Electronic Devices With Gaze Detection Capabilities - An electronic device may have gaze detection capabilities that allow the device to detect when a user is looking at the device. The electronic device may implement a power management scheme using the results of gaze detection operations. When the device detects that the user has looked away from the device, the device may dim a display screen and may perform other suitable actions. The device may pause a video playback operation when the device detects that the user has looked away from the device. The device may resume the video playback operation when the device detects that the user is looking towards the device. Gaze detector circuitry may be powered down when sensor data indicates that gaze detection readings will not be reliable or are not needed. | 05-30-2013 |
20130135199 | SYSTEM AND METHOD FOR USER INTERACTION WITH PROJECTED CONTENT - A system and method for user interaction with projected content are provided. A computer generated image is projected onto a surface, the computer generated image comprising at least one symbol. The projected computer generated image is imaged to obtain a sensor image. The location of the symbol within the sensor image is detected and based on the location of the symbol in the sensor image a device may be controlled. | 05-30-2013 |
20130135200 | Electronic Device and Method for Controlling Same - Provided is an electronic device whereby input characters can be used in a desired application by a simple operation, and a method for controlling the electronic device. While on screen, if a leftward shaking operation is detected again by a motion sensor, a control unit displays, instead of the screen of a notepad application, a browser application screen of a browser application different from the notepad application. While on screen, if a detecting unit detects the operation of a user's finger being lifted from the surface of a display unit, the control unit inputs the input text “BOU-SUI MOBILE”, displayed on the display unit, in a search box on the browser application and launches the browser application. | 05-30-2013 |
20130141324 | USER INTERFACE CONTROL BASED ON HEAD ORIENTATION - Embodiments distinguish among user interface elements based on head orientation. Coordinates representing a set of at least three reference points in an image of a subject gazing on the user interface elements are received by a computing device. The set includes a first reference point and a second reference point located on opposite sides of a third reference point. A first distance between the first reference point and the third reference point is determined. A second distance between the second reference point and the third reference point is determined. The computing device compares the first distance to the second distance to calculate a head orientation value. The computing device selects at least one of the user interface elements based on the head orientation value. In some embodiments, the head orientation value enables the user to navigate a user interface menu or control a character in a game. | 06-06-2013 |
20130141325 | PORTABLE DEVICE PAIRING WITH A TRACKING SYSTEM - In embodiments of portable device pairing with a tracking system, a pairing system includes a portable device that generates device acceleration gesture data responsive to a series of motion gestures of the portable device. The pairing system also includes a tracking system that is configured for pairing with the portable device. The tracking system recognizes the series of motion gestures of the portable device and generates tracked object position gesture data. A pairing service can then determine that the series of motion gestures of the portable device corresponds to the series of motion gestures recognized by the tracking system, and communicate a pairing match notification to both the tracking system and the portable device to establish the pairing. | 06-06-2013 |
20130141326 | GESTURE DETECTING METHOD, GESTURE DETECTING SYSTEM AND COMPUTER READABLE STORAGE MEDIUM - A gesture detecting method includes steps of defining an initial reference point in a screen of an electronic device; dividing the screen into N areas radially according to the initial reference point; when a gesture corresponding object moves in the screen and a trajectory of the gesture corresponding object crosses M of the N areas, selecting a sample point from each of the M areas so as to obtain M sample points; and calculating a center and a radius of the trajectory of the gesture corresponding object according to P of the M sample points so as to determine a circular or curved trajectory input. Accordingly, the invention is capable of providing a center, a radius, a direction and an arc angle corresponding to a gesture in real-time without establishing a gesture model. | 06-06-2013 |
20130141327 | GESTURE INPUT METHOD AND SYSTEM - A gesture input method is provided. The method is used in a gesture input system to control a content of a display. The method includes: capturing, by a first image capturing device, a hand of a user and generating a first grayscale image; capturing, by a second image capturing device, the hand of the user and generating a second grayscale image; detecting, by an object detection unit, the first and second grayscale images to obtain a first imaging position and a second imaging position corresponding to the first and second grayscale images, respectively; calculating, by a triangulation unit, a three-dimensional space coordinate of the hand according to the first imaging position and the second imaging position; recording, by a memory unit, a motion track of the hand formed by the three-dimensional space coordinate; and recognizing, by a gesture determining unit, the motion track and generating a gesture command. | 06-06-2013 |
20130141328 | Dynamic Interpretation of User Input in a Portable Electronic Device - The embodiments describe both interpreting an input event to an electronic device having limited user input resources and modifying the interpretation of that input event. The input event interpretation can be based in part on a connection state of the device. In some cases, the interpretation of the input event can also be based upon an indication of a current operating state of the device in addition to or exclusive of the connection state. Furthermore, in some embodiments, an operating state of the portable electronic device can be resolved based in part on the connection state of the portable electronic device. | 06-06-2013 |
20130141329 | Exchanging Information Between Devices In A Medical Environment - A medical device including an operation determiner for determining operations to be performed by the medical device in response to a gesture of the device with respect to the medical device, and an operation data accessor for accessing operation data for the operation performed by the medical device. | 06-06-2013 |
20130141330 | STEREOSCOPIC DISPLAY DEVICE - An object of the present invention is to provide a novel stereoscopic display device that allows the viewer to properly view a stereoscopic image. The device includes: a display configured to display an image for stereoscopic viewing; an imaging unit configured to image a face of a viewer; a position information acquisition unit configured to acquire position information regarding the face imaged by the imaging unit; an operation unit configured to be operated by the viewer when the viewer is in an optimal position from where the image for stereoscopic viewing displayed on the display can be properly viewed as a stereoscopic image; an optimal position information storage unit configured to store position information provided when the operation unit is operated as position information on the optimal position; and a positional relationship notification unit configured to notify the viewer of the positional relationship between the current position of the viewer and the optimal position. | 06-06-2013 |
20130147701 | METHODS AND DEVICES FOR IDENTIFYING A GESTURE - Methods and mobile electronic devices for identifying a gesture on a mobile electronic device are provided. In one embodiment, the method includes: obtaining camera data of a subject from a camera on the mobile electronic device; obtaining device movement data from a sensor on the mobile electronic device, the device movement data representing physical movement of the mobile electronic device; and based on the camera data and the device movement data, interpreting movement of the subject as a predetermined input command associated with a predetermined gesture when movement of the subject captured by the camera corresponds to the predetermined gesture. | 06-13-2013 |
20130147702 | Method, Apparatus, Computer Program and User Interface - A method, apparatus, computer program and user interface wherein the method comprises: detecting a user input at a first apparatus; determining that the user input was also detectable by a second apparatus; and causing a function to be performed where at least part of the function is performed by the first apparatus and at least part of the function is performed by the second apparatus. | 06-13-2013 |
20130147703 | DISPLAY DEVICE WITH OPTION INTERACTION - A display device with option interaction includes a display device and is characterized in that the display device is provided with an interaction device that is connected to a signal source. When an image is displayed on the display device to proceed with option interaction, the interaction device receives an option signal of the display device and provides backward transmission so as to provide the display device with a mechanism for interacting with a consultative program, making it easy to proceed with interaction for questionnaire, sampling, and voting, thereby allowing the user of the display device to participate in opinion expression for public affairs and improving efficiency, fairness, and impartiality in sampling society opinions. | 06-13-2013 |
20130147704 | ELECTRONIC DEVICE PROVIDING SHAKE GESTURE IDENTIFICATION AND METHOD - A method to identify a shake gesture using an electronic device. The electronic device reads reference acceleration values from an accelerometer of the electronic device when a user shakes the electronic device to establish the shake gesture. The electronic device generates the shake gesture by a sorting algorithm according to the reference acceleration values and associates the shake gesture with an application. The electronic device starts the application corresponding to the shake gesture. | 06-13-2013 |
20130147705 | DISPLAY APPARATUS AND CONTROL METHOD THEREOF - A display apparatus and control method are provided. The display apparatus includes a storage unit which is provided to store biometric information of an individual user and metadata corresponding to the biometric information; a biometric information input unit through which biometric information of a plurality of users is input; a display unit; and a controller which determines metadata corresponding to a first user by comparing the input biometric information of the first user with the stored biometric information, determines metadata corresponding to a second user by comparing the input biometric information of the second user with the stored biometric information, and generates information of a user group comprising the first and second users based on common metadata between the metadata of the first user and the metadata of the second user. | 06-13-2013 |
20130147706 | MOBILE TERMINAL AND CONTROL METHOD THEREOF - A control method of a mobile terminal is disclosed. The control method of a mobile terminal includes: acquiring a pressure signal through a pressure sensing module for sensing a change in pressure applied to at least one part of the body in at least two degrees; and generating an event for changing a display state of a display unit through a control signal to be matched to the pressure signal. Emotional quality can be improved by changing a display state of a display unit in response to a change in pressure applied to the body. | 06-13-2013 |
20130154913 | SYSTEMS AND METHODS FOR A GAZE AND GESTURE INTERFACE - A system and methods for activating and interacting by a user with at least a 3D object displayed on a 3D computer display by at least the user's gestures which may be combined with a user's gaze at the 3D computer display. In a first instance the 3D object is a 3D CAD object. In a second instance the 3D object is a radial menu. A user's gaze is captured by a head frame containing at least an endo camera and an exo camera worn by a user. A user's gesture is captured by a camera and is recognized from a plurality of gestures. User's gestures are captured by a sensor and are calibrated to the 3D computer display. | 06-20-2013 |
20130154914 | Casing - A casing, method and apparatus wherein the casing includes at least one user deformable portion; at least one sensor configured to detect deformation of the at least one user deformable portion; wherein the casing is configured to be removably coupled to an apparatus and is configured so that, in response to detecting the deformation of the user deformable portion of the casing, a control signal for control of the apparatus is provided. | 06-20-2013 |
20130154915 | METHODS, APPARATUSES, AND COMPUTER PROGRAM PRODUCTS FOR ENABLING USE OF REMOTE DEVICES WITH PRE-DEFINED GESTURES - Methods, apparatuses, and computer program products are herein provided for enabling use of external devices with pre-defined gestures. A method may include receiving operation information from a remote device, wherein the operation information indicates at least one operation that may be invoked by the remote device. The method may further include associating, by a processor, at least one pre-defined gesture with the at least one operation. The method may further include receiving user input. The method may further include determining that the user input corresponds to the at least one pre-defined gesture. The method may further include causing transmission of indication information to the remote device, wherein the indication information provides an indication to the remote device to perform the at least one operation associated with the pre-defined gesture. Corresponding apparatuses and computer program products are also provided. | 06-20-2013 |
20130154916 | METHOD AND SYSTEM FOR PROVIDING CENTRALIZED NOTIFICATIONS TO AN ADMINISTRATOR - Embodiments of the present disclosure provide a user interface that enables an administrator to monitor the status of one or more long-running processes executing on a system. According to one or more embodiments, information about the long-running processes is received, analyzed and converted into a single format. This information is then stored in a storage device in the single format. In response to a command request periodically received from a user interface, summary information about the one or more long-running processes is provided to, and displayed on, the user interface. Upon receipt of a user selection of at least a portion of the summary information, the user interface issues a second command request that is similar to the first command request, but includes additional parameters, to retrieve additional information about the selected summary information. Once the additional information is received, the additional information is presented on the user interface. | 06-20-2013 |
20130154917 | PAIRING A COMPUTING DEVICE TO A USER - A method for automatically pairing an input device to a user is provided herein. According to one embodiment, the method includes receiving an input from an unpaired input device within an observed scene, and calculating a position of the unpaired input device upon receiving the input. The method further includes detecting one or more users within the observed scene via a capture device, creating a candidate list of the one or more detected users determined to be within a vicinity of the unpaired input device, and assigning one detected user on the candidate list to the unpaired input device to initiate pairing. | 06-20-2013 |
20130154918 | ENHANCED USER EYE GAZE ESTIMATION - Systems, methods, and computer media for estimating user eye gaze are provided. A plurality of images of a user's eye are acquired. At least one image of at least part of the user's field of view is acquired. At least one gaze target area in the user's field of view is determined based on the plurality of images of the user's eye. An enhanced user eye gaze is then estimated by narrowing a database of eye information and corresponding known gaze lines to a subset of the eye information having gaze lines corresponding to a gaze target area. User eye information derived from the images of the user's eye is then compared with the narrowed subset of the eye information, and an enhanced estimated user eye gaze is identified as the known gaze line of a matching eye image. | 06-20-2013 |
20130154919 | USER CONTROL GESTURE DETECTION - The description relates to user control gestures. One example allows a speaker and a microphone to perform a first functionality. The example simultaneously utilizes the speaker and the microphone to perform a second functionality. The second functionality comprises capturing sound signals that originated from the speaker with the microphone and detecting Doppler shift in the sound signals. It correlates the Doppler shift with a user control gesture performed proximate to the computer and maps the user control gesture to a control function. | 06-20-2013 |
20130154920 | GUITAR INPUT AND OUTPUT DOCK FOR A TABLET COMPUTER - A guitar input and output dock for a tablet computer is disclosed. The guitar input and output dock includes a dock portion having a cavity configured and arranged to receive a tablet computer therein. The dock portion includes a guitar input, an audio output, and a foot pedal controller input. A foot pedal controller having a foot pedal is electrically connected to the foot pedal controller input. An electrical circuit in the dock portion is configured and arranged to transmit audio input from the guitar input and pedal press information from foot pedal controller to the tablet computer for audio processing and transmit the processed audio from the tablet computer to the audio output. | 06-20-2013 |
20130154921 | NAVIGATION DEVICE - An optical navigation device, including a radiation source capable of producing a beam of radiation; a sensor for receiving an image; and an optical element for identifying movement of an object on a first surface to thereby enable a control action to be carried out. The optical element is such that a whole of the imaged area of the first surface is substantially covered by the object in normal use. The device is operable to receive from the object on the first surface an input describing a pattern, to compare the received pattern to a stored reference pattern and to perform a predetermined function if the received pattern and stored reference pattern are substantially similar. The pattern may be a continuous line, the device being operable to store the continuous line as a set of turning points in chronological order. | 06-20-2013 |
20130154922 | CUSTOMIZABLE AND RECONFIGURABLE VIRTUAL INSTRUMENT PANEL - The invention provides an instrument control panel that is easily customized and reconfigured, and yet provides the familiar tactile sensation of physical knobs, sliders, and buttons. The instrument control panel comprises one or more interface components that are removably coupled to an interface display wherein the interface components communicate with one or more control components disposed behind the interface display. The present invention lends itself particularly well to an instrument panel. | 06-20-2013 |
20130162513 | USER GESTURE RECOGNITION - An apparatus comprising: at least one processor; and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus at least to perform: determining at least one parameter dependent upon a user gesture wherein the parameter is rotation invariant, having the same value when determined at different arbitrary orientations between the apparatus and the user gesture; and using the at least one parameter to determine an action in response to the user gesture. | 06-27-2013 |
20130162514 | GESTURE MODE SELECTION - An apparatus, system, and method are disclosed for gesture mode selection. An apparatus for gesture mode selection includes a detection module, a gesture mode module, and a gesture recognition module. The detection module detects a triggering event. The gesture mode module sets a gesture mode from an idle mode to an enhanced mode based on the detection of the triggering event. The gesture recognition module processes data from a non-contact input device to detect gestures according to the gesture mode set by the gesture mode module. | 06-27-2013 |
20130162515 | METHOD FOR CHANGING DEVICE MODES OF AN ELECTRONIC DEVICE CONNECTED TO A DOCKING STATION AND AN ELECTRONIC DEVICE CONFIGURED FOR SAME - The present disclosure provides a docking station for docking one or multiple portable electronic devices, such as a tablet and a mobile telephone or smartphone. The present disclosure provides a method for changing device modes of an electronic device connected to a docking station, as well as an electronic device and a docking station configured for same. | 06-27-2013 |
20130162516 | APPARATUS AND METHOD FOR PROVIDING TRANSITIONS BETWEEN SCREENS - An apparatus, method, and computer program product are described that determine a destination screen for display and provide for a visual transition between an origin screen and the destination screen based on a position of a user input and a direction of the movement component of the input. The origin screen may, for example, associate certain areas of the screen with certain destination screens, such that an input received in one area invokes one destination screen and an input received in another area invokes another destination screen. The destination screen may also be determined based on the direction of the movement component of the input. Thus, one of several destination screens may be accessible to the user and may be determined based on the characteristics of the input received. | 06-27-2013 |
20130162517 | Gesturing Architecture Using Proximity Sensing - Proximity based system and method for detecting user gestures. Each of a plurality of proximity sensing circuits may collect digital data. Each proximity sensing circuit may include an antenna configured to transmit and receive electromagnetic signals and a shield driver configured to shield signals transmitted by the antenna in one or more directions. The digital data may be collected based on electromagnetic signals received from another proximity sensing circuit via the antenna. The received electromagnetic signals may be modified by one or more user proximity gestures. The digital data from each of the plurality of proximity sensing circuits may be received by a coordinating circuit. The coordinating circuit may produce coordinated digital data from the digital data received from each of the plurality of proximity sensing circuits. The coordinated digital data may be configured for use in determining that a user performed the one or more user proximity gestures. | 06-27-2013 |
20130162518 | Interactive Video System - An interactive video system includes an image capturing device, for example a video camera, for capturing user motion, and a graphical display which is arranged to be altered in response to detection of user motion as captured by the image capturing device. A user interface is arranged to display a visual representation of the motion detected by the system to assist in calibrating the system in relation to a surrounding environment. | 06-27-2013 |
20130162519 | CROSS-PLATFORM HUMAN INPUT CUSTOMIZATION - An input handler may receive first human input events from at least one human input device and from at least one user, associate the first human input events with a first identifier, receive second human input events from the at least one human input device from the at least one user, and associate the second human input events with a second identifier. A command instructor may relate the first human input events and the second human input events to commands of at least one application, and instruct the at least one application to execute the commands, including correlating each executed command with the first identifier or the second identifier. | 06-27-2013 |
20130162520 | GESTURE DETECTION AND COMPACT REPRESENTATION THEREOF - Techniques are described that may be implemented with an electronic device to detect a gesture within a field of view of a sensor and generate a compact data representation of the detected gesture. In implementations, a sensor is configured to detect a gesture and provide a signal in response thereto. An estimator, which is in communication with the sensor, is configured to generate an elliptical representation of the gesture. Multiple coefficients for the compact representation of the gesture can be used to define the ellipse representing the gesture. | 06-27-2013 |
20130162521 | DEVICE AND METHOD FOR USER INTERACTION - Disclosed are a device and a method for user interaction with a combined projector and camera, which recognize an actual object to augment relevant information on a surface or a periphery of the actual object. The device for user interaction includes: at least one projector-camera pair in which a projector and a camera are paired; a motor mounted in the projector-camera pair and configured to control a location and a direction of the projector-camera pair; and a body including a computer capable of wireless networking and configured to provide connection with an external device, wherein a projection space and a photographing space of the projector-camera pair overlap each other. | 06-27-2013 |
20130162522 | METHOD AND APPARATUS FOR CONTROLLING POWER OF DISPLAY DEVICE USING FRONTAL FACE DETECTOR - Disclosed are a method and an apparatus for controlling power of a display device using a frontal face detecting device. The method for controlling power of a display device includes: determining whether a face of a user is present in front of the display device; confirming a power state of the display device; and repeatedly performing the determining of whether the face of a user is present according to determined presence of the face and the confirmed power state to control power of the display device. The method intelligently recognizes a use purpose of a user from the display device, thereby managing power of the display device, and controlling power of the display device without requiring a separate action by a user for managing device power. | 06-27-2013 |
20130162523 | SHARED WIRELESS COMPUTER USER INTERFACE - A system and method for use in a point-to-point enabled device includes establishing a wireless point-to-point connection with a remote device, transmitting a request message to the remote device requesting information to clone a user interface of the remote device at the point-to-point enabled device, receiving at the point-to-point enabled device the information to clone the user interface of the remote device, displaying on a display associated with the point-to-point enabled device a cloned image of the user interface of the remote device, receiving, at the point-to-point enabled device, user-entered input data associated with an application running on the remote device, and transmitting the user-entered input data from the point-to-point enabled device to the remote device via the wireless point-to-point connection. The transmitted user-entered input data is usable by the remote device as if that data was received at the remote device from a user of the remote device. | 06-27-2013 |
20130162524 | ELECTRONIC DEVICE AND METHOD FOR OFFERING SERVICES ACCORDING TO USER FACIAL EXPRESSIONS - A method for offering services according to facial expressions is provided. The method is used in an electronic device storing a service database recording at least one user's information. The method activates an offering service function; captures facial expressions of the user; extracts the features of the facial expressions; compares the extracted features with the features in images of the facial expressions stored in the service database, so as to identify a corresponding feature stored in the service database, and determines the type of expression and the service corresponding thereto from images of the user stored in the service database; and activates and provides the determined service. An electronic device using the method is also provided. | 06-27-2013 |
20130162525 | METHOD AND APPARATUS FOR PERFORMING MOTION RECOGNITION USING MOTION SENSOR FUSION, AND ASSOCIATED COMPUTER PROGRAM PRODUCT - A method and apparatus for performing motion recognition using motion sensor fusion and an associated computer program product are provided, where the method is applied to an electronic device. The method includes the steps of: utilizing a plurality of motion sensors of the electronic device to obtain sensor data respectively corresponding to the plurality of motion sensors, the sensor data measured at a device coordinate system of the electronic device, wherein the plurality of motion sensors includes inertial motion sensors; and performing sensor fusion according to the sensor data by converting at least one portion of the sensor data and derivatives of the sensor data into converted data based on a global coordinate system of a user of the electronic device, to perform motion recognition based on the global coordinate system, in order to recognize the user's motion. | 06-27-2013 |
20130162526 | APPARATUS AND METHOD FOR REDUCING CURRENT CONSUMPTION IN PORTABLE TERMINAL WITH FLEXIBLE DISPLAY - A portable device with a flexible display and a method for displaying information in the portable device with a flexible display are provided. The portable device includes a sensor for detecting at least one of a user's face and an ambient brightness, the flexible display including a first area and a second area, and a controller for displaying information in the first area when at least one of the user's face is detected and the ambient brightness has a value greater than a threshold, and for displaying the information in the second area when at least one of the user's face is not detected and the ambient brightness has a value less than the threshold. | 06-27-2013 |
20130162527 | INTERACTION WITH PORTABLE DEVICES - A portable electronic device comprises means for generating or receiving a signal indicating that it has been placed in a predetermined position. The portable electronic device is configured upon receiving said signal to enable user interaction with the portable electronic device by a touchless interaction mode. Also disclosed are various arrangements of a peripheral device and a portable electronic device which cooperate to provide a touchless interaction mode—e.g. through transducers and/or processing means being provided in the peripheral device. | 06-27-2013 |
20130169519 | METHODS AND APPARATUS TO PERFORM A DETECTION OPERATION - A method and apparatus determine a difference value, the determined difference value reflecting a difference between a plurality of presence values. In an embodiment, the method and apparatus perform an operation associated with the plurality of presence values, based on the determined difference value. | 07-04-2013 |
20130169520 | BENDING THRESHOLD AND RELEASE FOR A FLEXIBLE DISPLAY DEVICE - A flexible display device and a method for accurately recognizing a user's flex input (bending) of the flexible display device are described. The present invention is able to discard unintentional flexing of the flexible display device while being able to accurately recognize a user's intended flex input command based on a number of bending degree thresholds. A first bending threshold must be overcome in order to initially recognize a user's flex input as a valid flex input command. Then the user's flex input must fall below a second bending threshold in order to cease the recognition of the user's flex input as a valid flex input. | 07-04-2013 |
20130169521 | DISPLAY DEVICE FOR PATTERN RECOGNITION INPUT - A display device includes: a substrate, a plurality of signal lines disposed on the substrate, at least one insulating layer disposed on the substrate, and a plurality of location references disposed on the substrate and in the same layer level as at least one of the signal lines, wherein arrangement of the plurality of location references varies depending on relative locations of the location references on a screen of the display device. | 07-04-2013 |
20130169522 | HARD KEY AND VEHICLE TERMINAL USING THE SAME - Disclosed is a vehicle terminal, which is capable of using hard keys configured to be arranged and modified according to a user's taste. In particular, the spatial arrangement of the hard keys may be changed, as well as their appearance. | 07-04-2013 |
20130169523 | ELECTRONIC DEVICE AND METHOD FOR RELIEVING VISUAL FATIGUE USING THE ELECTRONIC DEVICE - An electronic device includes a visual sensor and a display screen. The visual sensor senses whether a user is looking at the display screen. If the user is looking at the display screen, the electronic device adjusts a font size of a font being displayed on the display screen. If the user looks at the display screen for not less than a first predefined time, the electronic device prompts the user to have a rest and turns off the display screen. After the display screen has been turned off for more than a second predefined time, the display screen is turned on again automatically. | 07-04-2013 |
20130169524 | ELECTRONIC APPARATUS AND METHOD FOR CONTROLLING THE SAME - An electronic apparatus and a control method thereof are provided. In response to receiving a voice start command by the electronic apparatus, the control method of the electronic apparatus changes the mode of the electronic apparatus to a voice task mode in which the electronic apparatus is controlled in accordance with a user's voice, displays on the electronic apparatus voice guide information including a plurality of voice commands for commanding the electronic apparatus to perform tasks in the voice task mode, and replaces at least one voice command from among the plurality of voice commands with a different voice command. Accordingly, the user is able to control the electronic apparatus more conveniently and efficiently. | 07-04-2013 |
20130169525 | ELECTRONIC APPARATUS AND METHOD FOR CONTROLLING THE SAME - An electronic apparatus and a control method thereof are provided, which displays first voice guide information indicating voice commands available to control the electronic apparatus, and if a command to control an external device connected to the electronic apparatus is received, changes the first voice guide information and displays second voice guide information indicating voice commands available to control the external device. | 07-04-2013 |
20130169526 | SYSTEMS AND METHODS FOR MOBILE DEVICE PAIRING - Tools (systems, apparatuses, methodologies, computer program products, etc.) for pairing electronic devices including touchscreen-enabled electronic devices, and for facilitating communication between paired electronic devices. | 07-04-2013 |
20130169527 | INTERACTIVE VIDEO BASED GAMES USING OBJECTS SENSED BY TV CAMERAS - A method and apparatus for interactive TV camera based games in which position or orientation of points on a player or of an object held by a player are determined and used to control a video display. Both single camera and stereo camera pair based embodiments are disclosed, preferably using stereo photogrammetry where multi-degree of freedom information is desired. Large video displays, preferably life-size may be used where utmost realism of the game experience is desired. | 07-04-2013 |
20130169528 | DISPLAY DEVICE - A display device including a display panel to display images, a bezel provided at an outside of the display panel, and a signal input unit including a camera and a microphone, wherein the signal input unit is disposed at an upper end of the bezel, is disclosed. The camera and the microphone are integrated into the display device, thereby improving installation efficiency and external appearance quality of the display device. The microphone is distant from the speaker, thereby improving performance of the microphone. | 07-04-2013 |
20130169529 | ADJUSTING AN OPTICAL GUIDE OF A THREE-DIMENSIONAL DISPLAY TO REDUCE PSEUDO-STEREOSCOPIC EFFECT - A device may include sensors, a display, an optical guide, and one or more processors. The sensors may obtain tracking information associated with a user. The display may include pixels for displaying images. The optical guide may include optical elements, each of the optical elements directing light rays from one or more of the pixels. In addition, the one or more processors may be configured to determine a relative location of the user based on the tracking information obtained by the sensors, obtain values for control variables that are associated with the optical elements based on the relative location of the user, display a stereoscopic image via the display, and control the optical elements based on the values to direct the stereoscopic image to the relative location and to prevent a pseudo-stereoscopic image from forming at the relative location. | 07-04-2013 |
20130176201 | Corner Control - Methods, program products, and systems for corner control are described. Each of the four corners of a rectangular display field can be individually configured to be a rounded corner or an angled corner. In some implementations, a method can include providing a user interface item for display. The user interface item can include four control elements. Each of the control elements can correspond to a corner of a display field. Each of the control elements can individually and independently control a shape of the corresponding corner of the display field. The display field can have one or more corners in rounded shape and one or more corners in angled shape, according to user input received through the user interface item. | 07-11-2013 |
20130176202 | MENU SELECTION USING TANGIBLE INTERACTION WITH MOBILE DEVICES - An electronic device (such as a mobile device) displays on a screen of the device, a live video captured by a camera in the device. While the live video is being displayed, the device checks if a first predetermined condition is satisfied. When the first predetermined condition is satisfied the device displays a menu on the screen. The menu includes multiple menu areas, one of which is to be selected. While the menu is being displayed on the screen, the device checks if a second predetermined condition is satisfied, e.g. by a movement of a predetermined object in real world outside the device. When the second predetermined condition is satisfied, the device displays on the screen at least an indication of a menu area as being selected from among multiple menu areas in the displayed menu. | 07-11-2013 |
20130176203 | DISPLAY APPARATUS AND METHOD OF DISPLAYING THREE-DIMENSIONAL IMAGE USING THE SAME - A display apparatus includes a display panel and a liquid crystal lens. The display panel includes a first pixel displaying an N-th portion of a left image corresponding to an N-th position in the left image and a second pixel displaying an N-th portion of a right image corresponding to the N-th position in the right image. The second pixel is spaced apart from the first pixel with additional pixels disposed between the second and first pixels. The liquid crystal lens includes a unit lens directing the N-th left image to a left eye of an observer and the N-th right image to a right eye of the observer where N is a positive integer. | 07-11-2013 |
20130176204 | ELECTRONIC DEVICE, PORTABLE TERMINAL, COMPUTER PROGRAM PRODUCT, AND DEVICE OPERATION CONTROL METHOD - According to one embodiment, an electronic device includes: a connected device information management module; a connected device information sending module; an operation state sending module; and a connected device operation module. The connected device information management module is configured to manage function information of an external device connected to the electronic device. The connected device information sending module is configured to send the function information of the external device to a portable terminal in response to a transmission request from the portable terminal. The operation state sending module is configured to send an operation state of the electronic device with respect to the external device in response to the transmission request from the portable terminal. The connected device operation module is configured to perform a functional operation of the external device specified by a request sent from the portable terminal. | 07-11-2013 |
20130176205 | ELECTRONIC APPARATUS AND CONTROLLING METHOD FOR ELECTRONIC APPARATUS - According to one embodiment, an electronic apparatus configured to control an external apparatus includes a communicator configured to communicate with the external apparatus, a recognition module configured to recognize a connection apparatus connected to the external apparatus, an operation menu generator configured to generate an operation menu which is used to control the external apparatus or the connection apparatus, and a controller configured to control in such a manner that a divided operation screen including display regions which are used to display the operation menus is displayed in a display and a signal is transmitted to the external apparatus in accordance with an operation in the divided operation screen. | 07-11-2013 |
20130176206 | CAMERA BASED SENSING IN HANDHELD, MOBILE, GAMING, OR OTHER DEVICES - Method and apparatus are disclosed to enable rapid TV camera and computer based sensing in many practical applications, including, but not limited to, handheld devices, cars, and video games. Several unique forms of social video games are disclosed. | 07-11-2013 |
20130176207 | IMPLANTED DEVICES AND RELATED USER INTERFACES - Embodiments of the invention generally relate to electronic devices capable of being implanted beneath the skin of a human user. The electronic devices include input devices for receiving input from a user, and output devices for output signals or information to a user. The electronic devices may optionally include one or more sensors, batteries, memory units, and processors. The electronic devices are protected by a protective packaging to reduce contact with bodily fluids and to mitigate physiological responses to the implanted devices. | 07-11-2013 |
20130176208 | ELECTRONIC EQUIPMENT - A mobile phone which is an example of electronic equipment includes an infrared camera and an infrared LED. The infrared camera is arranged above a display and the infrared LED is arranged below the display. A user, by an eye-controlled input, designates a button image or a predetermined region on a screen. When a line of sight is to be detected, an infrared ray (infrared light) emitted from the infrared LED arranged below the display is irradiated to a lower portion of a pupil. Accordingly, even in a state that the user slightly closes his/her eyelid, the pupil and a reflected light of the infrared light can be imaged. | 07-11-2013 |
20130176209 | INTEGRATION SYSTEMS AND METHODS FOR VEHICLES AND OTHER ENVIRONMENTS - Described herein are integration systems and methods for providing a cohesive user experience within an environment of a plurality of products. An exemplary integration system may comprise a gateway for bidirectional communication with the products. The integration system may also comprise a central or remote database defining the user interface design for a plurality of products, and a database that provides new features for a plurality of extensible products. An input to an interaction point on a first of the products may be transmitted to the gateway, which may interpret the input and transmit output instructions to the first product or to another product in the environment. Audio, visual, and tactile inputs and outputs may be provided through the gateway. In some embodiments, the database may provide a user interface skin or theme to the products, so that a standard look and feel is provided across the plurality of products. | 07-11-2013 |
20130176210 | IMAGE VALIDATION SYSTEM FOR REMOTE DISPLAYS - Systems, methods, and devices are disclosed that provide remote video feedback. The disclosure herein includes a transmitting computing device that is configured to produce an image frame encoded with metadata and to transmit the encoded image frame over a network to a remote display device. The remote display device includes a viewing screen contained in a chassis with a bezel. The viewing screen includes an image validation pixel with the chassis extending over and covering the image validation pixel. The chassis also includes a photo sensor positioned on the underside of the bezel so as to be in operational view of the image validation pixel. The system also includes a sensor electronics module that is configured to detect the metadata when the metadata is displayed by the image validation pixel and is received by the photo sensor. The sensor electronics module also generates image frame feedback information based upon the detecting. A remote computing device is included that is configured to receive the encoded image frame from the transmitting computer, render the encoded image frame upon the viewing screen, and re-transmit the image frame feedback information to the transmitting computer from the sensor electronics module. | 07-11-2013 |
20130176211 | IMAGE PROCESSING DEVICE FOR DISPLAYING MOVING IMAGE AND IMAGE PROCESSING METHOD THEREOF - Data of a moving image has a hierarchical structure comprising a 0-th layer, a first layer, a second layer, and a third layer in a z axis direction. Each layer is composed of moving image data of a single moving image expressed in different resolutions. Both the coordinates of a viewpoint at the time of the display of a moving image and a corresponding display area are determined in a virtual three-dimensional space formed by an x axis representing the horizontal direction of the image, a y axis representing the vertical direction of the image, and a z axis representing the resolution. By providing a switching boundary for layers with respect to the z axis, the layers of the moving image data used for frame rendering are switched in accordance with the value of z of the frame coordinates. | 07-11-2013 |
20130181892 | Image Adjusting - An apparatus including a display configured to display an image; and a system for adjusting the image on the display based upon location of a user of the apparatus relative to the apparatus. The system for adjusting includes a camera and an orientation sensor. The system for adjusting is configured to use signals from both the camera and the sensor to determine the location of the user relative to the display. | 07-18-2013 |
20130181893 | ELECTROSTATICALLY TRANSDUCED SENSORS COMPOSED OF PHOTOCHEMICALLY ETCHED GLASS - This disclosure provides systems, methods and apparatus for glass electromechanical systems (EMS) electrostatic devices. In one aspect, a glass EMS electrostatic device includes sidewall electrodes. Structural components of a glass EMS electrostatic device such as stationary support structures, movable masses, coupling flexures, and sidewall electrode supports, can be formed from a single glass body. The glass body can be photochemically etched. In some implementations, pairs of sidewall electrodes can be arranged in interdigitated comb or parallel plate configurations and can include plated metal layers and narrow capacitive gap spacing. | 07-18-2013 |
20130181894 | DISPLAY DEVICE AND METHOD FOR LARGE SCREEN - A display device includes a display screen; a distance sensing unit, set on an appropriate position of the display screen to sense a relative distance from a user to the display device; and a processing unit comprising a projection position determining module, a stretching area determining module, and a display control module. The projection position determining module calculates a projected position of the user on the display screen according to a position of the distance sensing unit on the display screen, the projected position being the position on the display screen onto which the user is projected. The stretching area determining module determines an area needed to be stretched on the display screen according to the projected position on the display screen and a predetermined distance. The display control module stretches content displayed on the area needed to be stretched according to a predetermined percentage. The display method is also provided. | 07-18-2013 |
20130181895 | AUTOSTEREOSCOPIC THREE-DIMENSIONAL IMAGE DISPLAY DEVICE USING TIME DIVISION - An autostereoscopic 3D image display device using time division is provided. The image display device includes a backlight, an image display panel, a controller, and a viewer position tracking system. The backlight includes a plurality of line sources which are disposed at certain intervals. The image display panel displays a 3D image. The controller controls the backlight and a viewing point image of the image display panel. The viewer position tracking system determines a pupil position of a viewer and transfers position information to the controller. The image display panel provides two or more viewing points. The line sources configure two or more line source sets that are separately driven at the same time. The two or more line source sets are sequentially driven. | 07-18-2013 |
20130181896 | INTEGRATED LIGHT EMITTING AND LIGHT DETECTING DEVICE - Methods and systems for providing a light device that can emit light and sense light are disclosed. In one embodiment, a lighting device includes a light guide having a planar first surface, the light guide configured such that at least some ambient light enters the light guide through the first surface and propagates therein, and at least one light detector disposed along an edge of the light guide, the at least one detector optically coupled to the light guide to receive light propagating therein. The light detector can be configured to produce a control signal. In some embodiments, the lighting device also includes at least one light turning feature disposed on the first surface, the at least one light turning feature configured to direct incident light into the light guide through the first surface. | 07-18-2013 |
20130181897 | OPERATION INPUT APPARATUS, OPERATION INPUT METHOD, AND PROGRAM - A motion input is appropriately identified. A display is disposed in front of an operator, and a motion, performed by the operator within ranges set in predetermined right and left positions between the operator and the display, is identified. In certain circumstances, the shape of a finger of the operator may be handled as an object of an operational determination. | 07-18-2013 |
20130187845 | ADAPTIVE INTERFACE SYSTEM - An adaptive display system includes a user interface having a display for generating a visual output to a user, a sensor for capturing an image of a tracking region of the user and generating a sensor signal representing the captured image, and a processor in communication with the sensor and the display, wherein the processor receives the sensor signal, analyzes the sensor signal based upon an instruction set to determine a displacement of the tracking region of the user, and controls a visual indicator presented on the display based upon the displacement of the tracking region of the user. | 07-25-2013 |
20130187846 | BI-ORTHOGONAL PIXEL INTERPOLATION - The techniques described in this disclosure are directed to interpolating pixel values. In some examples, the techniques interpolate a pixel value for an interpolated center pixel based on pixel values of pixels that reside on diagonal lines that are orthogonal to one another. The techniques may determine first order derivative values and, in some examples, second order derivative values to determine which pixels to utilize to interpolate the pixel values for the interpolated center pixel. The techniques may similarly determine pixel values for non-center interpolated pixels using orthogonal vertical and horizontal lines. | 07-25-2013 |
20130187847 | IN-CAR EYE CONTROL METHOD - An in-car eye control method is provided to monitor a car driver's eye movements, thereby allowing the driver to communicate with people outside the car or control in-car equipment. The in-car eye control device essentially includes an image capturing unit for capturing an image of the driver, a processing unit for performing operations on the image taken, an eye determination unit for monitoring the driver's eye movements and generating an eye control instruction, and a system operation unit for executing a system process according to the eye control instruction. Thus, the in-car eye control method provides a diversity of communication or control solutions. | 07-25-2013 |
20130187848 | ELECTRONIC DEVICE, CONTROL METHOD, AND COMPUTER PRODUCT - An electronic device includes an input control processing unit that culls first input data by generating second input data from the first input data; an output control processing unit that culls output data; and a congestion control unit that instructs the input control processing unit to increase a quantity of the first input data to be culled, when delay in output relative to input is increasing, and instructs the output control processing unit to increase a quantity of the output data to be culled when the delay in output relative to input continues to increase despite the increase in the quantity of the first input data to be culled. | 07-25-2013 |
20130187849 | COORDINATE INFORMATION UPDATING DEVICE - An object can be displayed on a screen of a two-dimensional coordinate system based on xyz-coordinate values of the object in a three-dimensional coordinate system, operation information of a two-dimensional coordinate system with respect to the object can be received from an input device, and whether the operation information is in accordance with a predetermined rule or not is determined. If the operation information is not in accordance with the predetermined rule, xy-coordinate values of the object can be updated in accordance with the operation information. If the operation information is in accordance with the predetermined rule, a z-coordinate value of the object can be updated in accordance with the operation information. | 07-25-2013 |
20130187850 | MOBILE DEVICE DISPLAY CONTENT BASED ON SHAKING THE DEVICE - When a user shakes a mobile device, such as a smart phone, new content is shown on the display of the mobile device. In one embodiment, the content depends on the page or location the user is currently on within the mobile app or mobile browser page and can depend also on the history or experience level of the user. | 07-25-2013 |
20130194171 | Apparatus and Method for Mapping User-Supplied Data Sets to Reference Data Sets in a Variable Data Campaign - In accordance with one aspect of the present disclosure, apparatus are provided that allow for the automatic categorization of the subsets of a user-supplied data set, for example recipient lists for a multivariable distribution campaign. A user interface is disclosed that facilitates the uploading of the user's recipient list. A categorizer is disclosed which categorizes subsets of the user-supplied data. A storage mechanism is disclosed which stores reference categories of data expected by the multi-variable campaign logic. A mapper is disclosed which maps the user supplied data categorized by the categorizer to the reference categories stored by the storage mechanism. | 08-01-2013 |
20130194172 | DISABLING AUTOMATIC DISPLAY SHUTOFF FUNCTION USING FACE DETECTION - Techniques and equipment for disabling an automatic display shutoff function of a mobile device based on face detection are disclosed. The automatic display shutoff function generally disables or deactivates the display after a predetermined period of time, during which there is no input from a user of the mobile device. An image captured using a front facing camera of the mobile device may be used to detect at least a portion of the user's face and thus, override the automatic display shutoff function before expiration of the period of time without detecting any user input and during execution of an application program associated with the display of information for a time longer than the predetermined period of time. | 08-01-2013 |
20130194173 | TOUCH FREE CONTROL OF ELECTRONIC SYSTEMS AND ASSOCIATED METHODS - Various embodiments of electronic systems and associated methods of hands-free operation are described. In one embodiment, a method includes acquiring an image of a user's finger and/or an object associated with the user's finger with a camera, recognizing a gesture of the user's finger or the object based on the acquired image, and determining if the recognized gesture correlates to a command or a mode change for a processor. If the monitored gesture correlates to a command for a processor, the method includes determining if the processor is currently in a standby mode or in a control mode. If the processor is in the control mode, the method includes executing the command for the processor; otherwise, the method includes reverting to monitoring a gesture of the user's finger. | 08-01-2013 |
20130194174 | INPUT UNIT - There is provided an input unit adapted for non-contact input manipulation, which permits a user to smoothly accomplish an intended input manipulation. The input unit includes: a position detecting portion for detecting a position of a manipulating object such as a user's hand manipulating the input unit; a position change detecting portion for detecting a change in the position of a point on the manipulating object based on a detection output from the position detecting portion, the point being the closest to the position detecting portion; and an image display section. The position change detecting portion detects the change in the position of the point closest to the position detecting portion in a predetermined area. The image display section changes the display image according to a detection output from the position change detecting portion. | 08-01-2013 |
20130194175 | MOVEMENT CONTROL DEVICE, CONTROL METHOD FOR A MOVEMENT CONTROL DEVICE, AND NON-TRANSITORY INFORMATION STORAGE MEDIUM - A movement restriction unit restricts a movement of a movement subject within a display screen or a virtual space in a case where a numerical value indicating a magnitude of a displacement between a position of a hand or the like of a user and a reference position or a reference direction is smaller than the first reference value. A first movement control unit moves the movement subject at a first speed in a case where the above-mentioned numerical value is equal to or larger than the first reference value and smaller than a second reference value that is larger than the first reference value. A second movement control unit moves the movement subject at a second speed that is faster than the first speed in a case where the above-mentioned numerical value is equal to or larger than the second reference value. | 08-01-2013 |
20130194176 | METHOD FOR ADJUSTING DISPLAY MANNER OF PORTABLE ELECTRONIC DEVICE - A portable electronic device includes a display screen, a gravity sensor, at least one sensor provided at a side of the portable electronic device and a processor. The gravity sensor detects gravity information of the portable electronic device and generates a sensing value. The at least one sensor senses a holding position. The processor determines a holding manner according to sensing signals of the at least one sensor and determines a display mode of the display screen according to the sensing value and the holding manner. | 08-01-2013 |
20130194177 | PRESENTATION CONTROL DEVICE AND PRESENTATION CONTROL METHOD - A presentation control device casually presents information to the user based on how the user is watching a content item. In presenting the information to the user, the presentation control device: presents a sensory stimulus element having a first stimulation degree; changes a stimulation degree of the sensory stimulus element from the first stimulation degree, based on the magnitude of the reaction determined by the user reaction analyzing unit, and presents the sensory stimulus element; and, in the case where the magnitude of the reaction to the sensory stimulus element is smaller than a predetermined threshold, causes a sensory stimulus control unit to decrease the stimulation degree of the sensory stimulus element or stop presenting the sensory stimulus element. The reaction is observed within a predetermined time period which begins when the sensory stimulus element having the first stimulation degree is presented. | 08-01-2013 |
20130194178 | DISPLAY DEVICE AND DISPLAY METHOD - A display device including a display unit that displays an image; an operation detection unit that detects designated positions on the display unit respectively designated by two or more fingers of an operator; an operation analysis unit that determines, when the operation detection unit detects that the designated positions are moving in parallel in a first region of the display unit, a position of a second region newly generated in the first region and a display direction of an image displayed in the second region, based on the designated positions and a direction of movement of the designated positions; and a display control unit that controls the display unit to display an image in the second region, according to the position of the second region and the display direction of the image displayed in the second region determined by the operation analysis unit. | 08-01-2013 |
20130194179 | PROGRAM, INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING SYSTEM - An information processing device that stores an execution environment of a personal computer when a capture is made, and associates the captured execution environment, a dot pattern, and a screen shot. A control unit of the information processing device executes steps of: storing an image that is displayed on a display unit connected with the information processing device in a storage unit of the information processing device, and, storing an execution environment of the information processing device at a moment of storing the image, in the storage unit; superimposedly forming a dot pattern and the image; acquiring the code value and coordinate values from an optical reading unit that read the dot pattern; and restoring the execution environment corresponding to the code value and coordinate values by identifying the execution environment based on the identification information that is defined in at least part of the acquired code value. | 08-01-2013 |
20130201092 | APPARATUS AND METHOD FOR PROVIDING A VISUAL INDICATION OF AN OPERATION - An apparatus, method, and computer program product are described that provide for a visual indication of an operation to be performed, such that a user, upon applying a first portion of a user input, may be able to determine from the visual indication whether the input applied, when completed, will cause the desired operation to be performed. The apparatus may provide for the display of a screen having a representation of interactive content, and an input may be received from the user that includes a contact component and a movement component, such as a stroke gesture. In response to receipt of the contact component and prior to receipt of the movement component, the apparatus may provide for a visual indication of an operation to be performed. The operation may be executed based on receipt of at least a portion of the movement component. | 08-08-2013 |
20130201093 | PORTABLE DEVICE AND METHOD FOR CONTROLLING THE SAME - A method for controlling a portable device is provided. The method includes detecting bending of the portable device and determining whether to perform motion sensing correction due to the bending; acquiring a motion sensing correction factor for performing the motion sensing correction due to the bending; performing motion sensing correction of at least one motion sensor using the motion sensing correction factor; and controlling the portable device according to the corrected motion sensing. | 08-08-2013 |
20130201094 | VIRTUAL IMAGE DEVICE - This document describes various apparatuses embodying, and techniques for implementing, a virtual image device. The virtual image device includes a projector and a lens configured to generate a virtual image as well as two diffraction gratings, substantially orthogonally-oriented to each other, that act to increase a field-of-view of the virtual image. The virtual image device can be implemented as a pair of eyeglasses and controlled to generate the virtual image in front of lenses of the eyeglasses so that a wearer of the eyeglasses, looking through the lenses of the eyeglasses, sees the virtual image. | 08-08-2013 |
20130201095 | PRESENTATION TECHNIQUES - Techniques involving presentations are described. In one or more implementations, a user interface is output by a computing device that includes a slide of a presentation, the slide having an object that is output for display in three dimensions. Responsive to receipt of one or more inputs by the computing device, how the object in the slide is output for display in the three dimensions is altered. | 08-08-2013 |
20130201096 | DISPLAY METHOD OF AN AUTO-STEREOSCOPIC DISPLAY DEVICE - An auto-stereoscopic display device includes N pixel groups for displaying N visual angle images respectively, and an optical element for guiding the N visual angle images to corresponding N position groups respectively. A user located at any of the first to (N−1)th position groups receives correct visual angle images, while a user located at the Nth position group receives wrong visual angle images. A display method of the auto-stereoscopic display device includes the auto-stereoscopic display device displaying the N visual angle images; the auto-stereoscopic display device determining whether a user is located at the Nth position group; and, when a user is located at the Nth position group, the auto-stereoscopic display device shifting the N position groups so that no user is located at the Nth position group. | 08-08-2013 |
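The shifting step described in the entry above can be sketched as a search over whole-group offsets; the one-dimensional position encoding, `group_width`, and the retry bound are illustrative assumptions, not details from the application:

```python
import math

def shift_groups(user_positions, n, group_width=1.0):
    """Shift the N position groups by whole group widths until no user
    falls into the Nth (wrong-image) group; give up after n attempts."""
    def in_bad_group(shift):
        return any(math.floor((p - shift) / group_width) % n == n - 1
                   for p in user_positions)
    shift = 0.0
    for _ in range(n):
        if not in_bad_group(shift):
            return shift
        shift += group_width
    return None  # every candidate shift leaves someone in the bad group
```

For example, with N = 4 groups of unit width, a user standing at position 3.5 sits in the wrong-image group until the groups are shifted by one width.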
20130201097 | METHODS AND DEVICES TO PROVIDE COMMON USER INTERFACE MODE BASED ON SOUND - A method and electronic devices to provide a common user interface mode on a first electronic device and a second electronic device are described. In one aspect, the method comprises: participating in audio based communications between the first electronic device and the second electronic device; determining an orientation of the first electronic device relative to the second electronic device based on the audio based communications; and entering a common user interface mode in which a display associated with the first electronic device and a display associated with the second electronic device operate cooperatively and in which the orientation of a common user interface displayed on the displays is determined in accordance with the determined relative orientation. | 08-08-2013 |
20130201098 | ADJUSTMENT OF A PARAMETER USING COMPUTING DEVICE MOVEMENT - In general, techniques and systems for controlling a parameter of a target device are described. In one example, a method includes obtaining, by a computing device, control information from an identification device that is associated with a target device, wherein the control information identifies the target device and a parameter that at least partially defines operation of the target device. The method may also include detecting, by at least one sensor of the computing device, physical movement of the computing device subsequent to obtaining the control information and transmitting the control information and movement information to a networked device configured to adjust a value of the parameter of the target device based at least in part on a function of the movement information. The movement information may include information that indicates the detected physical movement. | 08-08-2013 |
20130201099 | METHOD AND SYSTEM FOR PROVIDING A MODIFIED DISPLAY IMAGE AUGMENTED FOR VARIOUS VIEWING ANGLES - An image augmentation method for providing a modified display image to compensate for an oblique viewing angle by measuring a viewing position of a viewer relative to a display screen; calculating a three-dimensional position of the viewer relative to the display screen; calculating an angular position vector of the viewer relative to the display screen; generating a rotation matrix as a function of the angular position vector; calculating a set of perimeter points; generating a modified image as a function of a normal image and the previously calculated perimeter points; and rendering the modified image on the display screen. | 08-08-2013 |
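The rotation-matrix step in the entry above can be sketched as follows; the yaw-only (single-axis) rotation and the screen-corner parameterisation are simplifying assumptions for illustration:

```python
import math

def rotation_about_y(theta):
    """3x3 rotation matrix (row-major) for theta radians about the y axis."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, 0.0, s],
            [0.0, 1.0, 0.0],
            [-s, 0.0, c]]

def apply(matrix, vec):
    """Matrix-vector product for 3x3 matrices."""
    return tuple(sum(matrix[r][k] * vec[k] for k in range(3)) for r in range(3))

def angular_position(viewer):
    """Horizontal (yaw) and vertical (pitch) angles of the viewer
    relative to the screen normal."""
    x, y, z = viewer
    return math.atan2(x, z), math.atan2(y, z)

def warped_corners(width, height, viewer):
    """Rotate the screen's corner points about its centre to
    pre-compensate an oblique (yaw-only) viewing angle."""
    yaw, _ = angular_position(viewer)
    R = rotation_about_y(yaw)
    cx, cy = width / 2, height / 2
    corners = [(0, 0, 0), (width, 0, 0), (width, height, 0), (0, height, 0)]
    return [tuple(v + o for v, o in zip(apply(R, (x - cx, y - cy, z)), (cx, cy, 0)))
            for x, y, z in corners]

# A viewer on the screen normal leaves the corners unchanged.
head_on = warped_corners(4.0, 3.0, (0.0, 0.0, 1.0))
```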
20130201100 | INTERACTIVE INPUT SYSTEM AND METHOD OF DETECTING OBJECTS - A method comprises capturing image frames of an input area using a plurality of imaging devices, each having a field of view encompassing at least a portion of the input area; processing captured image frames to identify a plurality of targets therein; analyzing the identified plurality of targets to determine if the targets represent a plurality of projections of an input object; and if so, identifying a pattern of the projections thereby to identify the input object. | 08-08-2013 |
20130201101 | Electronic Device With Multiple Display Modes and Display Method Of The Same - An electronic device having a plurality of display modes is described. The electronic device includes a first and a second display screen, wherein the second display screen is a foldable display screen. The display method includes detecting a screen state of the second display screen; displaying in the first display mode when the screen state is a first state; and displaying in the second display mode when the screen state is a second state, wherein the first display screen is divided into a first part shaded by the second display screen and a second part. In the second display mode, a first picture is displayed using a combination of at least part of the second display screen and the second part of the first display screen, while a second picture is displayed on the at least part of the second display screen. | 08-08-2013 |
20130201102 | MOBILE COMMUNICATION DEVICE WITH THREE-DIMENSIONAL SENSING AND A METHOD THEREFOR - A mobile communication device and a method for three-dimensional sensing of objects in a spatial volume above a display of the mobile communication device. The mobile communication device comprises input means having at least two sensors configured to collect data about objects in said spatial volume, and processing logic for processing spatial object data. The method comprises, and the mobile communication device is configured to perform, the following steps: receiving a detection signal from at least one of the sensors indicating that an object is present above the display; determining the distance to the detected object; looking up weight parameters associated with each sensor in a look-up table, said weight parameters being dependent on the determined distance; collecting data about the detected object from each sensor; and calculating the position of the detected object using the collected data together with the looked-up weight parameters. | 08-08-2013 |
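The distance-dependent weighting described in the entry above can be sketched as a table lookup followed by a weighted average; the two-sensor setup, the distance bands, and the weight values are all hypothetical:

```python
# Hypothetical weight table: distance band -> (sensor A weight, sensor B weight).
WEIGHT_TABLE = {
    "near": (0.8, 0.2),  # close objects: trust sensor A more
    "far":  (0.3, 0.7),  # distant objects: trust sensor B more
}

def distance_band(distance_cm, near_limit_cm=10.0):
    """Bucket the determined distance into a table key."""
    return "near" if distance_cm <= near_limit_cm else "far"

def fuse_position(distance_cm, pos_a, pos_b):
    """Weighted average of two sensors' position estimates, with weights
    looked up according to the determined distance."""
    wa, wb = WEIGHT_TABLE[distance_band(distance_cm)]
    return tuple(wa * a + wb * b for a, b in zip(pos_a, pos_b))

p = fuse_position(5.0, (1.0, 2.0), (3.0, 4.0))  # near band: A dominates
```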
20130207885 | Motion Controlled Image Creation and/or Editing - Methods, apparatus, and computer readable medium for creating and/or editing images are disclosed. An electronic tablet device may include a display, touch sensor, a motion sensor, controller and speaker. The motion sensor may generate input signals indicative of a spatial movement of the electronic tablet device. The controller may receive the input signals and generate output signals that move a drawing tool across a canvas of the display based on the spatial movement of the electronic device. The output signals may further update the canvas to reflect an effect of moving the drawing tool across the canvas per the tilting movement. | 08-15-2013 |
20130207886 | VIRTUAL-PHYSICAL ENVIRONMENTAL SIMULATION APPARATUS - A reactive virtual-physical perception suit apparatus, adapted for interactivity between a virtual environment and a physical environment is disclosed. A reactive virtual-physical perception circuit apparatus is adapted to process information transformation matrices between the virtual environment and the physical environment. The reactive virtual-physical perception suit apparatus is adapted for training environment emulations. | 08-15-2013 |
20130207887 | HEADS-UP DISPLAY INCLUDING EYE TRACKING - Embodiments of an apparatus comprising a light guide including a proximal end, a distal end, a display positioned near the proximal end, an eye-tracking camera positioned at or near the proximal end to image eye-tracking radiation, a proximal optical element positioned in the light guide near the proximal end and a distal optical element positioned in the light guide near the distal end. The proximal optical element is optically coupled to the display, the eye-tracking camera and the distal optical element and the distal optical element is optically coupled to the proximal optical element, the ambient input region and the input/output region. Other embodiments are disclosed and claimed. | 08-15-2013 |
20130207888 | METHOD AND APPARATUS FOR PRESENTING AN OPTION - A method of presenting an option comprises: calculating a movement range of an object; calculating an area of a display device based on the movement range; and presenting at least one option in the area of the display device. | 08-15-2013 |
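A minimal sketch of deriving a presentation area from a movement range, for the entry above, assuming the range is a circle around the object's position clipped to the screen (the circle model and coordinates are assumptions, not details from the application):

```python
def reachable_area(obj_pos, reach_radius, screen_w, screen_h):
    """Bounding box (x0, y0, x1, y1) of the circle the object can sweep,
    clipped to the screen; options are presented inside this box."""
    x0 = max(0, obj_pos[0] - reach_radius)
    y0 = max(0, obj_pos[1] - reach_radius)
    x1 = min(screen_w, obj_pos[0] + reach_radius)
    y1 = min(screen_h, obj_pos[1] + reach_radius)
    return (x0, y0, x1, y1)
```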
20130207889 | System and Method of Biomechanical Posture Detection and Feedback Including Sensor Normalization - A system and method are described herein for a sensor device which biomechanically detects in real-time a user's movement state and posture and then provides real-time feedback to the user based on the user's real-time posture. The feedback is provided through immediate sensory feedback through the sensor device (e.g., a sound or vibration) as well as through an avatar within an associated application with which the sensor device communicates. The sensor device detects the user's movement state and posture by capturing data from a tri-axial accelerometer in the sensor device. Streamed data from the accelerometer is normalized to correct for sensor errors as well as variations in sensor placement and orientation. Normalization is based on accelerometer data collected while the user is wearing the device and performing specific actions. | 08-15-2013 |
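The offset/scale normalization described in the entry above can be sketched as follows, assuming calibration samples captured in still poses that expose roughly -1 g and +1 g on each axis (the pose set and bias values are invented for illustration):

```python
def calibrate(samples):
    """Per-axis offset and scale estimated from min/max of the
    calibration samples."""
    mins = [min(s[i] for s in samples) for i in range(3)]
    maxs = [max(s[i] for s in samples) for i in range(3)]
    offsets = [(lo + hi) / 2 for lo, hi in zip(mins, maxs)]
    scales = [(hi - lo) / 2 or 1.0 for lo, hi in zip(mins, maxs)]
    return offsets, scales

def normalize(sample, offsets, scales):
    """Correct a raw accelerometer sample for offset and scale errors."""
    return tuple((v - o) / s for v, o, s in zip(sample, offsets, scales))

# Six calibration poses with a constant +0.1 g bias on the x axis.
poses = [(1.1, 0.0, 0.0), (-0.9, 0.0, 0.0),
         (0.1, 1.0, 0.0), (0.1, -1.0, 0.0),
         (0.1, 0.0, 1.0), (0.1, 0.0, -1.0)]
off, sc = calibrate(poses)
```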
20130207890 | METHODS, DEVICES AND SYSTEMS FOR CREATING CONTROL SIGNALS - An interface comprising a hand-operated input device with a series of activation points activated by the digits (fingers and/or thumb) of a user; a sensor component measuring a current motion, orientation, and/or position of the input device; and an output component, interconnected to the activation points and the sensor component, for outputting in series the currently active activation points and the current motion, orientation, and/or position of the input device. | 08-15-2013 |
20130215005 | METHOD FOR ADAPTIVE INTERACTION WITH A LEGACY SOFTWARE APPLICATION - Methods are disclosed to support adaptive interaction with legacy software applications, without a need for rewriting those applications. The methods are for use with an interactive electronic system including a processor, a display, and an input device with user-manipulated controls. When the legacy application is executed, a supplemental software program, such as a plugin, is also executed and is utilized in order to identify currently relevant interactive features of the legacy application during execution. Functionality is dynamically assigned to the various user-manipulated controls based on the identified features. In one embodiment, detection of objects (particularly the user's hands) proximate to the input controls is also employed in determining the assignment of functionality and/or in displaying a visual representation to the user of the available interactive choices. In another embodiment, the user-manipulated input controls are dynamically and physically reconfigured under control of the processor based on the identified features. | 08-22-2013 |
20130215006 | METHODS AND APPARATUS FOR AUTOMATIC TV ON/OFF DETECTION - Methods and apparatus are disclosed for automatic TV ON/OFF detection. An example method includes detecting a power state of an information presentation device. The method includes comparing, using a processor, a measurement indicative of an amount of power drawn by the information presentation device to a first threshold. The method also includes comparing the measurement to a second threshold. The method also includes storing an indication that the information presentation device is in an indeterminate state if the measurement is greater than the first threshold and less than the second threshold. | 08-22-2013 |
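The two-threshold classification in the entry above can be sketched directly; the wattage values are placeholders, not figures from the application:

```python
# Threshold wattages are placeholders, not values from the application.
OFF_THRESHOLD_W = 5.0    # at or below: device is off / in standby
ON_THRESHOLD_W = 30.0    # at or above: device is on

def classify_power(watts):
    """Two-threshold classification of an instantaneous power reading."""
    if watts <= OFF_THRESHOLD_W:
        return "OFF"
    if watts >= ON_THRESHOLD_W:
        return "ON"
    return "INDETERMINATE"  # between thresholds: store as indeterminate
```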
20130215007 | PORTABLE ELECTRONIC DEVICE AND CONTROL METHOD THEREOF - A portable electronic device including a control unit, an ambient light sensor, a proximity sensor and a panel is provided. The ambient light sensor, the proximity sensor and the panel are each electrically connected to the control unit. When the portable electronic device is in a normal operation state and the panel is in an enabled state, if the ambient light sensor detects that the ambient light is dark and the proximity sensor detects that an object is within a predetermined distance, the control unit switches the portable electronic device into one of a sleep mode, a hibernation mode or a shutdown mode; otherwise, the portable electronic device remains in the normal operation state. | 08-22-2013 |
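The mode-switch condition in the entry above reduces to a small decision function; the state names are illustrative:

```python
def next_state(is_dark, object_near, panel_enabled, low_power_mode="sleep"):
    """Leave the normal operating state only when the panel is enabled,
    ambient light is dark, AND an object is within the predetermined
    distance; otherwise remain in the normal state."""
    if panel_enabled and is_dark and object_near:
        return low_power_mode  # could also be "hibernate" or "shutdown"
    return "normal"
```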
20130215008 | PORTABLE ELECTRONIC DEVICE AND CONTROL METHOD THEREOF - A portable electronic device including a base, a lid and a hinge is provided. The base has a control unit and a proximity sensor. The proximity sensor is electrically connected to the control unit. The lid has a panel electrically connected to the control unit. The hinge is used for connecting the base with the lid. When the proximity sensor detects that the angle between the lid and the base exceeds a predetermined angle, the control unit performs a predetermined operation. | 08-22-2013 |
20130215009 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND A COMPUTER PROGRAM PRODUCT - An information processing apparatus, method and computer program product determine an object range to be displayed on a display based on a detected user-related action. In the apparatus, a control unit determines content to be displayed within an object range on a map, and an action recognition processing unit detects a user-related action. The object range includes a current position of the information processing apparatus, and a coverage area of the object range is based on the user-related action detected by the action recognition processing unit. | 08-22-2013 |
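One way to sketch an action-dependent coverage area, as in the entry above, is a lookup from recognized action to map radius; the action names and radii are hypothetical:

```python
# Hypothetical mapping from recognized action to coverage radius (metres).
COVERAGE_M = {"still": 200, "walking": 500, "cycling": 2000, "driving": 10000}

def object_range(current_pos, action):
    """Object range centred on the device's current position, whose
    coverage area grows with the speed implied by the recognized action."""
    radius = COVERAGE_M.get(action, COVERAGE_M["still"])
    return {"center": current_pos, "radius_m": radius}
```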
20130215010 | PORTABLE ELECTRONIC EQUIPMENT AND METHOD OF VISUALIZING SOUND - A portable electronic equipment comprises an optical output device and a controller configured to receive a sound signal and visual environment data, the visual environment data representing an environment of the portable electronic equipment. The controller is configured to process the received sound signal to identify sound characteristics of the sound signal, to generate graphics based on both the received visual environment data and the identified sound characteristics, and to control the optical output device to output the generated graphics, wherein a location at which the generated graphics is output on the optical output device is controlled based on the received visual environment data. | 08-22-2013 |
20130215011 | ELECTRONIC DEVICE AND METHOD FOR CONTROLLING THE SAME - An electronic device and a method for controlling the same are disclosed. The electronic device has a flexible display screen, and the flexible display screen has a first display region. The method for controlling the electronic device includes: detecting whether the flexible display screen is bent to generate detection information; and dividing a first display region into a first display sub-region and a second display sub-region in the case that the detection information indicates that the flexible display screen is bent, where the first display sub-region is independent of the second display sub-region. | 08-22-2013 |
20130215012 | UNDERWATER IMAGE PROJECTION DISPLAY SYSTEM, LIGHTING CONTROL SYSTEM AND DEVICE AND METHOD OF OPERATING SAME - An underwater image projection system, submerged in a body of water and projecting an image within the body of water, is provided having an enclosure with a lens assembly. A projection element has a light source projecting an image within the body of water, with at least one light-source steering device steering the image. A system controller is coupled to and controls the projected light source, the light-source steering device and a further image steering device. A user inputs image data to the controller through a user input device; the controller interprets the image data into a set of image control variables, controls the projected light source, image source and further image steering device in coordination, and projects the image through the projection element so that a static or animated image is projected from underwater onto an underwater surface of the body of water. | 08-22-2013 |
20130215013 | MOBILE COMMUNICATION TERMINAL AND METHOD OF GENERATING CONTENT THEREOF - A mobile communication terminal and method thereof capable of generating content data according to a synchronization scheme suitable for a mobile environment are provided, which allow a user to simply create and share content. The method includes receiving a user input or selection instruction for a plurality of content; determining whether there is sound data among the plurality of content; and if there is sound data, generating content data by synchronizing first content to be displayed while the sound data is played among the plurality of content to first segment data. | 08-22-2013 |
20130215014 | CAMERA BASED SENSING IN HANDHELD, MOBILE, GAMING, OR OTHER DEVICES - Method and apparatus are disclosed to enable rapid TV camera and computer based sensing in many practical applications, including, but not limited to, handheld devices, cars, and video games. Several unique forms of social video games are disclosed. | 08-22-2013 |
20130215015 | SYSTEM AND METHODS FOR ENHANCED REMOTE CONTROL FUNCTIONALITY - A hand-held device having a touch sensitive surface uses a relative distance from an origin location to each of a plurality of touch zones of the touch sensitive surface activated by a user to select a one of the plurality of touch zones as being intended for activation by the user. | 08-22-2013 |
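The relative-distance selection in the entry above can be sketched as a nearest-centre search over the activated zones; the zone layout is invented for illustration:

```python
import math

def select_zone(origin, zone_centers, touched_ids):
    """Among the touch zones the user activated, pick the one whose
    centre lies closest to the origin location."""
    def dist(zone_id):
        zx, zy = zone_centers[zone_id]
        return math.hypot(zx - origin[0], zy - origin[1])
    return min(touched_ids, key=dist)

# Invented zone layout for illustration.
zones = {"vol_up": (0, 10), "vol_down": (0, -10), "mute": (8, 0)}
chosen = select_zone((0, 0), zones, ["vol_up", "mute"])  # "mute" is nearer
```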
20130215016 | DYNAMIC IMAGE DISTRIBUTION SYSTEM, DYNAMIC IMAGE DISTRIBUTION METHOD AND DYNAMIC IMAGE DISTRIBUTION PROGRAM - A dynamic image distribution system, dynamic image distribution method, and dynamic image distribution program enabling the arbitrary setting of a viewing range for compression-encoded dynamic images and the interactive modification of that viewing range. | 08-22-2013 |
20130215017 | METHOD AND DEVICE FOR DETECTING GESTURE INPUTS - A method is provided for detecting gesture inputs in response to a consecutive reciprocating movement before a detecting device, wherein the consecutive reciprocating movement is made up of a first type of gesture and a second type of gesture, each capable of being recognized by the detecting device to output a different control signal. The method comprises the steps of receiving the consecutive reciprocating movement starting with the first type of gesture, wherein the first type of gesture and the second type of gesture occur alternately, and outputting control signals corresponding to the first type of gesture a number of times equal to the number of occurrences of the first type of gesture contained within the consecutive reciprocating movement. | 08-22-2013 |
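The counting rule in the entry above (one control signal per occurrence of the starting gesture type in a strictly alternating sequence) can be sketched as:

```python
def emit_signals(gestures, first_type):
    """One control signal per occurrence of the starting gesture type in a
    strictly alternating reciprocating sequence; None when the sequence
    does not alternate or does not start with the expected type."""
    if not gestures or gestures[0] != first_type:
        return None
    if any(prev == cur for prev, cur in zip(gestures, gestures[1:])):
        return None  # two identical gestures in a row: not reciprocating
    return [first_type] * gestures.count(first_type)

# Up-down-up reciprocating movement: two "swipe_up" signals are emitted.
signals = emit_signals(["swipe_up", "swipe_down", "swipe_up"], "swipe_up")
```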
20130222222 | METHOD AND APPARATUS FOR PRESENTING MULTI-DIMENSIONAL REPRESENTATIONS OF AN IMAGE DEPENDENT UPON THE SHAPE OF A DISPLAY - A method, apparatus and computer program product are provided to cause different multi-dimensional representations of an image to be presented upon a display and to facilitate changing from one multi-dimensional representation of the image to another multi-dimensional representation of the image, such as in response to a change in the shape of the display. In the context of a method, a first, multi-dimensional representation of an image is caused to be presented upon the display while the display has a first shape. The method also causes a second, multi-dimensional representation of the image, different than the first, multi-dimensional representation of the image, to be presented upon the display while the display has a second shape, different than the first shape. The method also determines the shape of the display such that the corresponding representation of the image is caused to be presented in response thereto. | 08-29-2013 |
20130222223 | METHOD AND APPARATUS FOR INTERPRETING A GESTURE - A method, apparatus and computer program product are provided to facilitate user interaction with a display that is capable of presenting at least portions of the user interfaces of multiple devices, such as by recognizing and interpreting a gesture as providing input to one of the devices. In the context of a method, an identification of one or more valid gestures of at least a first device is received in an instance in which a plurality of devices interact such that portions of the respective user interfaces are capable of being presented upon a display. The method also includes receiving information indicative of a gesture and determining whether the gesture is valid. Depending upon whether the gesture is a valid gesture, the method also includes causing an indication of the gesture to be provided to the first device. | 08-29-2013 |
20130222224 | Device, System and Method for Generating Display Data - Device for generating display data comprises memory configured to store application data, including application data pertaining to a first application executable on the device; an input interface configured to receive input; a processor in communication with the memory and input interface. The processor is configured to communicate with first and second display devices, generate and output first display data for display on the first display device, and generate and output second display data for display on the second display device, in accordance with the input. In a first mode, the processor is configured to generate the first and second display data based on the application data pertaining to the first application. In a second mode, the processor is configured to generate the first display data based on the application data, and to generate the second display data so as to include preconfigured display data for display on the second device. | 08-29-2013 |
20130222225 | Device, System and Method for Generating Display Data - A device for generating display data comprises memory configured to store application data defining one or more display objects; an input interface configured to receive input; and a processor in communication with the memory and input interface and configured to output display data in accordance with the input. The processor is further configured to modify the application data according to a predetermined code to symbolise an operating status of the device in the display data; and to generate the display data including components which are representative of the display objects defined by the modified application data before outputting the display data. | 08-29-2013 |
20130222226 | USER INTERFACES AND ASSOCIATED APPARATUS AND METHODS - An apparatus including at least one processor; and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to perform at least the following: in response to detecting a particular non-selecting user input spatially associated with a tactile user interface of an electronic device, provide for changing a region of the tactile user interface from a first configuration to a second configuration, the tactile user interface region comprising one or more user interface elements, wherein when the tactile user interface region is in the second configuration, the user interface elements of the tactile user interface region have a different depth aspect than when the tactile user interface region is in the first configuration. | 08-29-2013 |
20130222227 | METHOD AND APPARATUS FOR INTERCONNECTED DEVICES - A computer implemented method performed by an electronic device connected to a plurality of other devices. The electronic device comprises a display and an input device for receiving user input. The method comprises receiving a predefined user input at the electronic device, causing the display of information at the plurality of connected devices in response to receiving the predefined user input, and preventing the plurality of connected devices from causing the information to not be displayed. | 08-29-2013 |
20130222228 | AUTOMATIC PROJECTOR BEHAVIOUR CHANGES BASED ON PROJECTION DISTANCE - There is disclosed one or more methods, systems and components therefor for varying the behaviour of a projector. The operations for varying the behaviour from a first behaviour to a second behaviour responsive to a change in projection distance from a first distance to a second distance, the behaviours comprising a control mode or a presentation mode. | 08-29-2013 |
20130222229 | DISPLAY CONTROL APPARATUS, DISPLAY CONTROL METHOD, AND CONTROL METHOD FOR ELECTRONIC DEVICE - According to one embodiment, a control method for an electronic device includes: encoding image data displayed on a first screen of a display; first transmitting the image data encoded; generating operation image data, displayed on a screen of another device, corresponding to data in response to an input operation performed on an operation module provided on the first screen in an overlapping manner; and second transmitting the operation image data generated, wherein the generating includes generating first operation image data, indicative of an input operation at a second point later than a first point in time, displayed in a superposed manner on a first image displayed on the first screen at the first point, and second operation image data, indicative of an input operation at the second point, displayed in a superposed manner on a second image displayed on the first screen at the second point. | 08-29-2013 |
20130222230 | MOBILE DEVICE AND METHOD FOR RECOGNIZING EXTERNAL INPUT - A mobile device includes a plurality of microphones to recognize a sound generated from an external input, a sensor to recognize an impulse generated from the external input, and a processor. The processor determines multiple regions around the mobile device, determines whether the external input is generated in a region among the multiple regions based on the recognized sound and the impulse, and executes an instruction corresponding to the region. A method that uses a processor to recognize an external input includes recognizing a sound generated from an external input, recognizing an impulse generated from the external input, determining, using the processor, a location of the external input around a mobile device based on the recognized sound and the impulse, and executing an instruction corresponding to the location of the external input. | 08-29-2013 |
20130222231 | HANDHELD DEVICE WITH NOTIFICATION MESSAGE VIEWING - Displaying information related to a message on a display screen of a handheld electronic device. In response to a movement of the handheld electronic device from an initial position to a first orientation, displaying on the display screen a first notice related to the message. | 08-29-2013 |
20130222232 | GESTURE RECOGNITION DEVICE AND METHOD THEREOF - A device and a method for recognizing a gesture of a user at a distance are provided. The device includes an image capture unit to capture a gesture to acquire image information, and a control unit to determine a distance between the device and the user based on the image information and to determine a mode of the device according to the determined distance. The method includes capturing a gesture of a user as image information, determining a distance between the device and the user based on the image information, and determining a mode of operation according to the determined distance. | 08-29-2013 |
20130222233 | SYSTEM AND METHOD FOR IMPLEMENTING 3-DIMENSIONAL USER INTERFACE - A system for implementing a 3-dimensional (3D) user interface includes an input device configured to collect position and normal direction information of a plurality of points located on a 3D object, a calculation device configured to process the position and normal direction information collected by the input device, and an output device configured to output a 3D virtual space set by the calculation device. The calculation device processes the position and normal direction information of the plurality of points, sets a plurality of virtual points corresponding to the plurality of points in the virtual space, and forms a 3D selection region including the plurality of virtual points in the virtual space, and the shape of the selection region is changed in correspondence with change in the position and normal direction of the plurality of virtual points according to shape change of the 3D object. | 08-29-2013 |
20130222234 | IMAGE DISPLAY APPARATUS - The present disclosure realizes an image display apparatus that can make the emission angle range of emitted light bilaterally symmetrical, at low cost, irrespective of the material refractive index of a prism composing a liquid crystal prism. | 08-29-2013 |
20130222235 | GAZE DETECTING HEADS-UP DISPLAY SYSTEMS - A display system for a head-mounted device comprising a display for displaying information. The display system includes a gaze detector comprising a light source, disposed proximate the display, for transmitting light toward an eye of a user of the head-mounted device, and a light detector, disposed proximate the light source, for detecting light reflected from the eye of the user and generating a voltage based on the detected light. The display system also includes a processor operably coupled to the display and the gaze detector, and configured to control the display to turn on the display based on the voltage received from the gaze detector. | 08-29-2013 |
20130222236 | HANDHELD DEVICE WITH NOTIFICATION MESSAGE VIEWING - Displaying information related to a message on a display screen of a handheld electronic device. In response to a movement of the handheld electronic device from an initial position to a first orientation, displaying on the display screen a first notice related to the message. | 08-29-2013 |
20130222237 | INTERACTIVE POLARIZATION-PRESERVING PROJECTION DISPLAY - The disclosure generally relates to optical devices, such as interactive displays, and in particular to interactive projection displays having passive input devices. The present disclosure also provides a passive interactive input device having the ability to overcome problematic ambient interference signals in an interactive display, such as an interactive projection display. | 08-29-2013 |
20130229330 | CONTROLLING IMAGES AT HAND-HELD DEVICES - Controlling images at hand-held devices using sensor input, for example, as detected by one or more orientation sensors in a hand-held computing device is described. In various embodiments images are displayed at a hand-held computing device according to orientation sensor readings observed at the device and before user input is received at the images. For example, two or more images with different opacities are superimposed and the opacities differentially varied as the hand-held device is tilted. In another example images are placed in a 3D space which is rotated as the device is tilted. In another example, a video is played either forwards or backwards according to an orientation of the device. In various examples the images are displayed as part of a web page by using a template in the web page to control the display of images according to sensor readings. | 09-05-2013 |
20130229331 | METHOD AND APPARATUS FOR DETERMINING AN OPERATION BASED ON AN INDICATION ASSOCIATED WITH A TANGIBLE OBJECT - An apparatus, method, and computer program product are described that can receive a signal from a tangible object, where the signal includes an indication of an operation to be executed upon receipt of a user input applied by the tangible object, and that can determine the operation based on the indication. In this way, the user is not required to provide additional user input to define the particular operation that is desired. The apparatus may determine at least one recipient with whom data selected via a user input applied by the tangible object may be shared. The apparatus may further provide for the determination of an operation to be executed based on an identification of a selection of content and may provide for the association of the operation with the tangible object, such that subsequent user input applied via the tangible object causes execution of the operation. | 09-05-2013 |
20130229332 | ASSOCIATING STROKES WITH DOCUMENTS BASED ON THE DOCUMENT IMAGE - A method and apparatus is disclosed herein for associating strokes with a document image. In one embodiment, the method comprises capturing strokes written on a screen over a first document image while the document image is being displayed, associating captured stroke data of the captured strokes with underlying image patches of the document image being displayed, determining that a second document image is being displayed on the screen, determining whether one or more image patches of the second document image, or parts thereof, had previously been associated with captured stroke data, and drawing one or more previously captured strokes, or portions thereof, on image patches of the second document image based on results of determining whether one or more image patches of the second document image, or parts thereof, had previously been associated with captured stroke data. | 09-05-2013 |
20130229333 | AUTOMATIC ENDING OF INTERACTIVE WHITEBOARD SESSIONS - A method and apparatus is disclosed herein for automatically ending an interactive device session. In one embodiment, the system comprises a memory; and a processor coupled to the memory and operable to: log out a user, delete locally stored data created during a session, and place one or more system hardware components in a reduced power consumption state based on: occupancy sensor data indicating presence or absence of one or more individuals in proximity to an occupancy sensor, and activity information associated with a display surface. | 09-05-2013 |
20130229334 | PORTABLE DEVICE AND CONTROL METHOD THEREOF - A portable device which is capable of communicating with an external device and a control method thereof are discussed. A method for transmitting user input of a portable device includes detecting navigating input in a navigation mode for controlling an external device which is connected by a network, wherein the external device displays displayable content and the navigating input is for navigating the displayable content which includes at least one input box, transmitting a control signal corresponding to the detected navigating input to the external device, displaying an indicator which indicates capability of mode switching from the navigation mode to a user input mode, initiating the user input mode by displaying the input box which is extracted from the displayable content when user input for mode switching is detected, and transmitting user input which is received through the displayed input box. | 09-05-2013 |
20130229335 | SENSING USER INPUT AT DISPLAY AREA EDGE - One or more sensors are disposed to sense user inputs in an active display area as well as user inputs in an extended area that is outside of the active display area. Data for user inputs, such as gestures, may include data from user inputs sensed in both the active display area and outside of the active display area. The user inputs can begin and/or end outside of the active display area. | 09-05-2013 |
20130229336 | STEREOSCOPIC IMAGE DISPLAY DEVICE, STEREOSCOPIC IMAGE DISPLAY METHOD, AND CONTROL DEVICE - According to one embodiment, a stereoscopic image display device includes a display element in which a plurality of pixels are arranged in a matrix topology, an optical element coupled to the display element, the optical element having variable optical characteristics. The device also includes an acquirer, calculator, and controller. The acquirer is configured to acquire person's information including a position of each of at least one person viewing a stereoscopic image. The calculator is configured to calculate, based on the person's information, a weight representing a quality of stereoscopic viewing for each person. The controller is configured to select optical characteristic parameters corresponding to the weight, and control the optical characteristics of the optical element based on the optical characteristic parameters. | 09-05-2013 |
20130229337 | ELECTRONIC DEVICE, ELECTRONIC DEVICE CONTROLLING METHOD, COMPUTER PROGRAM PRODUCT - According to one embodiment, an electronic device includes: a display controller; a user presence determination module; a reference elapsed time storage module; and an interval time setting module. The display controller causes a display to display various types of information. The user presence determination module determines whether a user is present at a first time interval. The reference elapsed time storage module stores therein a second time for changing an operation mode to a power saving mode based on a time elapsed from when a user operation is ceased to be detected. The interval time setting module sets the first time interval shorter than the second time based on the second time stored in the reference elapsed time storage module. The display controller turns off the display if the user presence determination module determines that the user is absent. | 09-05-2013 |
20130229338 | TEXTILE INTERFACE DEVICE AND METHOD FOR USE WITH HUMAN BODY-WORN BAND - Disclosed herein is a textile interface device and method for use with a human body-worn band. The textile interface device includes a detection unit provided in wearing means that is worn on a human body and configured to detect bio-signals from the human body, an interface unit disposed inside accommodation means provided on one side of the wearing means and configured to communicate with an electronic device accommodated in the accommodation means, and a plurality of textile buttons configured to generate control signals adapted to control the electronic device. The interface unit includes, on a textile, an optical reception unit configured to receive optical signals, an optical transmission unit configured to send the optical signals, and a light diffusion unit configured to diffuse light when the optical signals are sent. The interface unit communicates with the electronic device by means of light. | 09-05-2013 |
20130229339 | INPUT DEVICE - There is provided an input device, including: a base; an operation portion; and an oscillatory wave motor including a stator and a rotor, the rotor thrusting the stator, the oscillatory wave motor providing haptic feedback to an operator via the operation portion. Displacement of one of the stator and the rotor from the base in an axial direction is allowed, displacement of the one of the stator and the rotor in a rotational direction is restricted, and the operation portion includes the other one of the stator and the rotor. | 09-05-2013 |
20130229340 | MULTIMEDIA INTERACTION SYSTEM AND RELATED COMPUTER PROGRAM PRODUCT CAPABLE OF AVOIDING UNEXPECTED INTERACTION BEHAVIOR - A multimedia interaction system is disclosed, including: a plurality of member electronic devices; a plurality of displays respectively arranged on the member electronic devices; and a location detection circuit configured to operably detect each member electronic device's spatial location and orientation dynamically and to transmit detection results to at least one of the member electronic devices. When a user instructs a source electronic device of the member electronic devices to transmit a target image object toward a target direction, the source electronic device transmits a target command corresponding to the target image object to a candidate electronic device of the member electronic devices to perform corresponding multimedia interaction operations only if a relative position between the candidate electronic device and the target direction satisfies a predetermined condition. | 09-05-2013 |
20130229341 | HANDWRITING INPUT DEVICE AND COMPUTER-READABLE MEDIUM - A position detecting unit is configured to detect an access position and a contact position of an operation tool. A guide figure display unit is configured to, when an access position of the operation tool is detected by the position detecting unit, display a guide figure for handwriting input on the display unit based on the detected access position. A handwriting position display unit is configured to, when a contact position of the operation tool is detected by the position detecting unit in a state where the guide figure is displayed by the guide figure display unit, display a locus of the detected contact position on the display unit. | 09-05-2013 |
20130229342 | INFORMATION PROVIDING SYSTEM, INFORMATION PROVIDING METHOD, INFORMATION PROCESSING APPARATUS, METHOD OF CONTROLLING THE SAME, AND CONTROL PROGRAM - An apparatus of this invention is an information processing apparatus for providing information to the general public. This information processing apparatus displays a screen including an inducement image to induce a hand motion. The hand motions of persons in the sensed public are recognized. According to a feature of this invention, out of the persons in the sensed public, a person whose recognized hand motion corresponds to the hand motion to be induced by the inducement image is identified. The identified person is set as the advertising target person, thereby producing an opportunity for persons to pay attention to advertising information. | 09-05-2013 |
20130229343 | DISABLING AN AUTOMATIC ROTATION FUNCTION OF MOBILE COMPUTING DEVICES - Technology is generally described for disabling an automatic rotation function of mobile computing devices. The technology can detect a tilt angle of a display of the computing device in relation to a surface; and if the tilt angle is less than a specified threshold tilt angle, disable a rotation function. | 09-05-2013 |
20130229344 | SYSTEMS AND METHODS FOR HAND GESTURE CONTROL OF AN ELECTRONIC DEVICE - Systems and methods of generating device commands based upon hand gesture commands are disclosed. An exemplary embodiment generates image information from a series of captured images, generates commands based upon hand gestures made by a user that emulate device commands generated by a remote control device, identifies a hand gesture made by the user from the received image information, determines a hand gesture command based upon the identified hand gesture, compares the determined hand gesture command with the plurality of predefined hand gesture commands to identify a corresponding matching hand gesture command from the plurality of predefined hand gesture commands, generates an emulated remote control device command based upon the identified matching hand gesture command, and controls the media device based upon the generated emulated remote control device command. | 09-05-2013 |
20130234924 | Portable Electronic Device and Method for Controlling Operation Thereof Based on User Motion - A portable electronic device includes a motion sensor and a controller. The motion sensor detects an alternating signature motion of a limb of the user about a virtual axis corresponding to the limb. The motion sensor may be an accelerometer capable of detecting three dimensional acceleration. The accelerometer detects acceleration along X, Y and/or Z axes, in which acceleration peaks of the X and Z axes alternate with each other and acceleration of the Y axis remains substantially steady relative to the X and Z axes. The portable electronic device controls at least one function based on the detected alternating signature motion of the limb and/or acceleration along the X, Y and/or Z axes. | 09-12-2013 |
20130234925 | METHOD AND APPARATUS FOR PERFORMING AN OPERATION AT LEAST PARTIALLY BASED UPON THE RELATIVE POSITIONS OF AT LEAST TWO DEVICES - A method, apparatus and computer program product are provided in order to facilitate the provision of user input intended to cause an operation to be performed. In the context of a method, a position of a second device relative to a first device is determined. The position of the second device is non-overlapping relative to the first device. The method also performs an operation, with a processor, based at least partially upon the position of the second device relative to the first device. A corresponding apparatus and computer program product are also provided. | 09-12-2013 |
20130234926 | VISUALLY GUIDING MOTION TO BE PERFORMED BY A USER - Motion to be performed on a device by a user is visually guided, by displaying at least one icon on a screen of the device. The icon when displayed initially has an attribute whose value is indicative of a predetermined movement to be performed on the device. The user responds to the icon's display by moving the device in the real world in an attempt to perform the predetermined movement, in whole or in part. The displayed icon is then re-displayed with a revised value of the attribute to indicate an instantaneous to-be-performed movement. The instantaneous to-be-performed movement depends on the predetermined movement and a measurement of actual movement of the handheld device, after the initial display. The re-display of the icon is performed repeatedly, to change the display of the icon's attribute based on at least the predetermined movement and additional measurements of additional movements of the handheld device. | 09-12-2013 |
20130234927 | ELECTRONIC DEVICE FOR MEASURING ANGLE OF FACE AND ROTATING SCREEN THEREOF AND METHOD THEREOF - An electronic device is configured to measure an angle of a user's face and rotate a screen thereof. In a method, the electronic device verifies that a face of a user is included in photographed image information, recognizes the face of the user included in the image information, and rotates a screen of the electronic device according to an angle of the recognized face. | 09-12-2013 |
20130234928 | APPARATUS AND METHOD FOR CONTROLLING SCREEN - An apparatus for controlling a screen includes a data reception unit configured to receive data and one or more events to be displayed on a screen; and a user input unit configured to provide input information attributable to manipulation of a user input device. Further, the apparatus for controlling the screen includes a screen control unit configured to determine whether to update or switch the screen based on the events and the input information. | 09-12-2013 |
20130234929 | ADAPTING MOBILE USER INTERFACE TO UNFAVORABLE USAGE CONDITIONS - Adapting a mobile user interface to unfavorable usage conditions includes detecting undesirable motion of the mobile device and providing adaptations to the mobile device user interface according to the undesirable motion, where the adaptations include at least one of: enlarging graphical elements of the mobile device, providing digital stabilization of images on the mobile device, providing additional warnings and user input options for critical operations, using modified gesture recognition algorithms, and adjusting system response to typing and drawing. The undesirable motion may be momentary or persistent. The adaptations that are provided may vary according to whether the undesirable motion is momentary or persistent. Undesirable motion that is momentary may be a bump, a dive and/or a sharp road turn. Undesirable motion that is persistent may include railroad vibration, plane vibration, and/or vessel pitching. The undesirable motion may be categorized by intensity as low, medium and high intensity. | 09-12-2013 |
20130234930 | SCANNING MIRROR LASER AND PROJECTOR HEAD-UP DISPLAY GLASSES - This application describes a method for wearable heads-up display devices to display an image or video that does not require the user to focus his eyes on the display of the device. | 09-12-2013 |
20130234931 | USER INTERFACE FOR GESTURE-BASED CONTROL INPUT AND RELATED METHOD - An electronic device for visualizing data and receiving related gesture-based control input from a user, configured to obtain digital image data utilizing a number of camera entities and to derive the control input on the basis of the image data, the electronic device including a display panel for displaying data to a user, and at least one protective element integrated with the display panel and including, as disposed at the periphery region around the active area of the display panel, the number of camera entities substantially embedded therein, the protective element including material that is optically substantially transparent relative to the predetermined reception wavelengths of the optically sensitive areas of the camera entities and substantially covers the sensitive areas, the camera entities in the protective element to span at least partially overlapping fields of view substantially in front of the display panel. A corresponding method of manufacture is presented. | 09-12-2013 |
20130234932 | INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING SYSTEM CONTROL METHOD, INFORMATION PROCESSING APPARATUS, AND STORAGE MEDIUM - An information processing system determines a relationship between an observer and an observation target person based on personal information of the observer and personal information of the observation target person, and displays content information generated based on the determined relationship together with an image of the observation target person in a display area of a display unit. | 09-12-2013 |
20130234933 | COHERENT PRESENTATION OF MULTIPLE REALITY AND INTERACTION MODELS - A method for navigating concurrently and from point-to-point through multiple reality models is described. The method includes: generating, at a processor, a first navigatable virtual view of a first location of interest, wherein the first location of interest is one of a first virtual location and a first non-virtual location; and concurrently with the generating the first navigatable virtual view of the first location of interest, generating, at the processor, a second navigatable virtual view corresponding to a current physical position of an object, such that real-time sight at the current physical position is enabled within the second navigatable virtual view. | 09-12-2013 |
20130234934 | Three-Dimensional Collaboration - Remote collaboration of a subject and a graphics object in a same view of a 3D scene. In one embodiment, one or more cameras of a collaboration system may be configured to capture images of a subject and track the subject (e.g., head of a user, other physical object). The images may be processed and provided to another collaboration system along with a determined viewpoint of the user. The other collaboration system may be configured to render and project the captured images and a graphics object in the same view of a 3D scene. | 09-12-2013 |
20130234935 | DISPLAY ASSEMBLY - A display assembly, comprising: a display device ( | 09-12-2013 |
20130241817 | DISPLAY DEVICE AND METHOD FOR ADJUSTING CONTENT THEREOF - A display device includes a display unit, a camera and a control unit. The display unit is configured to display video information. The camera is configured to obtain image information in front of the display unit. The control unit receives the image information from the camera, and adjusts the content of the display unit according to the image information from the camera. | 09-19-2013 |
20130241818 | TERMINAL, DISPLAY DIRECTION CORRECTING METHOD FOR A DISPLAY SCREEN, AND COMPUTER-READABLE RECORDING MEDIUM - A display screen that is displayable in a direction corresponding to a direction of a body of a user. A terminal including a gyrosensor that detects a direction of the terminal, a camera that is provided in order to image a user who views the display screen, a face detection processor that detects a direction of the user imaged by the camera, and a screen displaying part that decides a display direction of the display screen based on the terminal direction detected by the gyrosensor and the user direction detected by the face detection processor. | 09-19-2013 |
20130241819 | GESTURE RECOGNITION APPARATUS, ELECTRONIC DEVICE, GESTURE RECOGNITION METHOD, CONTROL PROGRAM, AND RECORDING MEDIUM - A user's operability is improved by causing a gesture recognition apparatus to recognize repeated operation. A gesture recognition apparatus of at least one embodiment of the present invention includes a gesture recognition unit for recognizing a gesture based on a trajectory of movement of a command body, and identifying a process corresponding thereto, and an execution amount determination unit for determining a processing execution amount of the process to be executed by a processing execution entity, wherein the execution amount determination unit determines the processing execution amount on the basis of a change of a form of the command body. | 09-19-2013 |
20130241820 | PORTABLE PROJECTOR AND IMAGE PROJECTING METHOD THEREOF - A portable projector includes a projector module for projecting image data, an infrared ray output unit which outputs an infrared ray to a projected image data area, a camera unit which photographs the projected image data area, a coordinate value calculator which calculates a coordinate value corresponding to each coordinate value of the photographed projected image data area, and a controller which determines whether there is a pen or a finger input in the projected image data area according to the calculated coordinate value and performs a control for updating the image data by using an input coordinate value corresponding to the input when there is input by one of the pen and the finger. | 09-19-2013 |
20130241821 | IMAGE PROCESSING SYSTEM, IMAGE PROCESSING METHOD, AND STORAGE MEDIUM STORING IMAGE PROCESSING PROGRAM - This invention relates to an image processing apparatus that displays an image for plural persons and offers higher operability to a person who is viewing the image. The apparatus includes an image display unit that displays an image, a sensing unit that senses an image of plural persons gathered in front of the image display unit, a gesture recognition unit that recognizes, from the image sensed by the sensing unit, a gesture performed by each of the plural persons for the image displayed on the image display unit, and a display control unit that makes a display screen transit based on a recognized result by the gesture recognition unit. | 09-19-2013 |
20130241822 | Enabling Physical Controls on an Illuminated Surface - A method for enabling physical controls in a digital system is provided that includes receiving an image of an illuminated surface in the digital system, wherein the image is captured by a camera in the digital system, determining a state of a physical control mounted on the illuminated surface by analyzing the image; and outputting an indication of the state of the physical control. | 09-19-2013 |
20130241823 | METHOD FOR PROVIDING HUMAN INPUT TO A COMPUTER - The invention provides a method for providing human input to a computer which allows a user to interact with a display connected to the computer. The method includes the steps of placing a first target on a first portion of the user's body, using an electro-optical sensing means, sensing data related to the location of the first target and data related to the location of a second portion of the user's body, the first and second portions of the user's body being movable relative to each other, providing an output of the electro-optical sensing means to the input of the computer, determining the location of the first target and the location of the second portion of the user's body, and varying the output of the computer to the display based upon the determined locations for contemporaneous viewing by the user. | 09-19-2013 |
20130241824 | INTEGRATED PROCESSOR FOR 3D MAPPING - A device for processing data includes a first input port for receiving color image data from a first image sensor and a second input port for receiving depth-related image data from a second image sensor. Processing circuitry generates a depth map using the depth-related image data. At least one output port conveys the depth map and the color image data to a host computer. | 09-19-2013 |
20130241825 | SYSTEM AND METHOD FOR ENHANCED COMMAND INPUT - A portable electronic device having an input device for receiving a gesture based input from a user is used to control a navigation operation of an appliance. The portable electronic device receives via the input device the gesture based input and uses one or more parameters stored in a memory of the portable electronic device and one or more characteristics associated with the gesture based input to cause the portable electronic device to transmit a navigation step command to thereby control the navigation operation of the appliance. | 09-19-2013 |
20130241826 | Three-Dimensional Interface System and Method - A three-dimensional virtual-touch human-machine interface system ( | 09-19-2013 |
20130249783 | METHOD AND SYSTEM FOR ANNOTATING IMAGE REGIONS THROUGH GESTURES AND NATURAL SPEECH INTERACTION - The invention relates to a method and system for annotating image regions with specific concepts based on multimodal user input. The system ( | 09-26-2013 |
20130249784 | Method and Device for Pose Tracking Using Vector Magnetometers - In accordance with various embodiments of the invention, a user-borne computer input device is disclosed including a 5D or greater “mouse” or other tool or object containing or consisting of one or more permanent magnets presenting permanent magnetic dipoles. The position and orientation of the device are determined by magnetic field strength measurements derived from at least two vector magnetometers fixed in a reference frame connected to the computing device. The system allows a user to interact with the computing device in at least 5 dimensions by manipulating the position and orientation of the device within a measurement volume, which may be a few cubic meters. | 09-26-2013 |
20130249785 | METHOD FOR PREVENTION OF FALSE GESTURE TRIGGER INPUTS ON A MOBILE COMMUNICATION DEVICE - A method for prevention of false gesture trigger inputs on a mobile communication device is disclosed herein. The method includes providing a relative positioning sensor output to a controller for enabling/disabling or adaptively adjusting detection of gesture inputs on the mobile communication device based on an angular position or motion of the mobile communication device relative to a directional trigger beam or alternatively relative to environmental conditions impacting the mobile communication device. | 09-26-2013 |
20130249786 | GESTURE-BASED CONTROL SYSTEM - A method and system for human computer interaction using hand gestures is presented. The system permits a person to precisely control a computer system without wearing an instrumented glove or any other tracking device. In one embodiment, two cameras observe and record images of a user's hands. The hand images are processed by querying a database relating hand image features to the 3D configuration of the hands and fingers (i.e. the 3D hand poses). The 3D hand poses are interpreted as gestures. Each gesture can be interpreted as a command by the computer system. Uses for such a system include, but are not limited to, computer aided design for architecture, mechanical engineering and scientific visualization. Computer-generated 3D virtual objects can be efficiently explored, modeled and assembled using direct 3D manipulation by the user's hands. | 09-26-2013 |
20130249787 | HEAD-MOUNTED DISPLAY - According to an illustrative embodiment, a head-mounted display is provided. The head-mounted display includes a casing having an opening portion; and a movable member movable between a first position in which the movable member covers the opening portion, and a second position in which the movable member does not cover the opening portion. | 09-26-2013 |
20130249788 | INFORMATION PROCESSING APPARATUS, COMPUTER PROGRAM PRODUCT, AND PROJECTION SYSTEM - An information processing apparatus includes: a role storage unit that stores therein motion information concerning a motion of a user and a role of the user in a manner associated with each other; a detecting unit that detects a motion of a user; a determining unit that determines a role corresponding to motion information concerning a motion of a user detected by the detecting unit, based on the role storage unit; and an operation authorization unit that gives a user operating authority for an execution target device to be instructed to perform an operation by a predetermined motion of the user, the operating authority corresponding to a role of the user determined by the determining unit. | 09-26-2013 |
20130249789 | Electronic Device And Control Method - An electronic device and the control method thereof are described. The electronic device includes a panel provided at a first outer surface of the electronic device; an instruction generating unit configured to generate setting instructions; a position control unit overlapping the panel and configured to control the relative position of at least one part of the operation object on the panel with respect to the panel. | 09-26-2013 |
20130249790 | INPUT USER INTERFACE DEVICE, PROJECTING DEVICE, COMMAND DECIDING METHOD AND PROGRAM STORAGE MEDIUM STORING COMMAND DECIDING METHOD PROGRAM - An input user interface device includes an image pickup unit which picks up an image including an object image, a shape detection unit which detects a shape of the object image in the image picked up by the image pickup unit, a determination unit which determines whether the shape of the object image detected by the shape detection unit corresponds to a shape associated with a gesture command or a shape associated with a position designation command for designating position data, and a command deciding unit which decides the gesture command or the position designation command on the basis of the determination by the determination unit. | 09-26-2013 |
20130249791 | INTERACTIVE VIDEO BASED GAMES USING OBJECTS SENSED BY TV CAMERAS - A method and apparatus for interactive TV camera based games in which position or orientation of points on a player or of an object held by a player are determined and used to control a video display. Both single camera and stereo camera pair based embodiments are disclosed, preferably using stereo photogrammetry where multi-degree of freedom information is desired. Large video displays, preferably life-size may be used where utmost realism of the game experience is desired. | 09-26-2013 |
20130249792 | SYSTEM AND METHOD FOR PRESENTING IMAGES - A system includes a mobile device having a display, a sensor, and a processor coupled to the display. The processor can be adapted to obtain three-dimensional (3D) imagery data, create a virtual container around the device according to the 3D data, calibrate the virtual container, select a first portion of an inner surface of the virtual container according to the calibration of the virtual container, present at the display a first image associated with the first portion, wherein the first image is derived from the 3D data, receive sensor data from the sensor, detect from the sensor data a movement by the device, select a second portion of the inner surface according to the detected movement, and present at the display a second image associated with the second portion, wherein the second image is derived from the 3D data. The 360° immersive image adapts to the position of the device. | 09-26-2013 |
20130257709 | Proximity Sensing for Wink Detection - This disclosure relates to proximity sensing for wink detection. An illustrative method includes receiving data from a receiver portion of a proximity sensor. The receiver portion is disposed at a side section of a head-mountable device (HMD). When a wearer wears the HMD, the receiver portion is arranged to receive light reflected from an eye area of the wearer, the proximity sensor detects a movement of the eye area, and the data represents the movement. The method includes determining that the data corresponds to a wink gesture. The method also includes selecting a computing action to perform, based on the wink gesture. The method further includes performing the computing action. | 10-03-2013 |
20130257710 | IMAGE DISPLAY SYSTEM AND METHOD OF DRIVING THE SAME - An image display system includes a sensor part, a camera part, and an image display part. The sensor part is configured to sense a presence of a user and output a sensing signal. The camera part is configured to be controlled by the sensing signal, detect an eye direction of the user, and output a detecting signal. The image display part is configured to be operated in one of first, second, and third modes on the basis of the sensing signal and the detecting signal. The sensor part and the camera part are operated independently from the image display part. Power consumption becomes smaller in order of the first mode, the second mode, and the third mode. | 10-03-2013 |
20130257711 | HANDHELD ELECTRONIC APPARATUS AND INFORMATION RENDERING METHOD - A handheld electronic apparatus includes an input interface device, a display device, a memory and a processor. The processor, coupled to the input interface device, the display device and the memory, drives the input interface device to provide a configuration command in response to a user configuration event, and determines and stores a data structure into the memory in response to the configuration command. The data structure records to-do list information and display mode information corresponding to the to-do list information. The processor further determines whether an idle event occurs. When the idle event occurs, the processor drives the handheld electronic apparatus to enter a standby mode, in which the processor further selectively drives the display device to perform a standby mode display operation with reference to the display mode information, and the display device accordingly displays the to-do list information with dimmed backlight. | 10-03-2013 |
20130257712 | ELECTRONIC DEVICE - According to one embodiment, electronic device includes: a display; an input module; a housing; a circuit board; a battery; and an antenna. The display includes: a screen; a first surface opposite the screen; and a second surface between the screen and the first surface, and intersecting the screen and the first surface. The housing includes: a first wall along the first surface and covering the first surface; and a second wall along the second surface intersecting the first wall and covering the second surface. The housing includes: a first member including a first area of the first wall and formed of a metallic material; and a second member including a second area surrounding the first area of the first wall and at least a portion of the second wall, and formed of a synthetic resin material. The antenna overlaps the second member. | 10-03-2013 |
20130257713 | OPERATION DISPLAY DEVICE AND OPERATION DISPLAY METHOD - According to one embodiment, an operation display device includes: display; proximity detectors; icon storage module; arrangement storage module; position acquisition module; and starting module. The proximity detectors each detect proximity of a terminal to the display and acquire identification information of the terminal. The icon storage module stores therein an icon displayed on the display, a computer program, and first position information of the icon on the display, in association with each other. The arrangement storage module stores therein position information of each of the proximity detectors. The position acquisition module acquires second position information of the terminal in proximity to the display. The starting module starts, if the first position information is identical or similar to the second position information, a corresponding program based on the identification information acquired by one of the proximity detectors having the position information closest to the second position information. | 10-03-2013 |
20130257714 | ELECTRONIC DEVICE AND DISPLAY CONTROL METHOD - According to one embodiment, an electronic device includes: a housing; a display device in the housing, the display device comprising a screen; an imaging module in the housing, the imaging module being configured to take an image in front of the screen; an acceleration sensor in the housing; and a display controller configured to control the display device to display an image based on first image data on the screen, and configured to change the image displayed on the screen based on a change with time in second image data taken by the imaging module and a change with time in acceleration data obtained by the acceleration sensor to suppress, when the housing moves relative to at least one viewpoint in front of the screen, a change in appearance of the image displayed on the screen as viewed from the viewpoint. | 10-03-2013 |
20130257715 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM - An information processing apparatus includes a communication unit, a storage, and a controller. The storage is configured to store rule information indicating a rule for calculating points of a user in accordance with a type of an operation by the user. The controller is configured to calculate, based on a detected operation of a first user and the rule information, points of the first user in accordance with a type of the detected operation, to control the storage to store first point information indicating the calculated points, and to control the communication unit to perform one of transmission of the stored first point information to another information processing apparatus and reception of second point information from the other information processing apparatus, the second point information indicating points of a second user and being stored in the other information processing apparatus. | 10-03-2013 |
20130257716 | INTERACTIVE INPUT SYSTEM AND METHOD - A method of operating an interactive input system, comprises detecting user interaction with an interactive surface; acquiring schedule information from a scheduler; and transitioning said interactive input system to an operating mode according to at least one of said user interaction and said schedule information. | 10-03-2013 |
20130257717 | ELECTRONIC DEVICE AND METHOD FOR OPERATING ELECTRONIC DEVICE - Electronic devices and methods for operating the electronic devices are disclosed. The electronic device includes a first body and a second body that can be magnetically connected with each other. The method includes: obtaining an attachment position at which the first body is attached to the second body; controlling the electronic device to operate in a first mode when the first body is attached to the second body at a first attachment position; and controlling the electronic device to operate in a second mode when the first body is attached to the second body at a second attachment position. The first attachment position is different from the second attachment position, and the first mode is different from the second mode. | 10-03-2013 |
20130257718 | SYSTEM WITH 3D USER INTERFACE INTEGRATION - Disclosed is a system comprising a handheld device and at least one display, where the handheld device is adapted for performing at least one action in a physical 3D environment, where the at least one display is adapted for visually representing the physical 3D environment, and where the handheld device is adapted for remotely controlling the view with which the 3D environment is represented on the display. | 10-03-2013 |
20130257719 | SPHERICAL THREE-DIMENSIONAL CONTROLLER - A three-dimensional control apparatus including a casing, the casing including a first surface and a second surface, the first surface being opposite to the second surface; and a three-dimensional (3D) controller including a first cap actuator, the first cap actuator including a first rounded control surface, at least a portion of the first rounded control surface extending beyond the first surface of the casing; a second cap actuator, the second cap actuator including a second rounded control surface, at least a portion of the second rounded control surface extending beyond the first surface of the casing, the first rounded control surface being aligned with the second rounded control surface; a first sensor to detect force on the first cap actuator; and a second sensor to detect force on the second cap actuator. | 10-03-2013 |
20130265216 | MULTI-STATE IMOD WITH RGB ABSORBERS - A display apparatus may include a multi-state IMOD, such as an analog IMOD (AIMOD), a 3-state IMOD (such as having a white state, a black state and one colored state) or a 5-state IMOD (such as having a white state, a black state and three colored states). The multi-state IMOD may include a movable reflective layer and an absorber stack. The absorber stack may include a first absorber layer having a first absorption coefficient and a first absorption peak at a first wavelength, a second absorber layer having a second absorption coefficient and a second absorption peak at a second wavelength, and a third absorber layer having a third absorption coefficient and a third absorption peak at a third wavelength. The first, second and third absorber layers may have absorption levels that drop to nearly zero at the center of each neighboring absorber layer's absorption peak. | 10-10-2013 |
20130265217 | DISPLAY CONTROL SYSTEM, DISPLAY CONTROL METHOD, COMPUTER-READABLE STORAGE MEDIUM HAVING STORED THEREON DISPLAY CONTROL PROGRAM, AND DISPLAY CONTROL APPARATUS - While an example terminal device is being held with its long sides extending horizontally, when a left-right direction is inputted by using a right analog stick, a first virtual camera is directed to a left-right direction in a virtual space. On the other hand, while the terminal device is being held with its long sides extending vertically, when an apparent left-right direction, that is, an up-down direction in the state where the terminal device is held with its long sides extending horizontally, is inputted by using a right analog stick, the first virtual camera is directed to the left-right direction in the virtual space. | 10-10-2013 |
20130265218 | GESTURE RECOGNITION DEVICES AND METHODS - Devices and related methods are disclosed herein that generally involve detecting and interpreting gestures made by a user to generate user input information for use by a digital data processing system. In one embodiment, a device includes first and second sensors that observe a workspace in which user gestures are performed. The device can be set to a keyboard input mode, a number pad input mode, or a mouse input mode based on the positioning of the user's hands. Subsequent gestures made by the user can be interpreted as keyboard inputs, mouse inputs, etc., using observed characteristics of the user's hands and various motion properties of the user's hands. These observed characteristics can also be used to implement a security protocol, for example by identifying authorized users by the anatomical properties of their hands or the behavioral properties exhibited by the user while gesturing. | 10-10-2013 |
20130265219 | INFORMATION PROCESSING APPARATUS, PROGRAM, AND INFORMATION PROCESSING METHOD - According to an illustrative embodiment, an information processing apparatus includes an imaging unit; and an image generation unit to generate a display image based on a distance between the imaging unit and an object, wherein the distance is detected by using a plurality of images obtained by the imaging unit at respective focus distances. | 10-10-2013 |
20130265220 | SYSTEM AND METHOD FOR COMBINING THREE-DIMENSIONAL TRACKING WITH A THREE-DIMENSIONAL DISPLAY FOR A USER INTERFACE - Systems and methods for combining three-dimensional tracking of a user's movements with a three-dimensional user interface display are described. A tracking module processes depth data of a user performing movements, for example, movements of the user's hand and fingers. The tracked movements are used to animate a representation of the hand and fingers, and the animated representation is displayed to the user using a three-dimensional display. Also displayed are one or more virtual objects with which the user can interact. In some embodiments, the interaction of the user with the virtual objects controls an electronic device. | 10-10-2013 |
20130265221 | FLEXIBLE DISPLAY APPARATUS AND METHOD FOR PROVIDING UI THEREOF - A flexible display apparatus and control method thereof is provided. The flexible display apparatus may include a display, a sensor that senses one of a first deformed shape of the flexible display apparatus and a second deformed shape of the flexible display apparatus, and a controller which controls the display to display a first user interface (UI) corresponding to the first sensed shape in response to sensing the first shape and a second user interface corresponding to the second sensed shape in response to sensing the second shape, receives an input on the displayed UI, and transmits a control signal to a controlled apparatus that instructs the controlled apparatus to execute a function of the controlled apparatus corresponding to the input. | 10-10-2013 |
20130265222 | Zoom-based gesture user interface - A method includes arranging, by a computer, multiple interactive objects as a hierarchical data structure, each node of the hierarchical data structure associated with a respective one of the multiple interactive objects, and presenting, on a display coupled to the computer, a first subset of the multiple interactive objects that are associated with one or more child nodes of one of the multiple interactive objects. A sequence of three-dimensional (3D) maps including at least part of a hand of a user positioned in proximity to the display is received, and the hand performing a transverse gesture followed by a grab gesture followed by a longitudinal gesture followed by an execute gesture is identified in the sequence of three-dimensional (3D) maps, and an operation associated with the selected object is accordingly performed. | 10-10-2013 |
20130265223 | DATA SERVICES BASED ON GESTURE AND LOCATION INFORMATION OF DEVICE - With the addition of directional information and gesture based input in a location based services environment, a variety of service(s) can be provided on top of user identification or interaction with specific object(s) of interest. For instance, when a user gestures at or points at a particular item, or gestures at a particular location or place, this creates an opportunity, e.g., an advertising opportunity, for anyone having an interest in that particular item or place to communicate with the user regarding that item or related items at a point in time when the user's focus is on the particular item. User context for the interaction can also be taken into account to supplement the provision of one or more interactive direction based services. | 10-10-2013 |
20130265224 | DISPLAY SYSTEM, DISPLAY DEVICE, DISPLAY CONTROL METHOD, AND PHARMACEUTICAL MANAGEMENT SYSTEM - In a display system, a remote control has a first signal production component for producing a first image switching signal, which is a command signal to switch between first image data, which includes confidential information, and second image data, which is non-confidential information, in response to a manipulation component being manipulated by a specific user. A display device has a first image switching component that directs switching between the first image data and the second image data displayed on the display device, according to the first image switching signal received from the remote control. When the first image switching signal is inputted, the first image switching component executes first image switching control in which the first image data displayed on the display device is switched to the second image data and the second image data displayed on the display device is switched to the first image data. | 10-10-2013 |
20130265225 | CONTROLLING AND ACCESSING CONTENT USING MOTION PROCESSING ON MOBILE DEVICES - Handheld electronic devices including motion sensing and processing. In one aspect, a handheld electronic device includes a set of motion sensors provided on a single sensor wafer, including at least one gyroscope sensing rotational rate of the device around at least three axes and at least one accelerometer sensing gravity and linear acceleration of the device along the at least three axes. Memory stores sensor data derived from the at least one gyroscope and accelerometer, where the sensor data describes movement of the device including a rotation of the device around at least one of the three axes of the device, the rotation causing interaction with the device. The memory is provided on an electronics wafer positioned vertically with respect to the sensor wafer and substantially parallel to the sensor wafer. The electronics wafer is vertically bonded to and electrically connected to the sensor wafer. | 10-10-2013 |
20130265226 | DISPLAY DEVICE AND METHOD OF PROVIDING FEEDBACK FOR GESTURES THEREOF - An image display device is provided. The image display device comprises a speaker, a camera for receiving a user's gesture, and a controller for outputting sound of the speaker toward the user's position. Thereby, the image display device can efficiently provide feedback for a gesture. | 10-10-2013 |
20130271359 | MULTI-MONITOR DISPLAY SYSTEM - A multi-monitor display system includes a main monitor, at least one secondary monitor, a main input device, at least one secondary input device, a storage module, and a control module. The storage module stores a plurality of display modes of the main monitor and the secondary monitor, and a table defining the display contents of the main monitor and the at least one secondary monitor in each display mode, which of the at least one secondary input device is enabled in each display mode, and which of the at least one secondary monitor can be operated by the enabled input device in the corresponding display mode. The control module controls the main monitor and the at least one secondary monitor to display in a selected mode, and controls, according to the table, which of the main input device and the at least one secondary input device operates the monitors. | 10-17-2013 |
20130271360 | INTERACTING WITH A DEVICE USING GESTURES - Systems, methods, apparatuses, and computer-readable media are provided for engaging and re-engaging a gesture mode. In one embodiment, a method performed by the computer system detects an initial presence of a user pose, indicates to a user progress toward achieving a predetermined state while continuing to detect the user pose, determines that the detection of the user pose has reached the predetermined state, and responds to the detection of the user pose based on determining that the detection has reached the predetermined state. The computer system may further prompt the user by displaying a representation of the user pose corresponding to an option for a user decision, detecting the user decision based at least in part on determining that the detection of the user pose has reached the predetermined state, and responding to the user decision. | 10-17-2013 |
20130271361 | METHOD AND APPARATUS FOR DETECTING TALKING SEGMENTS IN A VIDEO SEQUENCE USING VISUAL CUES - A method and system for detecting temporal segments of talking faces in a video sequence using visual cues. The system detects talking segments by classifying talking and non-talking segments in a sequence of image frames using visual cues. The present disclosure detects temporal segments of talking faces in video sequences by first localizing face, eyes, and hence, a mouth region. Then, the localized mouth regions across the video frames are encoded in terms of integrated gradient histogram (IGH) of visual features and quantified using evaluated entropy of the IGH. The time series data of entropy values from each frame is further clustered using online temporal segmentation (K-Means clustering) algorithm to distinguish talking mouth patterns from other mouth movements. Such segmented time series data is then used to enhance the emotion recognition system. | 10-17-2013 |
20130271362 | Method, System and Program Product for Enhancing a Graphical User Interface - A method, system and program product comprise detecting a first instance of an input device in proximity of a surface of a computer interface. Information regarding the input device and a first area of the surface in proximity of the input device is obtained. Elements of the first area are enhanced. The enhancing is determined at least in part by the information, and facilitates selection of at least one of the elements. | 10-17-2013 |
20130271363 | Electronic Remote Control Thimble - The Thimble is a wireless electronic device that can be worn on a finger in order to control electronic devices such as computer screens, laptop screens, televisions, game consoles, etc. The Thimble can replace either a computer mouse or a remote control. The Thimble uses location technologies in order to calculate its position, movement and orientation. Touch pads allow for accepting user selections, similar to mouse left-clicks or right-clicks, or a remote control OK function. | 10-17-2013 |
20130271364 | Human Stimulus Activation and Deactivation of a Screensaver - Devices and methods are disclosed which relate to an electronic device having a human stimulus receptor which, when activated, suspends activation of a screensaver. The screensaver is activated to conserve the power and life of the electronic device. When latently viewing the electronic device, however, the human stimulus receptor is activated. A countdown starts counting down a pre-determined amount of time once the human stimulus receptor is inactive. At the expiration of the countdown, the screensaver is activated. The human stimulus receptor responds to skin conductivity, natural muscular twitch, pulse, skin temperature, and/or eye movement. Only when the electronic device no longer detects any of these human stimuli will the countdown begin. A user may set the predetermined amount of time. | 10-17-2013 |
20130271365 | IMAGE MAGNIFICATION BASED ON DISPLAY FLEXING - Systems and methods control resizing a presentation of an image on a flexible display. An initial presentation of an image is provided on a flexible display. A first flexing of the flexible display away from an unflexed configuration is determined and a return of the flexible display to the unflexed configuration is determined within a defined time period after determining the first flexing. A second flexing of the flexible display away from the unflexed configuration is determined within the defined time period after determining the first flexing and after determining the return. The first flexing is separate from the second flexing. At least a portion of the initial presentation is resized in a first manner in response to determining the first flexing and in response to determining the second flexing within the defined time period after determining the first flexing. | 10-17-2013 |
20130271366 | IMAGE DISPLAY DEVICE, IMAGE DISPLAY METHOD, AND PROGRAM - An image display device includes: a display unit that displays images; an imaging unit that captures an image of a subject that faces the display unit; a detection unit that detects motion with respect to the image display device by the user of the image display device; an estimation unit that, when predetermined motion has been detected by the detection unit, estimates the inclination with respect to the display unit of a person's face that was captured by the imaging unit; and a display orientation alteration unit that alters the orientation of images displayed on the display unit according to the inclination that was estimated. | 10-17-2013 |
20130271367 | DISPLAY MANAGEMENT APPARATUS, PROGRAM, AND DISPLAY MANAGEMENT METHOD - A controller causes a display unit to display N candidate values selected from natural numbers in a target range having a predetermined base value as the maximum value, as candidates to be selected by a user. The controller selects M values that are multiples of a plurality of different reference values, from among the natural numbers in the target range as candidate values, and causes the display unit to display a candidate value list that lists N candidate values that include the M values. | 10-17-2013 |
20130271368 | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD AND COMPUTER PROGRAM - Provided is an information processing device capable of executing a larger variety of processes on screen data on the basis of motion of the device body. | 10-17-2013 |
20130278492 | DISTRIBUTED, PREDICTIVE, DICHOTOMOUS DECISION ENGINE FOR AN ELECTRONIC PERSONAL ASSISTANT - A system, method and user interface are described for providing a personal assistant functionality using a predictive, adaptive, dichotomous (two choices) decision engine that proactively prompts the user for decisions on matters deemed relevant by the decision engine based on past user decisions and activities. | 10-24-2013 |
20130278493 | GESTURE CONTROL METHOD AND GESTURE CONTROL DEVICE - A gesture control method includes steps of capturing at least one image; detecting whether there is a face in the at least one image; if there is a face in the at least one image, detecting whether there is a hand in the at least one image; if there is a hand in the at least one image, identifying a gesture performed by the hand and identifying a relative distance or a relative moving speed between the hand and the face; and executing a predetermined function in a display screen according to the gesture and the relative distance or according to the gesture and the relative moving speed. | 10-24-2013 |
20130278494 | THREE-DIMENSIONAL INTERACTIVE SYSTEM - A three-dimensional interactive system includes at least one image capturing device, a processor and a display. The at least one image capturing device is used for sensing positional changes of an object with time along three axes in three-dimensions. The processor is used for generating image data according to the positional changes of the object with time along the three axes. The display is used for displaying the image data. | 10-24-2013 |
20130278495 | All New One Stroke Operation Control Devices - The invention is an all new One Stroke operation and control device, which consists of a trajectory input subsystem and a trajectory recognition subsystem. Specifically, users may use their hands to directly engage in operation and control. A single person can use both hands to operate and control. Therefore, the invention facilitates easy-to-learn and easy-to-use operation and control that does not rely on any traditional tool such as a remote controller. One Stroke devices are even able to provide reverse manipulation. In summary, most users will have an all new easy operation and control experience. | 10-24-2013 |
20130278496 | ELECTRONIC DISPLAY DEVICE AND METHOD FOR ADJUSTING USER INTERFACE - An electronic display device acquires an image of a user in front of a screen of the device. The electronic display device calculates an area of the user in the image of the user and calculates an area ratio of the area of the user in the image of the user to a predefined value. The electronic display device provides a plurality of user interfaces and a plurality of ratio scopes corresponding to the plurality of user interfaces, and selects, according to the ratio scope in which the area ratio falls, the corresponding one of the plurality of user interfaces to display on the screen. | 10-24-2013 |
20130278497 | VIRTUAL IMAGE DISPLAY APPARATUS - An attitude information detection unit detects the attitude of a wearer, and an arithmetic processing unit and a device position drive unit, which form an image area position adjustment mechanism, translate a liquid crystal display device that is a light modulating device relative to a light guide section in a plane parallel to a light incident surface of the light guide section based on a detection result from the attitude information detection unit, whereby the position of an image area can be so changed that an image moves in the direction opposite the direction in which the wearer moves. As a result, when the wearer moves, the image does not follow the motion of the wearer as if the image remained still, whereby the wearer who moves will have a reduced amount of unpleasant sensation resulting from the motion. | 10-24-2013 |
20130278498 | PRESSURE SENSITIVE TOUCH PANEL AND PORTABLE TERMINAL INCLUDING THE SAME - A pressure sensitive touch panel is provided, which includes first sensor lines arranged along a first axis; second sensor lines arranged along a second axis crossing the first axis; a drive unit that sequentially applies a scan signal to the first sensor lines, and sequentially detects detection signals of the second sensor lines; and a controller that controls the drive unit to selectively apply the scan signal to one of the first sensor lines and to selectively detect a detection signal from one of the second sensor lines. | 10-24-2013 |
20130278499 | GESTURE INPUT WITH MULTIPLE VIEWS, DISPLAYS AND PHYSICS - Gesture input with multiple displays, views, and physics is described. In one example, a method includes generating a three dimensional space having a plurality of objects in different positions relative to a user and a virtual object to be manipulated by the user, presenting, on a display, a displayed area having at least a portion of the plurality of different objects, detecting an air gesture of the user against the virtual object, the virtual object being outside the displayed area, generating a trajectory of the virtual object in the three-dimensional space based on the air gesture, the trajectory including interactions with objects of the plurality of objects in the three-dimensional space, and presenting a portion of the generated trajectory on the displayed area. | 10-24-2013 |
20130278500 | SIDE-TYPE FORCE SENSE INTERFACE - A force sense interface includes a tactile finger base having a plurality of tactile fingers, which are capable of tracking motions of the fingers of a hand of an operator, an arm mechanism, which allows spatial motion of the tactile finger base, and a controller, which controls the arm mechanism in accordance with the position and posture of the hand, and controls the tactile fingers in accordance with the movements of the fingers of the operator. The force sense interface further includes finger holders for attaching the tactile fingers to the fingers of the operator in a state where the tactile finger base is distanced from the back of the hand of the operator (H) so as to face the back of the hand. | 10-24-2013 |
20130285893 | Upload An Image to a Website Server Using a Pointing Device - A method of uploading an image to a website server receives position data defining, on a display screen, an image area of an image to be uploaded. An image file of the image area is created and uploaded to the website server. In some examples, the position data are provided by a pointing device with an image upload button. | 10-31-2013 |
20130285894 | PROCESSING IMAGE INPUT TO COMMUNICATE A COMMAND TO A REMOTE DISPLAY DEVICE - A method is disclosed for operating a mobile device. The method is performed by one or more processors of the mobile device. The one or more processors process image input on the mobile device in order to detect one or more graphic objects displayed on a remote display device and to detect one or more fingers of a user in relation to the one or more graphic objects. From processing the image input, a command for the remote display device is determined based on a position or movement of the one or more fingers in relation to the one or more graphic objects. The command is communicated to the remote display device. | 10-31-2013 |
20130285895 | Graphical Programming System with Native Access to External Memory Buffers - A system and method for enabling a graphical program to natively access an external memory buffer are disclosed. The graphical program may execute within a graphical program execution environment, and the external memory buffer may be allocated by another program that executes externally from the graphical program and the graphical program execution environment. The graphical program may be executed concurrently with a producer program that stores data in the memory buffer, and/or with a consumer program that reads and uses the data from the memory buffer. The memory buffer may be located within a region of memory allocated by the producer program, by the consumer program, or by another program that executes externally from the graphical program and the graphical program execution environment, such as a memory manager program. | 10-31-2013 |
20130285896 | INTERACTIVE DISPLAY DEVICE AND CONTROL METHOD THEREOF - An interactive display device which senses an external device and operates according to the sensed result and a control method thereof are disclosed. A method for controlling an interactive display device includes detecting an external device on a surface of the interactive display device, wherein the external device includes a plurality of identification markers sensed by the interactive display device, scanning a shape of the external device, sensing signals of the identification markers, determining coordinate information and contact information of the external device on the surface of the interactive display device based on a result of scanning the shape and sensing the signals, wherein the contact information indicates which side of the external device is in contact with the interactive display device, and displaying data around the external device according to the determined coordinate information and contact information. | 10-31-2013 |
20130285897 | FLAT DISPLAY SYSTEM AND OPERATION METHOD THEREOF - A flat display system includes a first flat device and a second flat device. The second flat device communicates with the first flat device via wireless communication. When the first flat device is located within a predetermined distance from the second flat device, the first flat device is transformed into an input device of the second flat device. | 10-31-2013 |
20130285898 | SYSTEM AND METHOD FOR IMPLEMENTING USER INTERFACE - A system has an input device for collecting gesture information of a user, a computing device for processing the gesture information collected by the input device, a memory for storing information of executive trajectories for executing various functions, and an output device for displaying the information processed by the computing device, wherein each executive trajectory is defined as an executive input for each function, if a gesture performed by the user completes one of the executive trajectories, a function corresponding to the corresponding executive trajectory is executed, wherein, if the user starts a gesture, the computing device compares a path of a trajectory of a gesture performed by the user with start paths of the executive trajectories and selects candidate trajectories having similarity higher than a preset criterion, and wherein the candidate trajectories are displayed by the output device to suggest path information of the candidate trajectories to the user. | 10-31-2013 |
20130285899 | METHOD FOR OUTPUTTING COMMAND BY DETECTING OBJECT MOVEMENT AND SYSTEM THEREOF - The present invention discloses a method for outputting a command by detecting a movement of an object, which includes the following steps. First, an image capturing device captures images generated by the movement of the object at different timings. Next, a motion trajectory is calculated according to the plurality of images. Then, a corresponding command is outputted according to the motion trajectory. The present invention also provides a system which employs the above-mentioned method. | 10-31-2013 |
20130285900 | MANIPULATING DEVICE AND DIRECTIONAL INPUT APPARATUS USING THE SAME - A manipulating device includes a first coil, an action unit, a control unit, a modulation unit, and a second coil. The first coil receives a charging signal to incur electromagnetic resonance, so as to generate an electric signal. The action unit generates an action signal. The control unit alternately generates a first control signal according to the electric signal in a first period and a second control signal according to the action signal in a second period. The modulation unit modulates the first control signal and the second control signal to generate a first modulated signal. The second coil generates an output signal according to the first modulated signal. A directional input apparatus uses the manipulating device and a sensing device to generate a directional control input signal. Therefore, the manipulating device uses the modulation unit and the second coil to transmit the output signal to the sensing device. | 10-31-2013 |
20130285901 | SYSTEM AND METHOD FOR TRACKING GAZE AT DISTANCE - A system and method for tracking a gaze at a distance are provided. A remote gaze tracking system may include an infrared lighting unit including a plurality of infrared lightings to emit an infrared light toward a user, a gaze tracking module to track a position of a face of the user, and to collect, from the tracked position of the face, an eye image including at least one reflected light among a plurality of corneal reflected lights and a lens-reflected light, the corneal reflected lights being reflected from a cornea by the emitted infrared light, and the lens-reflected light being reflected from a lens of glasses, and a processor to compare a magnitude of the lens-reflected light with a threshold in the collected eye image, and when the magnitude of the lens-reflected light is equal to or less than the threshold, to detect coordinates of a center of each of the plurality of corneal reflected lights, and to calculate a gaze position. | 10-31-2013 |
20130285902 | SYSTEM AND METHOD FOR ADJUSTING PRESENTATION OF MOVING IMAGES ON AN ELECTRONIC DEVICE ACCORDING TO AN ORIENTATION OF THE DEVICE - The invention relates to a system, method and device for changing a notional viewing location for a moving image on a device, depending on an orientation of the device. The moving image management system comprises: a sensor; a movement detection module connected to the sensor providing movement data registering a notable signal from the sensor; and a moving image adjustment module determining a new viewing location of the moving image utilizing the movement data and generating a replacement moving image representing the moving image as viewed from the new viewing location. | 10-31-2013 |
20130293452 | CONFIGURABLE HEADS-UP DASH DISPLAY - Methods and systems for a heads-up configurable vehicle dash display are provided. Specifically, a configurable dash may comprise one or more displays that are capable of receiving input from a user. At least one of these displays may be configured to present a plurality of custom applications that, when manipulated by at least one user, are adapted to control and/or monitor functions associated with a vehicle and/or associated peripheral devices. It is anticipated that the function and appearance of the plurality of custom applications may be altered via user and/or processor input. | 11-07-2013 |
20130293453 | FLEXIBLE DISPLAY DEVICE AND METHOD OF TRANSFERRING DATA BETWEEN FLEXIBLE INTERFACE DEVICES - A method of transferring data between a plurality of flexible interface devices, includes an operation in which the flexible interface devices are flexed due to external force from a user, an operation in which the flexible interface devices acquire flex information generated when the flexible interface devices are flexed, through a plurality of flex sensors, and an operation in which at least one of the flexible interface devices transfers data between the flexible interface devices based on the flex information. | 11-07-2013 |
20130293454 | TERMINAL AND METHOD FOR CONTROLLING THE SAME BASED ON SPATIAL INTERACTION - A terminal and method for controlling the terminal using spatial gestures are provided. The terminal includes a sensing unit which detects a user gesture moving an object in a certain direction within proximity of the terminal, a control unit which determines at least one of movement direction, movement speed, and movement distance of the user gesture and performs a control operation associated with a currently running application according to the determined at least one of movement direction, movement speed, and movement distance of the user gesture, and a display unit which displays an execution screen of the application under the control of the control unit. | 11-07-2013 |
20130293455 | METHOD FOR DETERMINING BENT STATE OF ELECTRONIC DEVICE, ELECTRONIC DEVICE AND FLEXIBLE SCREEN - A method for determining a bent state of an electronic device, an electronic device and a flexible screen are provided. The method includes: obtaining at least one parameter value, the at least one parameter value being obtained by the detecting device in the case that the flexible screen is bent; and determining a bent state of the flexible screen based on the at least one parameter value. The invention can detect the bent state of the flexible screen, improving the efficiency and accuracy of the detection. | 11-07-2013 |
20130293456 | APPARATUS AND METHOD OF CONTROLLING MOBILE TERMINAL BASED ON ANALYSIS OF USER'S FACE - An apparatus and method of controlling a mobile terminal by detecting a face or an eye in an input image are provided. The method includes performing face recognition on an input image captured by an image input unit equipped on the front face of the mobile terminal; determining, based on the face recognition, user state information that includes whether a user exists, a direction of the user's face, a distance from the mobile terminal, and/or a position of the user's face; and performing a predetermined function of the mobile terminal according to the user state information. According to the method, functions of the mobile terminal may be controlled without direct inputs from the user. | 11-07-2013 |
20130293457 | TERMINAL AND METHOD FOR IRIS SCANNING AND PROXIMITY SENSING - A method of iris scanning and proximity sensing includes: receiving selection information of an operation mode; sensing an iris, including emitting light having an amount of light of a first level and photographing an iris using the emitted light, if the selected operation mode is an iris scanning mode; sensing a proximity, including emitting light having an amount of light of a second level and sensing information on whether an object has approached using the emitted light, if the selected operation mode is a proximity sensing mode; and recognizing the iris using the photographed iris image, and performing a function according to the sensed information on whether the object has approached, wherein the first level has a value higher than the value of the second level. | 11-07-2013 |
20130293458 | METHOD FOR CONTROLLING A DISPLAY APPARATUS USING A CAMERA BASED DEVICE AND MOBILE DEVICE, DISPLAY APPARATUS, AND SYSTEM THEREOF - A method of controlling a display apparatus using a camera based device and a mobile device, display apparatus, and system applying the same are provided. The method for controlling the display apparatus with the mobile device includes capturing an entity which is displayed on the display apparatus in response to a movement of the mobile device, sensing at least one of a movement and a change of size of the captured entity, generating a control signal for controlling the display apparatus based on the at least one of the sensed movement and change of size of the entity, and transmitting the generated control signal to the display apparatus. | 11-07-2013 |
20130293459 | IMAGE DISPLAY DEVICE WITH IMAGING UNIT - An image display unit comprises a display area including pixels, and an imaging unit disposed at its rear side behind the display area. Light transmissive sections are located within the display area and correspond to the pixels. The light transmissive sections are configured to separately receive light incident upon the image display unit, and pass the received light to the imaging unit. | 11-07-2013 |
20130293460 | COMPUTER VISION BASED CONTROL OF AN ICON ON A DISPLAY - A method for computer vision based hand gesture device control is provided. The method includes receiving a sequence of images of a field of view; applying a shape recognition algorithm on the sequence of images to detect a shape of a first posture of a hand; and generating a command to initiate device control based on the detection of the shape of the first posture of the hand. | 11-07-2013 |
20130293461 | Method And System For Determining How To Handle Processing Of An Image Based On Motion - A mobile multimedia device may be operable to initiate capture of a series of image samples of a scene, where the scene may comprise one or more objects that may be identifiable by the mobile multimedia device. An image for the scene may be determined by the mobile multimedia device utilizing the captured image samples based on motion associated with the identifiable objects. | 11-07-2013 |
20130293462 | METHOD AND APPARATUS FOR CONTROLLING A COMPUTING SYSTEM - A handheld computing device is introduced comprising one or more motion detection sensors and a motion control agent. The motion detection sensors detect motion of the computing device in one or more of six (6) fields of motion and generate an indication of such motion. The motion control agent, responsive to the indications of motion received from the motion sensors, generates control signals to modify one or more of the operating state and/or the displayed content of the computing device based, at least in part, on the received indications. | 11-07-2013 |
20130300643 | SYSTEMS, METHODS, AND DEVICES FOR EVALUATING SIGNAL QUALITY AND RANGE - Systems, methods, and devices for evaluating wireless signal quality between environmental sensing and control devices. A signal quality device includes a sensor module, one or more user control devices, one or more display devices, and a processing unit. The signal quality device is configured to generate a first signal following activation of at least one of the one or more user control devices, wirelessly transmit the first signal, receive a second signal in response to the first signal, and activate the one or more display devices to provide an indication of the signal quality of the first signal. A wireless environmental controller includes a processing unit and is configured to receive the first signal, determine the signal quality of the first signal, generate the second signal related to the signal quality of the first signal, and wirelessly transmit the second signal. | 11-14-2013 |
20130300644 | System and Methods for Controlling a User Experience - System and methods for controlling a user experience are described. In an aspect, an interface can comprise an interface device for rendering content to a user, a sensor having a gesture zone associated therewith configured to detect a dexterous gesture of a user within the gesture zone and generate a sensor signal representing the dexterous gesture. A processor may be provided in communication with the sensor and the interface device, wherein the processor receives the sensor signal, analyzes the sensor signal to determine a control action associated with the detected dexterous gesture of the user, and configures the user interface based upon the determined control action of the user. | 11-14-2013 |
20130300645 | Human-Computer Interface System - A human-computer interface system which includes a computer, the computer including a central processing unit, a first memory, a second memory, at least one storage device, a system hub, an input/output adapter for connecting the at least one storage device to the system hub, an operating system installed onto the at least one storage device, applications software installed onto the at least one storage device, a communication network, the computer being connected to the communication network, a means for inputting and detecting data and commands generated by a user, a plurality of output converters, and a means for allowing the computer to develop and express the computer's emotional states. | 11-14-2013 |
20130300646 | GRAPHIC CARD FOR COLLABORATIVE COMPUTING THROUGH WIRELESS TECHNOLOGIES - A graphics card is provided. The graphics card comprises: a Graphics Processing Unit (GPU) for data computing; and a wireless controller for wirelessly receiving data from other graphics cards or sending data to the other graphics cards, and communicating with the GPU by bus. The graphics card provided by the present invention can provide a low-cost solution with more powerful computing capabilities to meet the demands for computing complex problems in the fields of commerce, industry, and science. | 11-14-2013 |
20130300647 | Accessory Device Architecture - An accessory device architecture is described. In one or more implementations, data is received from an accessory device at an intermediate processor of a computing device, the data usable to enumerate functionality of the accessory device for operation as part of a computing device that includes the intermediate processor. The data is passed by the intermediate processor to an operating system executed on a processor of the computing device to enumerate the functionality of the accessory device as part of the intermediate processor. | 11-14-2013 |
20130300648 | AUDIO USER INTERACTION RECOGNITION AND APPLICATION INTERFACE - Disclosed is an application interface that takes into account the user's gaze direction relative to who is speaking in an interactive multi-participant environment where audio-based contextual information and/or visual-based semantic information is being presented. Among these various implementations, two different types of microphone array devices (MADs) may be used. The first type of MAD is a steerable microphone array (a.k.a. a steerable array) which is worn by a user in a known orientation with regard to the user's eyes, and wherein multiple users may each wear a steerable array. The second type of MAD is a fixed-location microphone array (a.k.a. a fixed array) which is placed in the same acoustic space as the users (one or more of which are using steerable arrays). | 11-14-2013 |
20130300649 | Headset Computer Operation Using Vehicle Sensor Feedback for Remote Control Vehicle - A system performs stable control of moving devices (such as a helicopter or robot) with attached camera(s), providing live imagery back to a head-mounted computer (HMC). The HMC controls the moving device. The HMC user specifies a desired path or location for the moving device. Camera images enable the user-specified instructions to be followed accurately and the device's position to be maintained thereafter. A method of controlling a moving device with a headset computer includes analyzing, at the headset computer, at least one image received from the moving device to form an indication of change in position of the moving device. The method displays to a user of the headset computer the indication of change in position of the moving device. The method can additionally include enabling the user to control the moving device. | 11-14-2013 |
20130300650 | CONTROL SYSTEM WITH INPUT METHOD USING RECOGNITION OF FACIAL EXPRESSIONS - The disclosure relates to a control system with an input method using recognition of facial expressions. The system includes an image capturing unit, an image processing unit, a database, and a computing unit. The image capturing unit captures an input image having a facial expression when a user uses lip language. The image processing unit, connected with the image capturing unit, is used to receive and recognize the facial expression shown in the input image. The database stores a plurality of reference images, each of which indicates a corresponding control command. The computing unit, connected with the image processing unit and the database, performs a comparison between the facial expression recognized by the image processing unit and the reference images retrieved from the database. The comparison identifies the control command, which is used to operate an electronic device by this control system. | 11-14-2013 |
20130300651 | APPARATUS AND METHOD FOR CONTROLLING ELECTRONIC DEVICE - Provided are an apparatus and method for controlling an electronic device. The apparatus includes a plurality of sensors to detect manipulation by a user, a control unit to recognize a motion pattern based on the user manipulations detected by the plurality of sensors and to determine an operation to be executed in accordance with the recognized user's motion pattern, and a transmitting unit to transmit a digital signal for an electronic device to execute the operation determined by the control unit. | 11-14-2013 |
20130300652 | Unlocking a Screen Using Eye Tracking Information - Methods and systems for unlocking a screen using eye tracking information are described. A computing system may include a display screen. The computing system may be in a locked mode of operation after a period of inactivity by a user. Locked mode of operation may include a locked screen and reduced functionality of the computing system. The user may attempt to unlock the screen. The computing system may generate a display of a moving object on the display screen of the computing system. An eye tracking system may be coupled to the computing system. The eye tracking system may track eye movement of the user. The computing system may determine that a path associated with the eye movement of the user substantially matches a path associated with the moving object on the display and switch to be in an unlocked mode of operation including unlocking the screen. | 11-14-2013 |
20130300653 | GAZE DETECTION IN A SEE-THROUGH, NEAR-EYE, MIXED REALITY DISPLAY - The technology provides various embodiments for gaze determination within a see-through, near-eye, mixed reality display device. In some embodiments, the boundaries of a gaze detection coordinate system can be determined from a spatial relationship between a user eye and gaze detection elements such as illuminators and at least one light sensor positioned on a support structure such as an eyeglasses frame. The gaze detection coordinate system allows for determination of a gaze vector from each eye based on data representing glints on the user eye, or a combination of image and glint data. A point of gaze may be determined in a three-dimensional user field of view including real and virtual objects. The spatial relationship between the gaze detection elements and the eye may be checked and may trigger a re-calibration of training data sets if the boundaries of the gaze detection coordinate system have changed. | 11-14-2013 |
20130300654 | DISPLAY CONTROL DEVICE AND DISPLAY CONTROL METHOD - A display control device | 11-14-2013 |
20130307762 | METHOD AND APPARATUS FOR ATTRACTING A USER'S GAZE TO INFORMATION IN A NON-INTRUSIVE MANNER - Methods, apparatuses, and computer program products are herein provided for attracting a user's gaze to information associated with a portion of a display in a non-intrusive manner. A method may include determining to attract a user's gaze to information associated with a portion of a display. The method may further include causing presentation of a visual attractant on the display proximate the portion of the display. The method may further include causing presentation of the visual attractant on the display to be ceased in an instance in which the user's gaze is determined to be moving toward the information. Corresponding apparatuses and computer program products are also provided. | 11-21-2013 |
20130307763 | FIELD ANALYZER - A visual display of the modulation envelope of an amplitude-modulated RF electric field produced by a field analyzer comprising a field sensor for generating digital samples of the field, a field processor connected to the field sensor for generating a web page, and a personal computer for retrieving and displaying the web page. By using a web page and displaying the web page on a personal computer, it is possible to carry out tasks, such as correcting for nonlinearity of a detector in the field sensor, in the personal computer where they can be performed more efficiently. | 11-21-2013 |
20130307764 | METHOD, APPARATUS, AND SYSTEM FOR ADAPTING THE PRESENTATION OF USER INTERFACE ELEMENTS BASED ON A CONTEXTUAL USER MODEL - A method, apparatus, and system for adapting the presentation of user interface elements based on a contextual user model includes using passive interaction data, such as gaze-tracking inputs and/or certain proximity inputs, to determine an aspect of the user's current interaction context (e.g., the user's current focus of attention or current hand position). User interface elements may be changed or relocated based on the user's current interaction context. | 11-21-2013 |
20130307765 | Contactless Gesture-Based Control Method and Apparatus - A contactless gesture-based control method and apparatus are provided. The method includes: obtaining gestures/movements of a user; obtaining control instructions corresponding to the gestures/movements according to pre-stored mapping relationships; and executing the control instructions, where the gestures/movements include: an index finger draws a circle, and remaining four fingers clench into a fist; or, five fingers clench together and a palm moves in a direction where a device is; or, a thumb, the index finger, and a middle finger extend naturally, the remaining two fingers clench together, and the index finger moves freely; or, the thumb, the index finger, and the middle finger extend naturally, the remaining two fingers clench together, and the thumb moves up and down; or the thumb, the index finger, and the middle finger extend naturally, the remaining two fingers clench together, and the middle finger moves up and down. | 11-21-2013 |
20130307766 | USER INTERFACE SYSTEM AND METHOD OF OPERATION THEREOF - A method to provide a user interface, the method may be controlled by one or more controllers and may include one or more acts of obtaining image information from a sequence of images of a user; analyzing the image information to recognize the user; recognizing first and second reference objects at least one of which corresponding with respective body parts of the user from the image information; determining whether a first reference object has been placed over or is within a threshold scaled distance of a second reference object; calculating an interaction time between the first reference object and the second reference object when it is determined that the first reference object has been placed over or is within the threshold scaled distance of the second reference object; comparing the interaction time with a threshold reference time; and performing an action in accordance with results of the comparison. | 11-21-2013 |
20130307767 | DISPLAY DEVICE AND DISPLAY METHOD - A display device in which continuity of content displayed on a screen is maintained between before and after a rotation of the screen, including: a display screen which is fixed to the display device and displays a partial region of an image; an operation detecting unit which specifies a position in the partial region of the image displayed on the display screen; a rotation detecting unit which detects rotation information which indicates a rotation of the display device in a plane including the display screen; and a display image determining unit which updates display content on the display screen to display the partial region of the image after the rotation on the display screen by rotating, based on the rotation information detected by the detecting unit, the partial region of the image around the position specified by the specifying unit. | 11-21-2013 |
20130307768 | DISPLAY DEVICE AND CONTROL METHOD THEREOF - Disclosed are a display device and a control method thereof. The display device and the control method include a camera acquiring an image including a gesture made by a user, and a controller extracting an object making the gesture from the image acquired by the camera, and setting a specific spot in the extracted object to be a reference point of a movement of the object, the controller fixing the reference point to a set location regardless of a change in a shape of the extracted object. Accordingly, a reference point is set at a specific spot of an object having made a gesture corresponding to the acquisition of control thereon, thereby allowing for the accurate and effective recognition of a gesture made by a user. | 11-21-2013 |
20130307769 | MOBILE DEVICE ACTIVATION BY USER GRASP - Method and apparatus for mobile device activation by user grasp. An embodiment of a mobile device includes a cover, the cover including at least a first side and at least a first corner, the first corner adjoining the first side, with a concave indentation in the cover, the concave indentation being located at the first corner of the mobile device. The mobile device further includes a first touch sensor located in the concave indentation, the first touch sensor to generate a first signal upon physical contact or proximity with a user of the mobile device, and an activation module, the activation module to transition the mobile device from a deactivated state to an activated state upon receiving at least the first signal from the first touch sensor. | 11-21-2013 |
20130314311 | SYSTEM FOR PROJECTING 3D IMAGES AND DETECTING GESTURES - A three dimensional (3D) imaging system configured to display an autostereoscopic image of a scene toward a viewing area, and detect gestures occurring in the viewing area. The system includes an imaging device configured to project a plurality of projected images in distinct directions, and each projected image is characterized as a distinct perspective view of the scene. The imaging device is also configured to detect a plurality of received images for the purpose of detecting gestures. The system also includes a holographic diffuser, and a mirror arrangement configured to reflect the plurality of projected images from the imaging device toward the holographic diffuser to display an autostereoscopic image of the scene in the holographic diffuser, and reflect a plurality of perspective images from the viewing area toward the imaging device such that each received image corresponds to a distinct perspective view of the viewing area. | 11-28-2013 |
20130314312 | FULL RANGE GESTURE SYSTEM - This disclosure relates to an interactive display, having a front surface including a viewing area, and providing an input/output interface for a user of an electronic device. A planar light guide (PLG) disposed substantially parallel to the front surface, has a periphery at least coextensive with the viewing area. A light-emitting source (LES), disposed outside the periphery of the PLG, is optically coupled with a PLG input. The PLG outputs reflected light, in a direction substantially orthogonal to the front surface, by reflecting light received from the LES. A light collecting device (LCD) collects scattered light that results from interaction of the reflected light with a user-controlled object. The LCD redirects the collected scattered light toward one or more light sensors. A processor recognizes, from outputs of the light sensors, an instance of a user gesture, and controls the interactive display and/or electronic device, responsive to the user gesture. | 11-28-2013 |
20130314313 | DISPLAY WITH CODING PATTERN - A display ( | 11-28-2013 |
20130314314 | IMAGE DISPLAY APPARATUS - An image display apparatus includes a liquid crystal panel with a dual-view function to display two different first and second screen-sized images that are visible in a driver-seat direction and a passenger-seat direction, respectively. The first screen-sized image in the driver-seat direction contains a guidance image and a guidance manipulation image; the second screen-sized image in the passenger-seat direction contains a video image and a video manipulation image. When an occupant in the driver seat manipulates an icon within the guidance manipulation image on the liquid crystal display panel, the same guidance manipulation image is also displayed in the passenger-seat direction by replacing the video manipulation image. | 11-28-2013 |
20130321255 | NAVIGATING CONTENT IN AN HMD USING A PHYSICAL OBJECT - Technology is disclosed herein to help a user navigate through large amounts of content while wearing a see-through, near-eye, mixed reality display device such as a head mounted display (HMD). The user can use a physical object such as a book to navigate through content being presented in the HMD. In one embodiment, a book has markers on the pages that allow the system to organize the content. The book could have real content, but it could be blank other than the markers. As the user flips through the book, the system recognizes the markers and presents content associated with the respective marker in the HMD. | 12-05-2013 |
20130321256 | METHOD AND HOME DEVICE FOR OUTPUTTING RESPONSE TO USER INPUT - Discussed are a method and a home device for efficiently providing a user with a response to a user input when the home device is outputting audio and/or video content. The method includes outputting content including a video output and/or an audio output, receiving a user input, detecting an output condition of a response to the user input of the home device, and outputting the response to the user input. The output condition includes an audio condition of the content denoting importance of the audio output, a video condition of the content denoting importance of the video output, and a user detection condition denoting whether or not a user is detected around the home device, and the response to the user input includes an audio response and/or a video response. Output of the audio response and/or the video response is adjusted based on the detected output condition. | 12-05-2013 |
20130321257 | Methods and Apparatus for Cartographically Aware Gestures - Methods and apparatus for a map tool on a mobile device for implementing cartographically aware gestures directed to a map view of a map region. The map tool may base a cartographically aware gesture on an actual gesture input directed to a map view and based on map data for the map region that may include metadata corresponding to elements within the map region. The map tool may then determine, based on one or more elements of the map data, a modification to be applied to an implementation of the gesture. Given the modification to the gesture implementation, the map tool may then render, based on performing the modification to the gesture, an updated map view rather than an updated map view based solely on the user gesture. | 12-05-2013 |
20130321258 | TWO-WAY COMMUNICATION SYSTEM BY USING EYE MOVEMENTS - A two-way communication system by using eye movements of the present invention comprises a first display module, a second display module and a data processing module. The first display module is configured to display a first operative window interface and to detect the eye movements of the users to provide various operation and control. The second display module is configured to display user information, which allows the users to communicate with others immediately and conveniently. The data processing module is electrically connected with the first display module and the second display module, and responds to each operation command and transmits the information. The first and second display modules can be adjusted freely for users' convenient review and operation. | 12-05-2013 |
20130321259 | Double Sided Advertisement Billboard for Parking Lots - A Double Sided Advertisement Billboard for Parking Lots is a viewing panel that is positioned between two opposite parking spaces and elevated off the ground by supports in order to display advertising materials to people utilizing the parking spaces. The viewing panel has two opposite sides with identical components which can have different advertisements displayed utilizing different methods, including digital screens, poster boards, product cutouts or other features such as neon signs. A motion sensor detects the presence of a vehicle in the parking space in order to activate a light, screen or speaker. A user interface allows viewers to participate in interactive advertisements. | 12-05-2013 |
20130321260 | APPARATUS AND METHOD FOR DISPLAYING A SCREEN USING A FLEXIBLE DISPLAY - A method of displaying a display screen by using a display that is flexible is disclosed. The method includes detecting deformation of the display; determining whether an object is displayed within a set range from a folding line at which the deformation is detected; and modifying the displayed object and displaying the modified object when the object is displayed within the set range. | 12-05-2013 |
20130321261 | RECOGNITION APPARATUS - The acquiring unit is configured to acquire positions of a particular part of a user in time series. The calculator is configured to calculate a feature value of a motion of the particular part from the positions acquired in time series. The comparing unit is configured to compare the feature value with a first threshold. The recognizing unit is configured to recognize the motion of the particular part using a first recognition method and a second recognition method which is different from the first recognition method. The control unit is configured to control the recognizing unit to select the first recognition method for recognizing a position or a motion of the particular part when the feature value is smaller than the first threshold, and select the second recognition method when the feature value is equal to or larger than the first threshold. | 12-05-2013 |
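The selection rule in 20130321261 above amounts to: derive one scalar feature from the time-series positions, then pick between two recognition methods by comparing it against a first threshold. A minimal sketch, assuming mean speed as the feature value (the abstract does not specify which feature is used):

```python
import math

def mean_speed(positions, dt=1.0):
    # Feature value: average displacement per frame of the tracked part.
    steps = [math.dist(a, b) for a, b in zip(positions, positions[1:])]
    return sum(steps) / (len(steps) * dt) if steps else 0.0

def select_method(positions, first_threshold):
    # Small feature value -> first (position-oriented) recognition method;
    # otherwise -> second (motion-oriented) recognition method.
    feature = mean_speed(positions)
    return "first_method" if feature < first_threshold else "second_method"
```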
20130321262 | OPERATING SYSTEM WITH HAPTIC INTERFACE FOR MINIMALLY INVASIVE, HAND-HELD SURGICAL INSTRUMENT - A haptic system for a minimally invasive, hand-held surgical instrument and the system's various parts including a graphical user haptic interface, one or more haptic interfaces associated with a hand-held handle used to control a sensorized end-effector of the surgical instrument or inserted catheters, associated hardware, and an operating system. The system enables users to acquire, read, modify, store, write, and download sensor-acquired data in real time. The system can provide: an open, universally compatible platform capable of sensing or acquiring physiological signals/data in any format; processing of the sensor acquired data within an operating system; and outputting the processed signals to hardware which generates tangible sensations via one or more haptic interfaces. These tangible sensations can be modified by the user in real time as the system ensures the temporal relationship of sensed fiducial events are not altered or shifted relative to the generated and displayed haptic signals. | 12-05-2013 |
20130321263 | SYSTEM AND METHOD FOR UNLOCKING SCREEN OF PORTABLE ELECTRONIC DEVICE - A method executes a gyroscope and an accelerometer function to unlock a portable electronic device. A user can rotate the portable electronic device to set a predetermined rotation direction, a predetermined rotation angle, and a predetermined acceleration for unlocking the portable electronic device. In an unlock procedure, the gyroscope and the accelerometer detect the rotation of the portable electronic device and output detected information of the portable electronic device. If the rotation direction, the rotation angle, and the acceleration are the same as the predetermined rotation direction, the predetermined rotation angle, and the predetermined acceleration, respectively, the locked portable electronic device is unlocked. | 12-05-2013 |
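The unlock comparison in 20130321263 above checks three detected quantities against stored predetermined values. A hedged sketch: the dictionary shape and the tolerance parameters are assumptions (the abstract requires the values to be "the same", which in practice needs tolerances for sensor noise):

```python
def try_unlock(detected, stored, angle_tol=5.0, accel_tol=0.2):
    # detected/stored: dicts with "direction", "angle" (degrees) and
    # "acceleration" (m/s^2). The device unlocks only when all three match.
    return (detected["direction"] == stored["direction"]
            and abs(detected["angle"] - stored["angle"]) <= angle_tol
            and abs(detected["acceleration"] - stored["acceleration"]) <= accel_tol)
```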
20130321264 | MOBILE TERMINAL AND CONTROL METHOD FOR THE MOBILE TERMINAL - A mobile terminal including a flexible display unit and a related control method are provided. The mobile terminal may include a flexible display unit configured to display first screen information that is flexible in response to an external physical force, a sensing unit configured to sense flexure of the flexible display unit and a controller configured to control the flexible display unit to output second screen information containing information associated with the first screen information on one region of the flexible display unit in response to the flexure. | 12-05-2013 |
20130321265 | Gaze-Based Display Control - A method includes receiving an image including an eye of a user of a computerized system and identifying, based on the image of the eye, a direction of a gaze performed by the user. Based on the direction of the gaze, a region on a display coupled to the computerized system is identified, and an operation is performed on content presented in the region. | 12-05-2013 |
20130321266 | APPARATUS, AND ASSOCIATED METHOD, FOR SELECTING INFORMATION DELIVERY MANNER USING FACIAL RECOGNITION - An apparatus, and an associated method, selects a manner by which to deliver received information at a wireless, or other electronic, device. A facial recognition indication is obtained and analyzed. Responsive to the analysis of the facial recognition indication, selection is made of the manner by which to deliver the information. If the facial recognition indication indicates the recipient to exhibit a serious demeanor, the information is provided in aural form, thereby to permit delivery of the information without requiring the recipient to read, or otherwise view, the information. | 12-05-2013 |
20130328760 | Fast feature detection by reducing an area of a camera image - An apparatus and method for a mobile device to reduce computer vision (CV) processing, for example, when detecting features and key points, is disclosed. Embodiments herein reduce the search area of an image or the volume of image data that is searched to detect features and key points. Embodiments limit a search area of a full image to an actual area of interest to the user. This reduction decreases the search area, decreases search time, decreases power consumption, and limits detection to the area of interest to the user. | 12-12-2013 |
20130328761 | PHOTOSENSOR ARRAY GESTURE DETECTION - Photosensor array gesture detection techniques are described. In one or more embodiments, a computing device includes an array of photosensors. The photosensor array can be configured in various ways to measure changes in the amount of light that occur based upon a user's hand position above the photosensor array. In at least some embodiments, capacitance associated with the photosensors is charged and data regarding discharge rates for the sensors is collected that is indicative of the amount of incident light. Sequential changes in the amount of light that is measured across the array of photosensors can be used to determine positioning and/or movement of the user's hand in three dimensions (e.g., track position/motion in three-dimensional (3D) space relative to the computing device.) Accordingly, various gestures can be defined in terms of input obtained via the photosensor array and recognized to trigger corresponding operations by the computing device. | 12-12-2013 |
20130328762 | CONTROLLING A VIRTUAL OBJECT WITH A REAL CONTROLLER DEVICE - Technology is described for controlling a virtual object displayed by a near-eye, augmented reality display with a real controller device. User input data is received from a real controller device requesting an action to be performed by the virtual object. A user perspective of the virtual object being displayed by the near-eye, augmented reality display is determined. The user input data requesting the action to be performed by the virtual object is applied based on the user perspective, and the action is displayed from the user perspective. The virtual object to be controlled by the real controller device may be identified based on user input data which may be from a natural user interface (NUI). A user selected force feedback object may also be identified, and the identification may also be based on NUI input data. | 12-12-2013 |
20130328763 | MULTIPLE SENSOR GESTURE RECOGNITION - Methods for recognizing gestures using adaptive multi-sensor gesture recognition are described. In some embodiments, a gesture recognition system receives a plurality of sensor inputs from a plurality of sensor devices and a plurality of confidence thresholds associated with the plurality of sensor inputs. A confidence threshold specifies a minimum confidence value for which it is deemed that a particular gesture has occurred. Upon detection of a compensating event, such as excessive motion involving one of the plurality of sensor devices, the gesture recognition system may modify the plurality of confidence thresholds based on the compensating event. Subsequently, the gesture recognition system generates a multi-sensor confidence value based on whether at least a subset of the plurality of confidence thresholds has been satisfied. The gesture recognition system may also modify the plurality of confidence thresholds based on the plugging and unplugging of sensor inputs from the gesture recognition system. | 12-12-2013 |
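The threshold handling in 20130328763 above can be illustrated as: each sensor reports a confidence that the gesture occurred, a compensating event (such as excessive motion on one sensor) raises that sensor's threshold, and the multi-sensor decision requires a subset of sensors to pass. A sketch under assumptions; the penalty value and minimum-sensor rule are hypothetical, not from the patent:

```python
def adjust_thresholds(thresholds, compensating_sensors, penalty=0.2):
    # Raise (capped at 1.0) the threshold of each sensor affected by a
    # compensating event, making that sensor's input harder to trust.
    return {s: min(1.0, t + penalty) if s in compensating_sensors else t
            for s, t in thresholds.items()}

def gesture_detected(confidences, thresholds, min_sensors=2):
    # True when at least min_sensors sensors meet their confidence thresholds.
    passed = sum(1 for s, c in confidences.items() if c >= thresholds.get(s, 1.0))
    return passed >= min_sensors
```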
20130328764 | FLEXIBLE DISPLAY APPARATUS AND CONTROL METHOD THEREOF - A flexible display apparatus includes a display unit which displays a screen which includes at least one object; a sensor unit which senses a bent state of the display unit; and a control unit which controls the display unit to display a 3D image of the at least one object deployed in the bending direction of the display unit, if it is determined that the display unit is bent in a predetermined cylinder shape based on the bent state sensed by the sensor unit. | 12-12-2013 |
20130328765 | SIGNAGE SYSTEM AND DISPLAY METHOD BY THE SAME - A signage system includes a plurality of display devices, an image capturing device and a signage processing device. The signage processing device includes a personal recognition section for recognizing a person imaged in the image data, a data selection section for selecting a data group corresponding to the person recognized by the personal recognition section and a display control section for displaying a signage image based on the data group selected by the data selection section on the display device corresponding to the image data. If the personal recognition section recognizes the same person again according to the image data after a signage image has been displayed for that person, the display control section displays the continuation of that signage image on the display device corresponding to the image capturing device which generates the image data. | 12-12-2013 |
20130328766 | PROJECTION TYPE IMAGE DISPLAY APPARATUS, IMAGE PROJECTING METHOD, AND COMPUTER PROGRAM - Disclosed herein is a projection type image display apparatus, including: a projecting portion projecting an input image on an image projected body; a camera portion photographing the image projected body; a display control block controlling display of a projected image projected by the projecting portion; and a gesture recognizing unit recognizing a gesture manipulation of a user contained in the image photographed by the camera portion. | 12-12-2013 |
20130328767 | INFORMATION PROCESSING APPARATUS, CONFERENCE SYSTEM, AND INFORMATION PROCESSING METHOD - An information processing apparatus includes a voice acquiring unit configured to acquire voice information remarked by a user; an image information acquiring unit configured to acquire image information of the user; a user specifying unit configured to specify the user having remarked the voice information based on the image information when the voice information is acquired; a word extracting unit configured to extract a word from the voice information; a priority calculating unit configured to increase priority of the user having remarked the word for operation of a display device when the extracted word matches any one of keywords; a gesture detecting unit configured to detect a gesture made by the user based on the image information; and an operation permission determining unit configured to permit the user to operate the display device when the priority of the user having made the gesture is highest. | 12-12-2013 |
20130328768 | VEHICULAR TERMINAL - A vehicular terminal receives an input of a condition via an input unit, such that the input may be performed via manual operation or voice recognition. An input reception restriction function restricts a reception of the condition by manual operation during vehicle travel. The vehicular terminal determines whether the input reception restriction function is in effect during the reception of the condition by manual operation via the input unit. When it is determined that the input reception restriction function is in effect, the vehicular terminal switches an input method for reception of the input of the condition from manual operation to voice recognition, thereby allowing a continuous reception of the condition. | 12-12-2013 |
20130328769 | APPARATUS AND METHOD FOR INPUTTING COMMAND USING GESTURE - Disclosed is a method of inputting commands into displays such as TVs or image processing devices. A user's hands are photographed by a camera to recognize the motion of the hands, so that commands are input according to the motion of the user's hands instead of through conventional input devices such as a mouse and a keyboard. | 12-12-2013 |
20130335310 | SYSTEM AND METHOD FOR ACTIVATING, ACTIONING AND PROVIDING FEEDBACK ON INTERACTIVE OBJECTS WITHIN LINE OF SIGHT - Systems and methods are provided for line of sight, hands free interaction with interactive objects within a defined space on a line of sight basis. The system can include a base station in communication with one or more interactive objects having a variety of display options. Details about the display options are stored on the system along with the location of the interactive objects and layout of the surroundings of the interactive object. Users can carry devices that are in communication with the base station and that determine the user's location and head orientation and relay that information to the base station. Using the location and orientation of the user, the location of the interactive objects and the layout information about the surrounding area, the system can determine if the user is looking directly at an interactive object and can cause the interactive object to perform an action. | 12-19-2013 |
20130335311 | FLEXIBLE PORTABLE DEVICE - A flexible portable device is disclosed. The flexible portable device includes a display unit for displaying an image, a communication unit for performing communication with an external device, a sensor unit for sensing user input or an environment of the flexible portable device, and a control unit for controlling the flexible portable device and the units of the flexible portable device, wherein the sensor unit includes a motion sensor unit for sensing motion of the flexible portable device and/or motion with respect to the flexible portable device, the flexible portable device has at least one flexible area which is bendable, and the motion sensor unit is located at a first area at which influence on the motion sensor unit when the flexible area is bent is avoided or minimized. | 12-19-2013 |
20130335312 | INTEGRATION OF THIN FILM SWITCHING DEVICE WITH ELECTROMECHANICAL SYSTEMS DEVICE - This disclosure provides systems and methods for thin film switching devices, such as thin film transistors and thin film diodes, which are integrated in a display apparatus. In one aspect, a thin film switching device is positioned on a rear side of an electromechanical systems (EMS) display element formed over a substrate and is in electrical communication with the EMS display element. In another aspect, the thin film switching device is positioned between the EMS display element and the substrate. A planar layer is disposed between the EMS display element and the thin film switching device, with the planar layer having a planar surface. | 12-19-2013 |
20130335313 | PROCESS OF CREATING A DISPLAY, SYSTEM FOR CREATING A DISPLAY, AND MOBILE UNTETHERED DISPLAY DEVICE - A process of creating a display, a system for creating a display, and a mobile untethered display device are disclosed. The process includes providing image data to a first spectator display device, providing image data to a second spectator display device, and activating at least one of one or more light emitting elements based upon the image data. The first spectator display device includes one or more of the light emitting elements. The first spectator display device is a mobile untethered device positioned within a spectator region of a venue and creates a display having an illusion of a continuous image. The system includes a first spectator display device, a second spectator display device, and a controller. The device includes one of one or more light emitting elements capable of being activated upon receiving image data corresponding with a portion of the display. | 12-19-2013 |
20130335314 | Intelligent Reminding Apparatus and Method Thereof - An intelligent reminding apparatus and a method thereof. The intelligent reminding method comprises: using an image capturing unit to preview a preview-image or capture a dynamic image; detecting at least one face image of the preview-image or the dynamic image to obtain a face image; recognizing the face image according to a face photo of a database which comprises at least one face photo and at least one corresponding information; and acquiring the corresponding information if the face image matches the face photo. The face recognition proceeds during the image preview or dynamic image capturing; the corresponding information is presented to users in a display mode, or in a voice mode via a connected earphone or a connected voice apparatus if the connected earphone or the connected voice apparatus is detected. | 12-19-2013 |
20130335315 | MOBILE HANDSET ACCESSORY SUPPORTING TOUCHLESS AND OCCLUSION-FREE USER INTERACTION - An accessory device for handheld electronic devices such as, for example, a cellular telephone, a smart phone, a media player, and a tablet computer includes circuitry for sensing flow of human breath and for communicating with the handheld electronic device, to enable user control of the handheld electronic device employing the flow of human breath. | 12-19-2013 |
20130335316 | GESTURE BASED CONTROL APPLICATION FOR DATA SHARING - Receiving user gesture input commands and interpreting the commands to conduct presentation level control system processing and related presentation communications includes, in one example, detecting an input gesture command via a controller and processing the input gesture command via a processor. The example may also include retrieving at least one data file object responsive to the processed input gesture command, and transmitting the at least one data file object to a remote device. | 12-19-2013 |
20130335317 | APPARATUSES FOR CONTRIBUTIVELY MODIFYING IMAGE ORIENTATION FOR DISPLAY - An embodiment of an apparatus for contributively modifying an image orientation is introduced. The apparatus includes a display angle detection unit configured to detect a display angle setting. A parameter storage unit stores a display parameter being utilized to determine how to modify images. The hot-plugging processing unit changes a hot-plugging signal to cause the external apparatus to recognize that the display processing unit is disconnected therefrom, and modifies the content of the display parameter with a new display parameter value that is consistent with the current display placement when the display angle setting satisfies a display angle refreshing condition. The hot-plugging processing unit changes the hot-plugging signal to cause the external apparatus to recognize that a new apparatus is connecting, thereby enabling the external apparatus to read out the new display parameter value and generate the image stream corresponding to the new display parameter value. | 12-19-2013 |
20130335318 | METHOD AND APPARATUS FOR DOING HAND AND FACE GESTURE RECOGNITION USING 3D SENSORS AND HARDWARE NON-LINEAR CLASSIFIERS - A method of controlling a mobile or stationary terminal comprising the steps of 3D sensing a hand or face in one of multiple ways, recognizing the visual command input by trained hardware that does not incorporate instruction-based programming, and then causing some useful function to be performed on the terminal in response to the recognized gesture. This method enhances the gross body gesture recognition in practice today. Gross gesture recognition has been made accessible by providing accurate skeleton tracking information down to the location of a person's hands or head. Notably missing from the skeleton tracking data, however, are the detailed positions of the person's fingers or facial gestures. Recognizing the arrangement of the fingers on a person's hand or the expression on his or her face has applications in recognizing gestures such as sign language, as well as user inputs that are normally made with a mouse or a button on a controller. Tracking individual fingers or the subtleties of facial expressions poses many challenges, including the resolution of the depth camera, the possibility for fingers to occlude each other or be occluded by the hand, and performing these functions within the power and performance limitations of traditional coded architectures. This unique codeless, trainable hardware method can recognize finger gestures robustly and deal with these limitations. By recognizing facial expressions, additional information such as approval, disapproval, surprise, commands, and other useful inputs can be incorporated. | 12-19-2013 |
20130335319 | MOBILE DEVICE OPERATION USING GRIP INTENSITY - Mobile device operation using grip intensity. An embodiment of a mobile device includes a touch sensor to detect contact or proximity by a user of the mobile device; a memory to store indicators of grip intensity in relation to the touch sensor; and a processor to evaluate contact to the touch sensor. The processor is to compare a contact with the touch sensor to the stored indicators of grip intensity, and the mobile device is to receive an input for a function of the mobile device based at least in part on the determined grip intensity for the mobile device. | 12-19-2013 |
20130335320 | REMOTE CONTROL COMMUNICATION DEVICE AND NAVIGATION DEVICE - Disclosed is a remote control communication device including a data generating unit | 12-19-2013 |
20130342436 | LAPTOP AND PROJECTOR DEVICE - The present invention is a laptop and projector device that includes a laptop computer, a projector incorporated into the laptop computer and a projector eye disposed in front of the projector. The laptop and projector device also includes a hand held remote control that controls the projector eye and the laptop computer and a projector eye cover plate with a pair of diagonal halves placed over the projector eye that open and close while placed over the projector eye. There is also a second embodiment of the laptop and projector device that removably attaches to a laptop computer or other suitable computer. | 12-26-2013 |
20130342437 | IMAGE DATA GENERATION USING A HANDHELD ELECTRONIC DEVICE - The present disclosure provides improved generation of images using a handheld electronic device. Motion of the handheld electronic device is detected using a sensor of the handheld electronic device and data, dependent upon the sensed motion, is transmitted from the device to a remote electronic device. An image, representative of the sensed motion of the handheld electronic device and generated from the transmitted data, is rendered on a display of the remote electronic device and provides a user with visual feedback of the motion of the handheld electronic device. The image data may be generated by the handheld electronic device or by the remote electronic device. | 12-26-2013 |
20130342438 | METHOD AND APPARATUS FOR MEASURING AUDIENCE SIZE FOR A DIGITAL SIGN - Measuring audience size for a digital sign comprises generating a plurality of paths, one for each face detected in a first sequence of video frames captured by a camera proximate the digital sign, and generating a zone in the sequence of video frames through which passes a threshold number of the paths. Motion and direction of motion within the zone is then measured in a second sequence of video frames to calculate the audience size that passes through the zone in the second sequence of video frames. | 12-26-2013 |
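The two-phase measurement in 20130342438 above — qualify a zone using paths from a first video sequence, then count traffic through it in a second — can be sketched as below. Point paths and a rectangular zone are simplifying assumptions; the patent operates on face paths detected in camera frames:

```python
def in_zone(point, zone):
    # zone is an axis-aligned rectangle (x0, y0, x1, y1).
    (x, y), (x0, y0, x1, y1) = point, zone
    return x0 <= x <= x1 and y0 <= y <= y1

def crosses(path, zone):
    # A path (list of points) passes through the zone if any point is inside.
    return any(in_zone(p, zone) for p in path)

def qualify_zone(zone, calibration_paths, min_paths):
    # First sequence: keep the zone only if enough face paths pass through it.
    return sum(crosses(p, zone) for p in calibration_paths) >= min_paths

def audience_size(zone, measurement_paths):
    # Second sequence: audience size = number of paths crossing the zone.
    return sum(crosses(p, zone) for p in measurement_paths)
```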
20130342439 | FLEXIBLE DISPLAY APPARATUS - A flexible display apparatus includes a substrate that is flexible and is foldable according to an intention of a user, the substrate including a display area, wherein a size of the substrate is variable according to a folding thereof, a deformation sensing unit that is in an overlapping relationship with the display area and that senses deformation of the substrate, a control unit obtaining information from the deformation sensing unit, and a resolution adjusting unit that is controlled by the control unit and adjusts a resolution of an image displayed on the display area. | 12-26-2013 |
20130342440 | INFORMATION PROCESSING DEVICE AND INFORMATION PROCESSING METHOD - According to one embodiment, an information processing device includes: a display controller configured to display a screen comprising input regions to which information is capable of being input, on a display; a transmitter configured to transmit the information input to at least one of the input regions to an external device; a prohibition module configured to acquire attribute information indicating display or non-display of the information in the input region for each of the input regions and prohibit, when the attribute information of the input region to which the information to be transmitted is input indicates the non-display of the information, transmission of the information to the external device. | 12-26-2013 |
20130342441 | CHARACTER INPUT METHOD AND INFORMATION PROCESSING APPARATUS - A character input method to be executed by a computer, the character input method includes: acquiring a first image from an imaging device; detecting an object from the first image; specifying a character group corresponding to a first position of the object detected from the first image, based on correspondence information in which each of a plurality of character groups including a plurality of characters is associated with each of a plurality of regions set for the first image; acquiring a second image acquired from the imaging device at a time different from a time at which the first image is acquired; detecting the object from the second image; and designating one character from the specified character group, based on a second position of the object detected from the second image. | 12-26-2013 |
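The two-image selection scheme described above (first position picks a character group, second position picks a character within it) can be sketched as follows. The region layout, group contents, and frame size are hypothetical assumptions for illustration, not taken from the patent.

```python
# Hypothetical mapping of 2x2 camera-frame regions to character groups.
GROUPS = {
    (0, 0): "abc",
    (0, 1): "def",
    (1, 0): "ghi",
    (1, 1): "jkl",
}

def region_of(pos, frame_w=640, frame_h=480):
    """Quantize an (x, y) object position into a 2x2 grid cell."""
    x, y = pos
    return (int(y >= frame_h / 2), int(x >= frame_w / 2))

def designate(first_pos, second_pos, frame_w=640):
    """Select a group from the first image's object position, then one
    character from the group using the second image's position."""
    group = GROUPS[region_of(first_pos)]
    x, _ = second_pos
    index = min(int(x / (frame_w / len(group))), len(group) - 1)
    return group[index]
```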
20130342442 | INPUT SYSTEM - An input system includes a first gesture detection unit and a second gesture detection unit. The first gesture detection unit includes a first light emitting device for emitting a first light beam, a first light sensing device for receiving the first light beam reflected by a first motion trajectory generated by a user and outputting a first image signal, and a first processing unit for processing the first image signal and outputting a first command signal. The second gesture detection unit includes a second light emitting device for emitting a second light beam, a second light sensing device for receiving the second light beam reflected by a second motion trajectory generated by the user and outputting a second image signal, and a second processing unit for processing the second image signal and outputting a second command signal. | 12-26-2013 |
20130342443 | GESTURE DETECTING APPARATUS AND METHOD FOR DETERMINING GESTURE ACCORDING TO VARIATION OF VELOCITY - The present invention discloses a gesture detecting apparatus which includes an image capturing device and a processing unit. The image capturing device is for capturing an object light beam reflected by an object and outputting corresponding image information accordingly. The processing unit is for processing the image information and generating a first command or a second command accordingly. The steps of generating the first command and the second command by the processing unit include: outputting a first command if an image size of the image information increases with a relatively higher velocity and decreases with a relatively lower velocity in a sequential time series; and outputting a second command if the image size of the image information increases with a relatively lower velocity and decreases with a relatively higher velocity in the sequential time series. | 12-26-2013 |
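The velocity-asymmetry rule above (fast-grow/slow-shrink versus slow-grow/fast-shrink of the object's image size) could be approximated like this. A minimal sketch with hypothetical names; a real detector would smooth the size series and apply noise thresholds.

```python
def classify_gesture(sizes):
    """Return 'first' or 'second' command from a time series of image sizes.

    Compares the peak growth rate with the peak shrink rate: growth faster
    than shrink suggests the first command, the reverse suggests the second.
    """
    diffs = [b - a for a, b in zip(sizes, sizes[1:])]
    grow = max((d for d in diffs if d > 0), default=0)
    shrink = max((-d for d in diffs if d < 0), default=0)
    if grow > shrink:
        return "first"
    if shrink > grow:
        return "second"
    return None  # ambiguous or no motion
```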
20130342444 | Method and Apparatus for Hand Gesture Trajectory Recognition - Method and apparatus for hand gesture trajectory recognition. The method includes: pre-defining sixteen hand gesture trajectory models, and classifying the sixteen hand gesture trajectory models into three types of hand gesture trajectory models; judging to which of the three types of hand gesture trajectory models an input hand gesture trajectory belongs; and recognizing input hand gesture trajectories that belong to different types of hand gesture trajectory models with different methods. | 12-26-2013 |
20130342445 | EXTERNAL RECORDING DEVICE COMBINED WITH INPUT UNIT - Provided is an external recording device including a text input unit capable of inputting text or data to a terminal device. | 12-26-2013 |
20130342446 | IMAGE DISPLAY DEVICE, IMAGE DISPLAY SYSTEM INCLUDING THE SAME, AND METHOD FOR CONTROLLING THE SAME - An image display device includes a display unit that displays images in page units and a memory which stores data, and flashes an import button displayed on the display unit upon receipt of image data, and in response to operation of the import button by a user, stores received image data on a virtual page in the memory corresponding to a blank page that is on display, stores the received image data on a virtual page in the memory corresponding to a blank page following the page on display, or stores the received image data on a virtual page in the memory secured as a new page following the page on display, after which the image display device displays the image data stored on the virtual page as an image on the display unit. This makes it possible to display the received image at a time specified by the user. | 12-26-2013 |
20130342447 | HANDHELD ELECTRONIC DEVICE HAVING GESTURE-BASED CONTROL AND A METHOD OF USING SAME - The present disclosure describes a handheld electronic device having a gesture-based control and a method of using the same. In one embodiment, there is provided a method, comprising: determining from a motion signal a movement type associated with a movement of the electronic device from a number of predetermined types of movement; determining whether a cadence parameter is greater than or equal to a first cadence reference level, wherein the cadence parameter is dependent on the movement type; performing a first command when the cadence parameter is greater than or equal to the first cadence reference level; and performing a second command when the cadence parameter is less than the first cadence reference level. | 12-26-2013 |
20130342448 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM PRODUCT - According to one embodiment, an information processing apparatus includes: a detector configured to set a detection area curved in an arc shape to a frame image included in a video that is based on input video data, with reference to a position of a face image included in the video to detect a movement of an operator giving an operation instruction in the detection area; and an output module configured to output operation data indicating the operation instruction given by the movement detected by the detector. | 12-26-2013 |
20130342449 | Cushioned User Interface Or Control Device - A user interface or control includes a cushion-type support member and a user input member that is interconnected with and carried by the support member. The support member defines an upwardly facing recess, and the input member may be a user interface or control device that is contained within the upwardly facing recess. The support member may be formed to surround the recess about the user interface or control device. The support member may include an air vent that vents air exhausted from the user interface or control device. The user interface or control device may be a laptop computer having a body including a keyboard contained within the recess, and a screen carried by the body. The user interface or control device may alternatively be an electronic input member having an upwardly facing screen, a convertible input member movably mounted to a mounting member, or a game controller. | 12-26-2013 |
20130342450 | ELECTRONIC DEVICES | 12-26-2013 |
20130342451 | Methods, Apparatus, and Article for Force Feedback Based on Tension Control and Tracking Through Cables - A haptic interface system includes a cable based haptic interface device and a controller. The controller receives information related to movement of a grip in real-space and generates a stereoscopic output for a display device. The stereoscopic output includes images of a virtual reality tool whose motions mimic motions of the real-space grip. | 12-26-2013 |
20140002336 | PERIPHERAL DEVICE FOR VISUAL AND/OR TACTILE FEEDBACK | 01-02-2014 |
20140002337 | SINGLE-HANDED FLOATING DISPLAY WITH SELECTABLE CONTENT | 01-02-2014 |
20140002338 | TECHNIQUES FOR POSE ESTIMATION AND FALSE POSITIVE FILTERING FOR GESTURE RECOGNITION | 01-02-2014 |
20140002339 | Surface With Touch Sensors for Detecting Proximity | 01-02-2014 |
20140002340 | SYSTEMS AND METHODS FOR SWITCHING SENSING REGIMES FOR GLOVED AND UNGLOVED USER INPUT | 01-02-2014 |
20140002341 | EYE-TYPING TERM RECOGNITION | 01-02-2014 |
20140002342 | SYSTEM FOR PRESENTING HIGH-INTEREST-LEVEL IMAGES | 01-02-2014 |
20140002343 | DEVICE AND METHOD FOR EYE TRACKING DATA TRIGGER ARRANGEMENT | 01-02-2014 |
20140002344 | DYNAMIC DISPLAY ADJUSTMENT | 01-02-2014 |
20140002345 | ILLUMINATION SYSTEMS INCORPORATING A LIGHT GUIDE AND A REFLECTIVE STRUCTURE AND RELATED METHODS | 01-02-2014 |
20140002346 | HAPTIC FEEDBACK CONTROL SYSTEM | 01-02-2014 |
20140002347 | TOUCH PANEL WITH SAPPHIRE SUBSTRATE AND TOUCH SCREEN | 01-02-2014 |
20140002348 | MEASURING DEVICE AND MEASURING METHOD | 01-02-2014 |
20140002349 | Method of Determining Reflections of Light | 01-02-2014 |
20140002350 | ELECTRONIC DEVICE AND METHOD FOR PROCESSING AN OBJECT | 01-02-2014 |
20140002351 | METHODS AND SYSTEMS FOR INTERACTION WITH AN EXPANDED INFORMATION SPACE | 01-02-2014 |
20140002352 | EYE TRACKING BASED SELECTIVE ACCENTUATION OF PORTIONS OF A DISPLAY | 01-02-2014 |
20140002353 | ADVANCED USER INTERACTION INTERFACE METHOD AND APPARATUS | 01-02-2014 |
20140002354 | ELECTRONIC DEVICE, IMAGE DISPLAY SYSTEM, AND IMAGE SELECTION METHOD | 01-02-2014 |
20140002355 | INTERFACE CONTROLLING APPARATUS AND METHOD USING FORCE | 01-02-2014 |
20140009378 | User Profile Based Gesture Recognition - An embodiment includes a system recognizing a first user via a camera, selecting a profile for the first user, and interpreting the first user's gestures according to that profile. For example, the embodiment identifies a first user, loads his gesture signature profile, and then interprets the first user forming his fist with his thumb projecting upwards as acceptance of a condition presented to the user (e.g., whether the user wishes to turn a tuner to a certain channel). The embodiment recognizes a second user, selects a profile for the second user, and interprets the second user's gestures according to that profile. For example, the embodiment identifies the second user, loads her profile, and then interprets the second user forming her fist with her thumb projecting upwards as the user pointing upwards. This moves an area of focus upwards on a graphical user interface. Other embodiments are described herein. | 01-09-2014 |
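The per-user interpretation described above reduces to a profile lookup: the same physical gesture maps to different commands depending on whose profile is loaded. A minimal sketch with hypothetical user IDs, gesture names, and commands:

```python
def interpret(user_id, gesture, profiles):
    """Map a gesture to a command using the recognized user's profile,
    falling back to a default profile for unknown users or gestures."""
    profile = profiles.get(user_id, profiles["default"])
    return profile.get(gesture, "ignore")

# Hypothetical profiles: the same thumb-up gesture means "accept" by
# default but "move focus up" for user_b, as in the abstract's example.
PROFILES = {
    "default": {"thumb_up": "accept"},
    "user_b": {"thumb_up": "focus_up"},
}
```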
20140009379 | CAVITY LINERS FOR ELECTROMECHANICAL SYSTEMS DEVICES - This disclosure provides systems, methods and apparatus for electromechanical systems devices with improved electrical properties and device life span. In one aspect, a conformal antistiction layer is formed within a cavity of an electromechanical systems apparatus over a roughened surface. The conformal antistiction layer can include a dielectric layer. The conformal antistiction layer can include a self-assembled monolayer (SAM) formed over the dielectric layer. The conformal antistiction layer can replicate the roughness of the surface that it is deposited on. | 01-09-2014 |
20140009384 | METHODS AND SYSTEMS FOR DETERMINING LOCATION OF HANDHELD DEVICE WITHIN 3D ENVIRONMENT - The present technology relates to methods for dynamically determining the location and orientation of a handheld device, such as a smart phone, remote controller or gaming device, within a 3D environment in real time. To these ends, there is provided a 3D camera for capturing a depth map of the 3D environment within which there is a user holding the handheld device. The handheld device acquires motion and orientation data in response to hand gestures, which data is further processed and associated with a common coordinate system. The depth map is also processed to generate motion data of user hands, which is then dynamically compared to the processed motion and orientation data obtained from the handheld device so as to determine the handheld device location and orientation. The positional and orientation data may be further used in various software applications to generate control commands or perform analysis of various gesture motions. | 01-09-2014 |
20140009385 | METHOD AND SYSTEM FOR ROTATING DISPLAY IMAGE - A method and a system for rotating a display image applied to an electronic display device which includes a display image are provided. The method includes at least the following steps: enabling a rotation mode; capturing user information relative to the electronic display device via an image capturing unit; receiving the user information via an analysis unit and generating analysis information according to the user information; and receiving the analysis information via a processing unit and generating a control signal to control the display image to rotate according to the analysis information. | 01-09-2014 |
20140009390 | ARRANGEMENT, METHOD AND COMPUTER PROGRAM FOR CONTROLLING A COMPUTER APPARATUS BASED ON EYE-TRACKING - A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request. | 01-09-2014 |
20140009391 | METHOD AND DEVICE FOR DISPLAYING IMAGES - In one embodiment, the method for displaying images on an image display of a display unit includes detecting a plurality of reference points within the face of at least one person looking at the image display. The detection is performed by use of a detecting unit rigidly coupled to the display unit. The method further includes determining at least one movement component of a relative movement of the display unit with respect to the reference points, generating data of movement compensated images for an at least partial compensation of the movement component with respect to the reference points, and imaging the movement compensated images on the image display. | 01-09-2014 |
20140009392 | ELECTRONIC DEVICE - An electronic device includes a housing having a display, an angular velocity sensor, an acceleration sensor, and a controller. The angular velocity sensor detects an angular velocity around the X axis parallel to the display. The acceleration sensor detects an acceleration along the Z axis, which is perpendicular to the display and orthogonal to the X axis. The controller performs a first process when the angular velocity sensor detects a positive angular velocity first and then detects a negative angular velocity, and the acceleration sensor detects a change in the acceleration along the Z axis. The controller, on the other hand, performs a second process when the angular velocity sensor detects a negative angular velocity first and then detects a positive angular velocity, and the acceleration sensor detects a change in the acceleration along the Z axis. | 01-09-2014 |
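The sign-sequence rule above (positive-then-negative angular velocity about X triggers one process, negative-then-positive the other, both gated on a Z-axis acceleration change) can be sketched as follows. Function and parameter names are hypothetical; a real controller would add magnitude thresholds and timing windows.

```python
def pick_process(angular_velocities, z_accel_changed):
    """Return 1 or 2 based on the order of angular-velocity signs about X,
    gated on a detected Z-axis acceleration change; None otherwise."""
    if not z_accel_changed:
        return None
    signs = [1 if w > 0 else -1 for w in angular_velocities if w != 0]
    # Collapse consecutive duplicates to get the sign sequence.
    seq = [s for i, s in enumerate(signs) if i == 0 or s != signs[i - 1]]
    if seq[:2] == [1, -1]:
        return 1  # first process
    if seq[:2] == [-1, 1]:
        return 2  # second process
    return None
```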
20140009393 | System and Method for Interfacing Between a Mobile Device and a Personal Computer - A system and method are provided for interfacing between a mobile device and a PC. The mobile device utilizes a connection with the PC for taking advantage of the larger display and input devices such as the keyboard on the PC to improve the user interface (UI). This also enables the user to take advantage of the mobile device's wireless connectivity at the same time, e.g. where the PC does not have the same connectivity. | 01-09-2014 |
20140015743 | FLEXIBLE DISPLAY APPARATUS AND OPERATING METHOD THEREOF - A flexible display apparatus is provided. The flexible display apparatus includes a display, a sensor which senses shape deformation of the display, a storage which, if a first shape deformation is sensed, stores operation state information of a first operation state of the flexible display apparatus prior to the first shape deformation, and a controller which, if a second shape deformation different from the first shape deformation is sensed, performs a function corresponding to the first shape deformation and returns to the first operation state according to the operation state information stored in the storage. | 01-16-2014 |
20140015744 | CONTROL SYSTEM AND METHOD FOR A DISPLAY - A control system for a display includes a number of motion sensors and a processor. The sensors detect objects and their movement and issue signals accordingly; the processor turns the display off after a set period without movement and turns it back on when further movement is detected, saving unnecessary display power. | 01-16-2014 |
20140015745 | APPARATUS AND METHOD FOR DETECTING AND HANDLING FLEXION STATES OF FLEXIBLE DISPLAY - Disclosed is a flexion state detecting and handling method and apparatus for detecting and handling a flexion state of a flexible display. A sensing unit includes a bend sensor array having a plurality of bend sensors arranged on the flexible display in a predetermined form. Each of the bend sensors detects a degree of bending as the bend sensor is bent along a flexion of the flexible display at a point where the bend sensor is arranged. A controller generates flexion state information indicating a flexion state of the flexible display based on the degree of bending detected by the bend sensors. | 01-16-2014 |
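The controller step described above (aggregating per-sensor bend readings into flexion state information) can be illustrated with a toy summarizer. The grid layout, units, and tolerance are assumptions for the sketch, not the patent's specifics.

```python
def flexion_state(readings, flat_tolerance=5.0):
    """Summarize a bend-sensor grid into per-row flexion information.

    `readings` is a 2D list of bend angles in degrees, one value per
    sensor. A row whose mean bend magnitude exceeds `flat_tolerance`
    is reported as folded along that line.
    """
    state = []
    for row in readings:
        mean = sum(row) / len(row)
        state.append({"mean_bend": mean, "folded": abs(mean) > flat_tolerance})
    return state
```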
20140022157 | METHOD AND DISPLAY APPARATUS FOR PROVIDING CONTENT - A method and display apparatus for providing content are provided. The method for providing content includes displaying a content UI in which a plurality of content is included, detecting an area at which a user gazes in the content UI, determining a preference for gaze content that is present in the area at which the user gazes by measuring user's brainwaves, and providing the content UI based on the determined preference. The user can be provided with the content UI that is configured more intuitively and conveniently according to the user preference. | 01-23-2014 |
20140022158 | APPARATUS AND METHOD FOR CONTROLLING OPERATION MODE OF DEVICE USING GESTURE RECOGNITION - An apparatus and method for controlling an operation mode of a device using gesture recognition based on temporal changes of light intensity. | 01-23-2014 |
20140022159 | DISPLAY APPARATUS CONTROL SYSTEM AND METHOD AND APPARATUS FOR CONTROLLING A PLURALITY OF DISPLAYS - A display apparatus control system and a method and apparatus for controlling a plurality of display apparatuses are provided. The method of controlling a plurality of display apparatuses includes capturing a region at which a user is located, generating a user-captured image, determining which display apparatus the user is looking at from among the plurality of display apparatuses based on the user-captured image, determining the determined display apparatus to be a control target display apparatus, and transmitting a control signal which corresponds to a user's command to the control target display apparatus when the user's command is input. | 01-23-2014 |
20140022160 | SYSTEM AND METHOD FOR INTERACTING WITH A COMPUTING DEVICE - One or more computing devices may detect the presence of a user, determine default communication mode for communicating with the user, use the default communication mode to elicit a communicative gesture from the user, receive a signal identifying at least one communicative gesture from the user, and convert the identified communicative gestures into a command. The default communication mode may be based on the identity of the user. Alternatively, the default communication mode can be determined by using a first mode of communication to prompt the user to make a mode-selection gesture, and if a mode-selection gesture is detected from the user within a threshold period of time, setting the first mode of communication as the default communication mode. Otherwise, if no mode-selection gesture is detected within a threshold period of time, using an alternative mode of communication to prompt the user to make a mode-selection gesture. | 01-23-2014 |
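The fallback procedure above (prompt in one mode, wait for a mode-selection gesture, and move to an alternative mode on timeout) is essentially a loop over candidate modes. A minimal sketch; the callback, mode names, and timeout value are hypothetical.

```python
def choose_default_mode(modes, detect_gesture, timeout=3.0):
    """Try each communication mode in order until a mode-selection
    gesture is detected within `timeout` seconds.

    `detect_gesture(mode, timeout)` is a hypothetical callback returning
    True if the user responded to a prompt given in `mode` in time.
    Falls back to the last mode if no response is ever detected.
    """
    for mode in modes:
        if detect_gesture(mode, timeout):
            return mode
    return modes[-1]
```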
20140022161 | HUMAN TRACKING SYSTEM - An image such as a depth image of a scene may be received, observed, or captured by a device. A grid of voxels may then be generated based on the depth image such that the depth image may be downsampled. A background included in the grid of voxels may also be removed to isolate one or more voxels associated with a foreground object such as a human target. A location or position of one or more extremities of the isolated human target may be determined and a model may be adjusted based on the location or position of the one or more extremities. | 01-23-2014 |
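The downsampling step above (generating a grid of voxels from a depth image) can be sketched as block-averaging of valid depth pixels. A simplified 2D illustration under stated assumptions: zero depth means "no reading", and the block size is arbitrary.

```python
def depth_to_voxels(depth, block=2):
    """Downsample a 2D depth map into a coarse voxel grid.

    Each `block` x `block` patch of pixels becomes one cell whose value
    is the mean depth of its valid (non-zero) pixels, or 0 if empty.
    """
    h, w = len(depth), len(depth[0])
    grid = []
    for by in range(0, h, block):
        row = []
        for bx in range(0, w, block):
            vals = [depth[y][x]
                    for y in range(by, min(by + block, h))
                    for x in range(bx, min(bx + block, w))
                    if depth[y][x] > 0]
            row.append(sum(vals) / len(vals) if vals else 0)
        grid.append(row)
    return grid
```

Background removal and extremity location would then operate on this much smaller grid rather than on raw pixels.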
20140022162 | SYSTEMS AND METHODS FOR ELICITING CUTANEOUS SENSATIONS BY ELECTROMAGNETIC RADIATION - The present disclosure provides various systems and methods for inducing cutaneous sensations by delivering electromagnetic radiation to directly or indirectly excite neural tissue. An electromagnetic radiation source, such as one or more infrared lasers, may be used to transcutaneously excite neural tissue. The excited neural tissue may be interpreted by the user's nervous system as cutaneous sensations. Accordingly, a system as described herein may be used to induce sensations to allow actual cutaneous sensations to be simulated. A system for inducing a cutaneous sensation via transcutaneously focused electromagnetic radiation may be incorporated in a display to provide cutaneous sensation feedback or used as a separate accessory component associated with a display. Numerous additional applications and variations are provided herein. | 01-23-2014 |
20140022163 | WEARABLE DEVICE WITH INPUT AND OUTPUT STRUCTURES - An electronic device including a frame configured to be worn on the head of a user is disclosed. The frame can include a bridge configured to be supported on the nose of the user and a brow portion coupled to and extending away from the bridge and configured to be positioned over a side of a brow of the user. The frame can further include an arm coupled to the brow portion and extending to a free end. The first arm can be positionable over a temple of the user with the free end disposed near an ear of the user. The device can also include a transparent display affixed to the frame adjacent the brow portion and an input affixed to the frame and configured for receiving from the user an input associated with a function. Information related to the function can be presentable on the display. | 01-23-2014 |
20140022164 | REAL TIME HAND TRACKING, POSE CLASSIFICATION, AND INTERFACE CONTROL - A hand gesture from a camera input is detected using an image processing module of a consumer electronics device. The detected hand gesture is identified from a vocabulary of hand gestures. The electronics device is controlled in response to the identified hand gesture. This abstract is not to be considered limiting, since other embodiments may deviate from the features described in this abstract. | 01-23-2014 |
20140022165 | TOUCHLESS TEXT AND GRAPHIC INTERFACE - The present invention relates to a method for a user to type text on a computer screen using wireless actuators attached to the user's fingers. The image of a virtual keyboard and the user's virtual fingers appears on the computer screen. As the user moves his fingers, the virtual fingers on screen move accordingly, aiding the user in typing. The actuators transmit symbol information to the computer indicative of a key virtually struck on the virtual keyboard by the user's fingers. Text appears on screen. Virtual typing emulates typing on a physical keyboard. In other embodiments the actuators are coupled to other parts of the body for virtual typing. | 01-23-2014 |
20140028538 | FINGER MOTION RECOGNITION GLOVE USING CONDUCTIVE MATERIALS AND METHOD THEREOF - According to one embodiment, a finger motion recognition glove using conductive materials is configured to detect the bending of fingers using the conductive characteristics of the fibers from which the glove is made. The finger motion recognition glove includes pairs of contacts, positioned on corresponding pairs of locations on the glove where knuckles of fingers are bent, each pair of contacts coupled to a first surface of the glove, the finger region between each of the pairs of contacts having a resistance value that changes as the corresponding finger region of the glove is bent and unbent. | 01-30-2014 |
20140028539 | ANATOMICAL GESTURES DETECTION SYSTEM USING RADIO SIGNALS - A system for detecting anatomical gestures based on the interpretation of radio signal transmissions. A user may place wireless devices on his/her head and wrist(s) that communicate through short-range radio signals. The wireless devices may collect information regarding signal exchanges which may be analyzed by a computing device to determine positional information about the user's hands. The computing device may compile the positional information into sequences and evaluate the sequences against predefined patterns of movement data. The computing device may interpret recognized movements as computer input commands. In an embodiment, multiple wireless devices may be placed on the user's wrists to enable multi-touch input commands. In an embodiment, data from motion sensors, such as accelerometers and gyroscopes, may be combined with movement data based on radio signal information. | 01-30-2014 |
20140028540 | HAND-HELD COMMUNICATION DEVICES WITH FINGER NAVIGATION USER INTERFACE - Hand-held communication devices ( | 01-30-2014 |
20140028541 | Direction Based User Interface and Digital Sign Display - Examples disclosed herein relate to a direction based user interface and digital sign display. A processor may detect a facing direction of a display device. The processor may cause a user interface to be displayed on the display device if the display device is detected to be facing a first direction. The processor may cause a digital sign image to be displayed on the display device if the display device is detected to be facing a second direction. | 01-30-2014 |
20140028542 | Interaction with Devices Based on User State - A device identifies users who are trying to interact with it and monitors their state. Using this user state information, the device dynamically adjusts a user interface, speech grammars, screen flow, input options, and the like to tailor interaction with the device to the user. The user interface and device interaction may also be influenced by user-specific settings or profiles. The device may prevent or allow user interaction with the device based on the user's state, such as a position relative to the device. For example, the device may prevent or allow the user from using a set of speech or gesture commands or other interaction sets based on the user's position or relative location. | 01-30-2014 |
20140028543 | COMPLEX PASSIVE DESIGN WITH SPECIAL VIA IMPLEMENTATION - This disclosure provides systems, methods and apparatus for vias in an integrated circuit structure such as a passive device. In one aspect, an integrated passive device includes a first conductive trace and a second conductive trace over the first conductive trace with an interlayer dielectric between a portion of the first conductive trace and the second conductive trace. One or more vias are provided within the interlayer dielectric to provide electrical connection between the first conductive trace and the second conductive trace. A width of the vias is greater than a width of at least one of the conductive traces. | 01-30-2014 |
20140028544 | STORAGE MEDIUM AND INFORMATION PROCESSING APPARATUS, METHOD AND SYSTEM - An example non-limiting game apparatus includes a stereoscopic LCD on which an image of a displaying range that is a part of a course provided in a virtual game space is displayed as a game screen. Authorization for scrolling the game screen is applied to one player object out of a plurality of player objects, and the game screen is scrolled in accordance with a position of the player object having the scroll authorization. If a predetermined condition is satisfied, the scroll authorization is transferred to the player object that caused the predetermined condition to be satisfied, for example. | 01-30-2014 |
20140028545 | ELECTRONIC DEVICE WITH FUNCTION OF ADJUSTING DISPLAY RESOLUTION OR BRIGHTNESS AND METHOD - An electronic device and a method for adjusting display resolution or brightness employed in a display are provided. The device includes a screen, a storage unit, a sensing unit, and a control unit. The storage unit stores a preset optimized distance and a first preset distance. The sensing unit senses a face of a user in front of the device and a distance between the user and the device in real time. The control unit determines whether the distance becomes greater than the preset optimized distance, gradually decreases a resolution or brightness of the screen when the distance becomes greater than the preset optimized distance and is increasing, and gradually increases the resolution or brightness of the screen when the distance has reached the first preset distance and then decreases to become less than the first preset distance and keeps decreasing. | 01-30-2014 |
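The distance-driven rule above can be sketched as a small stateful controller: dim while the user moves beyond the optimized distance, brighten again once the user has passed the first preset distance and keeps approaching. All thresholds, the step size, and the class name are hypothetical illustration values.

```python
class BrightnessController:
    """Toy model of the abstract's brightness rule (distances in cm)."""

    def __init__(self, optimized=60.0, first_preset=100.0, step=5):
        self.optimized = optimized
        self.first_preset = first_preset
        self.step = step
        self.reached_preset = False  # has the user reached the first preset?
        self.prev = None             # last sensed distance

    def update(self, brightness, distance):
        moving_away = self.prev is not None and distance > self.prev
        moving_closer = self.prev is not None and distance < self.prev
        if distance >= self.first_preset:
            self.reached_preset = True
        if distance > self.optimized and moving_away:
            brightness = max(0, brightness - self.step)   # dim gradually
        elif (self.reached_preset and distance < self.first_preset
              and moving_closer):
            brightness = min(100, brightness + self.step)  # brighten gradually
        self.prev = distance
        return brightness
```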
20140028546 | TERMINAL AND CONTROL METHOD THEREOF - A terminal wearable by a user and a control method thereof are provided. The terminal includes: a body configured to wrap at least one region of a wrist and detachably formed; a sensing unit disposed in one surface of the body and configured to sense a movement of at least one of tendons passing through the wrist and the wrist; and a controller configured to generate a control signal for controlling an external device to execute a function previously matched to the sensed movement of the at least one of the tendons and the wrist. | 01-30-2014 |
20140028547 | SIMPLE USER INTERFACE DEVICE AND CHIPSET IMPLEMENTATION COMBINATION FOR CONSUMER INTERACTION WITH ANY SCREEN BASED INTERFACE - A user control device that operates with a variety of host systems, including computers, televisions, recorded or streaming video playback devices, and gaming systems, is mounted to the user's hand. The user control device includes audio and optical sensors for capturing audio and image or video data, allowing the use of voice commands and display focal center alignment control for "swiping" or scrolling the display. A combined inertial (accelerometer(s), gyroscope(s) and a magnetometer) sensor detects translation and rotation movement of the user control device for pointing and selecting within real or virtual three-dimensional space. Haptic (e.g., vibration) feedback units provide tactile feedback to the user to confirm double clicks and similar events. | 01-30-2014 |
20140028548 | GAZE DETECTION IN A 3D MAPPING ENVIRONMENT - A method, including receiving a three-dimensional (3D) map of at least a part of a body of a user. | 01-30-2014 |
20140028549 | IMAGE MANIPULATION BASED ON TRACKED EYE MOVEMENT - The disclosure relates to controlling and manipulating an image of an object on a display device based on tracked eye movements of an observer. When the object is displayed according to an initial view, the observer's eye movement is tracked and processed in order to determine the focus of the observer's attention or gaze on the image. Thereafter, the displayed image is modified to provide a better view of the part of the object in which the observer is most interested. This is accomplished by modifying at least one of the spatial positioning of the object within the viewing area, the angle of view of the object, and the viewing direction of the object. | 01-30-2014 |
20140028550 | REAL TIME HAND TRACKING, POSE CLASSIFICATION, AND INTERFACE CONTROL - A hand gesture from a camera input is detected using an image processing module of a consumer electronics device. The detected hand gesture is identified from a vocabulary of hand gestures. The electronics device is controlled in response to the identified hand gesture. This abstract is not to be considered limiting, since other embodiments may deviate from the features described in this abstract. | 01-30-2014 |
20140028551 | ADJUSTING PROXIMITY THRESHOLDS FOR ACTIVATING A DEVICE USER INTERFACE - A smart-home device includes a user interface including an electronic display having a first display mode and a second display mode, the first display mode generally requiring more power than said second display mode. The device also includes a processing system in operative communication with one or more environmental sensors for determining at least one environmental condition. The device additionally includes at least one sensor configured to detect a physical closeness of a user to the at least one sensor. The processing system may be configured to cause the electronic display to be in the first display mode when a closeness threshold has been exceeded, where the processing system is further configured to automatically adjust the closeness threshold based at least in part on a historical plurality of physical closeness events as detected by the at least one sensor. | 01-30-2014 |
20140035803 | REMOTE ACTUATION SYSTEM FOR A HUMAN/MACHINE INTERFACE - A remote actuation system for de-manning a human/machine interface emulates the interface protocol to interact with the interface at a mechanical level. The system may function autonomously or with a remote human operator. At least one imaging module and at least one switch module with the same relative positions as and a complementary relief to the interface's gauge(s) and mechanical switch(es) are mounted on the instrument panel. The modules may be individually mounted or provided on a remote actuation panel that fits over the instrument panel. A data cable connects the imaging and switch modules to a processing module having at least one computer processor configured to execute an image-processing sub-module and a switch sub-module to emulate the protocol to interact with the interface at a mechanical level. The estimate of the measured value may be fed back to generate the switch command to control the remote switch actuator. | 02-06-2014 |
20140035804 | METHOD, APPARATUS AND COMPUTER PROGRAM PRODUCT FOR PRESENTING DESIGNATED INFORMATION ON A DISPLAY OPERATING IN A RESTRICTED MODE - A method is provided for presenting information on a display to a user when the display is operating, at least in part, in a locked or low-power mode. In particular, an example method may include providing for operation of a device in an unrestricted mode, and receiving an indication that an area presented on a display of the device is a designated area, where another area of the display that is not the designated area is an undesignated area. The designated area may include designated information while the undesignated area may include undesignated information. The method may further include providing for operation of the device in a restricted mode, where the designated information is presented in the restricted mode and the undesignated information is not presented in the restricted mode. | 02-06-2014 |
20140035805 | SPATIAL OPERATING ENVIRONMENT (SOE) WITH MARKERLESS GESTURAL CONTROL - A Spatial Operating Environment (SOE) with markerless gestural control includes a sensor coupled to a processor that runs numerous applications. A gestural interface application executes on the processor. The gestural interface application receives data from the sensor that corresponds to a hand of a user detected by the sensor, and tracks the hand by generating images from the data and associating blobs in the images with tracks of the hand. The gestural interface application detects a pose of the hand by classifying each blob as corresponding to an object shape. The gestural interface application generates a gesture signal in response to a gesture comprising the pose and the tracks, and controls the applications with the gesture signal. | 02-06-2014 |
20140035806 | Conductive Fingertip Assembly - The present invention describes a data input assembly for use with touch screen electronic devices. The assembly includes a set of finger-tip attachments shaped to correspond to the thumb, pointer, and middle fingers of a user. The assembly has insulating and conducting layers and enables a user to utilize one or more fingers simultaneously to aid in the use of more complex applications such as playing games and zooming in on text and images. The conducting layer is configured to be inserted in slits within the sleeve of the assembly and when the device comes in contact with the touchscreen of an electronic device, the fingertip assembly transfers the charge from the finger to the electronic device. The assembly further functions to reduce smudges and the transfer of germs onto an electronic device. | 02-06-2014 |
20140035807 | AMBIENT LIGHT SENSING DEVICE AND METHOD, AND INTERACTIVE DEVICE USING SAME - The invention provides an ambient light sensing device which receives at least one visible light image sensed by an image sensor. The ambient light sensing device includes an image sampling unit and an analyzing unit. The image sampling unit divides the visible light image into plural image blocks, extracts at least one sample data in each image block, and generates a comparison data according to a difference between the sample data extracted at different time points. The analyzing unit analyzes the comparison data and generates an output analysis signal accordingly. | 02-06-2014 |
20140035808 | ELECTRONIC DEVICE AND METHOD FOR REGULATING DISPLAY SCREEN - In a method for regulating the display on a screen of an electronic device, an initial state of a headpiece on a user is detected when the electronic device is powered on, and any deviation direction or deviation angle of the headpiece is determined by comparing a current orientation of the headpiece and an initial state thereof. A regulating direction and a regulating angle of the display screen or a displayed image on the display screen are determined according to any deviation direction and deviation angle such that when a deviation angle is greater than a threshold value, the physical display screen itself or the displayed image is adjusted according to the determined regulating direction and the determined regulating angle. | 02-06-2014 |
20140035809 | Hexahedral Mesh Generator - A hexahedral mesh generator for an analysis model generation target includes an existing analysis model database that stores shape data of existing analysis models before and after shape decomposition, a shape decomposition part extracting module that compares the shapes of existing analysis models before and after the shape decomposition, a shape comparison module that compares each shape decomposition part with the analysis model generation target and checks whether the shape decomposition part coincides with at least part of the analysis model generation target, a coinciding shape decomposition part display/selection module that displays coinciding shape decomposition parts and outputs shape decomposition parts selected by the user, a shape decomposition module that decomposes the shape of the analysis model generation target in the same way as the outputted shape decomposition parts, and a hexahedral mesh generating module that generates the hexahedral mesh for the analysis model generation target after undergoing the shape decomposition. | 02-06-2014 |
20140035810 | Electronic Device - An electronic device includes a first body with a first face and a second face opposite to the first face; a second body with a third face and a fourth face opposite the third face; a first connecting unit connected to the first and second body and set at a first edge of the first face. The first body can be rotated relative to the second body through the first connecting unit. The electronic device is in a first state when an angle between the first and second body is in a first interval and in a second state when the angle is in a second interval. A first displaying unit is set on the first face; a display direction of which is a first direction from the first edge to a second edge opposite to the first edge when the electronic device is in the second state. | 02-06-2014 |
20140035811 | CONTROL MODULE AND CONTROL METHOD TO DETERMINE PERSPECTIVE IN THE RENDERING OF MEDICAL IMAGE DATA SETS - In a control method and a control unit for context-specific determination of at least one desired perspective upon rendering of medical image data at a monitor, a graphical symbol is generated at a user interface dynamically and depending on an image status in order to detect a perspective control signal, and is used to control the rendering. | 02-06-2014 |
20140035812 | GESTURE SENSING DEVICE - A gesture sensing device having a single illumination source is disclosed. In one or more implementations, the gesture sensing device includes a single illumination source configured to emit light and a light sensor assembly configured to detect the light reflected from an object and to output time dependent signals in response thereto. The gesture sensing device also includes a processing circuit coupled to the light sensor assembly and configured to analyze the time dependent signals received from the light sensor assembly to determine object directional movement proximate to the light sensor assembly. | 02-06-2014 |
20140035813 | INPUT DEVICE, INPUT METHOD AND RECORDING MEDIUM - An obtainer obtains information indicating a distance between a display and a user picked up by a camera. A display controller displays, in the display screen of the display, an operation screen having the size set in accordance with the information obtained by the obtainer. A determiner determines an operation instruction from the user based on at least one of the motion of the user and the shape picked up by the camera. An executor executes a process in accordance with the operation instruction determined by the determiner. | 02-06-2014 |
20140043223 | CIRCUITS FOR CONTROLLING DISPLAY APPARATUS - An apparatus includes a plurality of display elements arranged in an array and a control matrix coupled to the plurality of display elements to communicate data and drive voltages to the display elements. For each display element, the control matrix includes an actuation circuit coupling a voltage source to the display element. The control matrix is configured to apply an actuation voltage to an actuator of the display element throughout an actuation stroke of the actuator and to initiate the actuation of the actuator after a pre-charging signal that initiated the application of the actuation voltage to the actuator has been deactivated. | 02-13-2014 |
20140043224 | Input Device and Host Used Therewith - An input device and a host used therewith are provided. The input device comprises a sensor and a transmitter. The sensor senses a physiological feature of a user, and generates a sensing signal. The transmitter, which is electrically connected to the sensor, transmits the sensing signal to the host so that the host identifies the user according to the sensing signal and performs a system configuration according to a setting data corresponding to the user. | 02-13-2014 |
20140043225 | CONTROL METHOD AND ELECTRONIC DEVICE AND SYSTEM EMPLOYING THE SAME - An electronic device for controlling a display device includes a computing module, an adjusting module and a first communication module. The computing module computes a distance D between the electronic device and the display device capable of communicating with the electronic device. The adjusting module generates an adjusting signal according to the distance D. The first communication module is configured to transmit the adjusting signal to the display device, for adjusting the size of an image displayed on the display device corresponding to the adjusting signal. | 02-13-2014 |
20140043226 | PORTABLE DEVICE AND ASSOCIATED CONTROL METHOD - A portable device and associated control method are provided. The portable device includes a foldable display panel. The control method includes steps of: detecting that a folding operation is applied to the display panel; retrieving at least one folding signal; converting a display region of the display panel from an original size to a folded size according to the at least one folding signal; and the display panel displaying an image according to the converted display region. The display panel selects a corresponding folding coordinate system according to the converted display region. | 02-13-2014 |
20140043227 | FAST WAKE-UP IN A GAZE TRACKING SYSTEM - A gaze tracking system, leaving a low power mode in response to an activation signal, captures an initial burst of eye pictures in a short time by restricting the image area of a sensor, with the purpose of enabling an increased frame rate. Subsequent eye pictures are captured at normal frame rate. The first gaze point value is computed memorylessly based on the initial burst of eye pictures and no additional imagery, while subsequent values may be computed recursively by taking into account previous gaze point values or information from previous eye pictures. The restriction of the image area may be guided by a preliminary overview picture captured using the same or a different sensor. From the gaze point values, the system may derive a control signal to be supplied to a computer device with a visual display. | 02-13-2014 |
20140043228 | System and method for providing haptic stimulus based on position - A method of producing a haptic effect includes receiving a sensory content signal from a user interface device, receiving a sensor signal of a body position of a first body part of a user with respect to a second body part of the user, generating the haptic effect using the sensory content signal and the sensor signal, and applying a drive signal to a haptic actuator to produce the haptic effect. | 02-13-2014 |
20140043229 | INPUT DEVICE, INPUT METHOD, AND COMPUTER PROGRAM - An input device detects a sight line position using an elliptical parameter method based on data captured by two cameras, one for a left eye and one for a right eye. In the elliptical parameter method, since the accuracy becomes low as an ellipse approximates a circle, weighting is performed in advance for sections on a display image indicated by a sight line. For example, a weight for a section D5 at the nearest distance from a normal line position is set to 0.3. Next, a weight for sections whose distances from the normal line position are next nearest after the section D5 is set to 0.5. Then, a weight for sections at the remotest distance from the normal line position is set to 0.8. Finally, a center sight line coordinate value for both eyes is calculated based on the determined sight line coordinate values corresponding to the two cameras. | 02-13-2014 |
20140043230 | Three-Dimensional User Interface Session Control - A method, including receiving, by a computer executing a non-tactile three dimensional (3D) user interface, a set of multiple 3D coordinates representing a gesture by a hand positioned within a field of view of a sensing device coupled to the computer, the gesture including a first motion in a first direction along a selected axis in space, followed by a second motion in a second direction, opposite to the first direction, along the selected axis. Upon detecting completion of the gesture, the non-tactile 3D user interface is transitioned from a first state to a second state. | 02-13-2014 |
20140043231 | INFORMATION DISPLAY DEVICE, CONTROL METHOD, AND PROGRAM - An information display device includes: an orientation change degree computation unit that computes a degree of change of a state of orientation detected by an orientation detection unit; an orientation change degree determination unit that determines whether the degree of change of the state of orientation of the information display device computed by the orientation change degree computation unit is equal to or more than a threshold; an image capture unit controller that activates the imaging unit upon the orientation change degree determination unit determining that the degree of change of the state of orientation of the information display device is equal to or more than the threshold; a face detection unit that detects a person's face in an image obtained by imaging of the imaging unit; a face direction specification unit that specifies a vertical orientation of the person's face detected by the face detection unit; a display direction determination unit that determines whether a vertical orientation of a screen display that is displayed on a display unit is the same orientation as the vertical orientation of the person's face specified by the face direction specification unit; and a display direction switch unit that switches the vertical orientation of the screen display that is displayed on the display unit to the same orientation as the vertical orientation of the person's face upon the display direction determination unit determining that the vertical orientation of the screen display that is displayed on the display unit and the vertical orientation of the person's face are not the same orientation. | 02-13-2014 |
20140043232 | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND RECORDING MEDIUM - An inputter inputs a captured image of a hand captured by a camera. An acquirer acquires information indicating the distance between the camera and the hand. A storer stores reference data for specifying a hand gesture and a command corresponding to the gesture, for each distance between the camera and the hand. A selector selects, from among the reference data stored in the storer, reference data corresponding to the distance indicated by the information acquired by the acquirer. A specifier refers to the reference data selected by the selector, specifies a hand gesture in the captured image input by the inputter, and specifies a command corresponding to the specified gesture. | 02-13-2014 |
20140049461 | FULLY AUTOMATIC SIMULATION SYSTEM OF AN INPUT DEVICE - A fully automatic simulation system for an input device permits storage in advance of executable applications and associated simulation setting files into a database, and then combines detection, automatic data searching and matching, transmission and conversion, enabling rapid and convenient operation by users, whichever applications they operate and whether they adopt a keyboard, mouse or joystick as the simulation controller. | 02-20-2014 |
20140049462 | USER INTERFACE ELEMENT FOCUS BASED ON USER'S GAZE - A computerized method, system for, and computer-readable medium operable to: determine a set of coordinates corresponding to a user's gaze; determine a user interface (UI) element corresponding to the set of coordinates; return that UI element as being detected and again repeat the determination of the set of coordinates corresponding to the user's gaze; determine if the UI element being returned is the same for a predetermined threshold of time according to a started timer; if the UI element is not the same, reset the started timer and again repeat the determination of the set of coordinates corresponding to the user's gaze; and if the UI element is the same, make the UI element active without requiring any additional action from the user and currently select the UI element to receive input. | 02-20-2014 |
20140049463 | FLEXIBLE DISPLAY APPARATUS AND FEEDBACK PROVIDING METHOD THEREOF - A flexible display apparatus is provided. The flexible display apparatus includes a sensor configured to sense bending of the flexible display apparatus, a feedback provider configured to provide feedback according to the sensed bending, and a controller configured to control the feedback provider to provide the feedback when a degree of the sensed bending exceeds a threshold value. | 02-20-2014 |
20140049464 | FLEXIBLE DISPLAY APPARATUS AND CONTROLLING METHOD THEREOF - A flexible display apparatus is provided. The flexible display apparatus includes a display configured to display content on a screen, a sensor configured to detect bending of the display from a first form to a second form, and a controller configured to reconstruct the content based on the bending and to display the reconstructed content in a first screen generated in one region of the display when it is determined that the display is restored to the first form. | 02-20-2014 |
20140049465 | GESTURE OPERATED CONTROL FOR MEDICAL INFORMATION SYSTEMS - The embodiments described herein relate to systems, methods and apparatuses for facilitating gesture-based control of an electronic device for displaying medical information. According to some aspects, there is provided a gesture recognition apparatus comprising at least one processor configured to: receive image data and depth data from at least one camera; extract at least one gesture from the image data and the depth data that is indicative of an activity of an operator within a volume of recognition, the volume of recognition being indicative of a sterile space proximate to the operator; generate at least one command that is compatible with the at least one electronic device based on the extracted at least one gesture; and provide the at least one compatible command to at least one electronic device as an input command. | 02-20-2014 |
20140049466 | METHODS AND SYSTEMS FOR CONTACTLESSLY CONTROLLING ELECTRONIC DEVICES ACCORDING TO SIGNALS FROM A DIGITAL CAMERA AND A SENSOR MODULE - An embodiment of a method for contactlessly controlling an electronic apparatus is performed by a processor of the electronic apparatus. A camera module of the electronic apparatus is turned on to capture a series of images upon detecting that an object is in close proximity to the electronic apparatus. A contactless control procedure is performed while the camera module thereof is in use. The contactless control procedure comprises: determining a control operation according to the captured images; performing the control operation on an electronic device of the electronic apparatus upon obtaining an instruction based on analyzing the captured images; and turning off the camera module responsive to not obtaining an instruction within a predetermined time period. | 02-20-2014 |
20140055337 | DEVICE EYE TRACKING CALIBRATION - Described herein are techniques and mechanisms for device eye tracking calibration. According to various embodiments, a user interface activation screen for activating a user interface may be presented at a computing device. The user interface activation screen may include an eye tracking calibration affordance configured for calibrating eye tracking at the computing device. The eye tracking calibration affordance may be displayed at a designated location on the user interface activation screen. Eye tracking information may be received via an optical sensor at the computing device. The eye tracking information may describe a state of one or both eyes of an individual located proximate to the computing device during activation of the affordance. The eye tracking information may be compared with the designated location to calibrate eye tracking at the computing device. The user interface may be activated. | 02-27-2014 |
20140055338 | GLOVE-BASED USER INTERFACE DEVICE - Interface devices and methods of use include a flexible frame configured to be worn by a user; one or more sensor wire bundles, each including a plurality of sensor wires connected at one end to a control module. The control module is configured to measure electrical changes in the sensor wire bundles and further configured to indicate a direction and speed of motion of a part of the user's body past the one or more sensor wire bundles based on the measured electrical changes in the sensor wire bundles. | 02-27-2014 |
20140055339 | ADAPTIVE VISUAL OUTPUT BASED ON MOTION COMPENSATION OF A MOBILE DEVICE - Systems, storage medium, and methods associated with motion compensation of visual output on a mobile device are disclosed herein. In embodiments, a storage medium may have instructions to enable the mobile device to acquire data associated with motion of an environment in which the mobile device may be situated. The instructions may also enable the mobile device to calculate motion compensation for at least a portion of visual output of an application of the mobile device. The instructions may enable the mobile device to calculate motion compensation based at least in part on the data associated with motion, for use by the application to adapt at least the portion of visual output of the application. Other embodiments may be disclosed or claimed. | 02-27-2014 |
20140055340 | REDUCING DISTORTION IN AN IMAGE SOURCE COMPRISING A PARALLAX BARRIER - A method for reducing distortion from a stereoscopic image source comprising a parallax barrier layer is described. Embodiments include: capturing one or more audience images of at least one viewer with an image capturing device; determining, at a processor, locations of a plurality of eyes of the at least one viewer based on the audience images; determining incidental angles of the plurality of eyes; and selecting at least one distortion correction filter from a plurality of distortion correction filters based on the incidental angles of the plurality of eyes. | 02-27-2014 |
20140055341 | CONTROL SYSTEM AND METHOD THEREOF - A control system for controlling an external device based on images obtained by a control device comprises a detecting module, an obtaining module, and a determining module. The detecting module generates an obtaining signal when the control device establishes a communication with the external device. The obtaining module obtains an image of the current environment around the external device in response to the obtaining signal. The determining module determines whether at least one predefined object is contained in the obtained image. When there is no predefined object contained in the obtained image, the determining module further determines whether the external device is in a first state and generates a first control signal when the external device is not in the first state. The external device switches to the first state, for powering off the backlight of the external device, in response to the first control signal. | 02-27-2014 |
20140055342 | GAZE DETECTION APPARATUS AND GAZE DETECTION METHOD - A gaze detection apparatus includes: a first imaging unit which has a first angle of view and generates a first image; a second imaging unit which has a second angle of view narrower than the first angle of view, and generates a second image; a face detection unit which detects from the first image a face region; a coordinate conversion unit which identifies on the second image a first region corresponding to the face region or to an eye peripheral region containing the user's eye; a Purkinje image detection unit which detects a corneal reflection image of a light source and the center of the user's pupil from within an eye region, identified based on the first region; and a gaze detection unit which detects the user's gaze direction or gaze position based on a positional relationship between the center of the pupil and the corneal reflection image. | 02-27-2014 |
20140055343 | INPUT METHOD AND APPARATUS OF PORTABLE DEVICE - An input apparatus for use in a portable device is provided. The input apparatus includes a camera for capturing an image; a storage unit which stores a key-hand mapping table for mapping segments of a hand to a plurality of keys, respectively, according to predetermined criteria; a display unit displaying the captured image during an input mode; and a control unit which activates the camera in the input mode, controls the display unit to display the captured image, assigns the plural keys to the segments of the hand based on mapping information of the key-hand mapping table, detects an image change in one of the segments displayed on the display unit, and inputs the key mapped to the segment at which the image change is detected on the display unit. | 02-27-2014 |
20140055344 | METHOD OF ESTABLISHING COMMUNICATION LINK AND DISPLAY DEVICES THEREOF - Provided is a method of establishing, by a first display device, a communication link with a second display device, the method including operations of detecting a first bending motion occurring in the first display device; obtaining information about a second bending motion occurring in the second display device; and establishing a communication link for a data exchange with the second display device, based on a start time of the first bending motion and a start time of the second bending motion. | 02-27-2014 |
20140055345 | FLEXIBLE APPARATUS AND CONTROL METHOD THEREOF - A flexible apparatus is provided. The flexible apparatus includes: a sensor configured to sense bending of the flexible apparatus; and a controller configured to, when it is determined based on a result of the sensing that a rubbing gesture of rubbing a plurality of different areas of the flexible apparatus is performed, perform an operation corresponding to the rubbing gesture. | 02-27-2014 |
20140055346 | INPUT DEVICE - An input device for triggering a function of an electronic device comprises a humidity sensor. | 02-27-2014 |
20140055347 | IMAGING TASK PIPELINE ACCELERATION - Systems, methods, and articles of manufacture for imaging task pipeline acceleration are provided. Imaging tasks in a pipeline of a system having heterogeneous processing capabilities, for example, may be configured to increase the speed at which such imaging tasks are accomplished. | 02-27-2014 |
20140055348 | INFORMATION PROCESSING APPARATUS, IMAGE DISPLAY APPARATUS, AND INFORMATION PROCESSING METHOD - According to an illustrative embodiment, an information processing apparatus is provided. The apparatus includes a processor for controlling the displaying of a display object including a main display and an attached information display, and for controlling changing a display state of the attached information display based on movement of a viewer. | 02-27-2014 |
20140055349 | INFORMATION PROCESSING DEVICE, METHOD AND COMPUTER-READABLE NON-TRANSITORY RECORDING MEDIUM - An information processing device includes: a user information acquiring unit which acquires position information on a plurality of users in a range from which a directional display device is viewable; a user operation detecting unit which detects a first operation by a first user and a second operation by a second user; a response acquiring unit which acquires a response for notifying a user that an operation has been detected; and a display control unit which causes a response to the first operation to be displayed by the directional display device so as to be visible from a position indicated by the position information on the first user and which causes a response to the second operation to be displayed by the directional display device so as to be visible from a position indicated by the position information on the second user. | 02-27-2014 |
20140055350 | ENHANCED DETECTION OF GESTURE - The enhanced detection of a waving engagement gesture, in which a shape is defined within motion data, the motion data is sampled at points that are aligned with the defined shape, and, based on the sampled motion data, positions of a moving object along the defined shape are determined over time. It is determined whether the moving object is performing a gesture based on a pattern exhibited by the determined positions, and an application is controlled if determining that the moving object is performing the gesture. | 02-27-2014 |
20140055351 | Rolling Gesture Detection Using an Electronic Device - An electronic device with one or more processors and memory detects a button press of a respective button of a plurality of buttons that include a first button that corresponds to a first type of operation and a second button that corresponds to a second type of operation. The device determines, in conjunction with detecting the button press, a rolling gesture metric corresponding to performance of a rolling gesture comprising rotation about a longitudinal axis of the electronic device. After determining the rolling gesture metric, when the respective button is the first button, the device initiates performance, in a respective user interface, of an operation of the first type in accordance with the rolling gesture metric and when the respective button is the second button, the device initiates performance, in the respective user interface, of an operation of the second type in accordance with the rolling gesture metric. | 02-27-2014 |
20140055352 | Wireless wrist computing and control device and method for 3D imaging, mapping, networking and interfacing - An apparatus and method for light and optical depth mapping, 3D imaging, modeling, networking, and interfacing on an autonomous, intelligent, wearable wireless wrist computing, display and control system for onboard and remote device and graphic user interface control. Embodiments of the invention enable augmentation of people, objects, devices and spaces into a virtual environment and augmentation of virtual objects and interfaces into the physical world through its wireless multimedia streaming and multi-interface display and projection systems. | 02-27-2014 |
20140055353 | HEAD-MOUNTED DISPLAY - A mounting section ( | 02-27-2014 |
20140062851 | METHODS AND APPARATUS FOR DOCUMENTING A PROCEDURE - Methods and apparatus are disclosed for documenting a procedure in a touchless environment. Example methods disclosed herein include identifying a position of one or more of a user's fingertips using a camera of a mobile device and an entropy-based segmentation of an image input, determining a correspondence between user fingertip position and a control interface of the mobile device, including a plurality of functions to be executed based on user input, triggering the mobile device to execute one or more of the plurality of functions based on the user fingertip position during a procedure, and documenting one or more events of the procedure based on the user input for generating a report to be stored in a storage device of the mobile device. | 03-06-2014 |
20140062852 | Proximity-Based Image Rendering - In one embodiment, a method includes adjusting the images or text rendered on a display based on the position of the viewers relative to the display. | 03-06-2014 |
20140062853 | DELAY OF DISPLAY EVENT BASED ON USER GAZE - Methods and systems of delaying the execution of a display event based on a detected user gaze are provided. Display events may be generated and executed to change a user interface of a display. For example, an autocorrect algorithm can automatically replace a typed word with a corrected word in a text field, generating a display event that causes the corrected word to be displayed instead of the typed word. Such a display event may be executed as soon as possible after its generation. However, a gaze detection device can obtain information that indicates a user is not looking at the typed word on the display. In such a situation, it may be more intuitive to delay the execution of the display event until the gaze information indicates that the user is looking at the typed word. | 03-06-2014 |
20140062854 | HEAD MOUNTED DISPLAY AND METHOD OF CONTROLLING DIGITAL DEVICE USING THE SAME - Disclosed is a method of receiving a gesture input of a user using a Head Mounted Display (HMD) and synthetically controlling a digital device using the received gesture input. The method includes detecting whether or not the HMD is worn by a user, detecting a positional state of an external digital device linked with the HMD, the positional state including a first state in which the external digital device is located in a preset view angle region of the HMD and a second state in which the external digital device is not located in the view angle region, detecting a gesture input of the user, determining at least one digital device to be controlled based on whether or not the HMD is worn by the user and the detected positional state, and controlling a display object of the determined digital device under application of the gesture input. | 03-06-2014 |
20140062855 | FITNESS APPARATUS CAPABLE OF PROJECTION - A fitness apparatus includes a framework, a control panel, and a microprojector. The framework includes a chassis and a handrail stand mounted to the chassis. The control panel is mounted to the handrail stand for displaying a variety of physical states in the process of the user's exercise. The microprojector is electrically connected with the control panel for projecting built-in information of the control panel or external information received by the control panel in such a way that the user can conveniently view the information and enjoy more fun. | 03-06-2014 |
20140062856 | FOLDABLE DISPLAY AND IMAGE PROCESSING METHOD THEREOF - A foldable display and an image processing method thereof are disclosed. The foldable display comprises a display module, a memory, a sensor module, and a processing unit. The sensor module senses a bending state of the display module. The processing unit generates adjusted images according to an image signal, and stores those adjusted images to a plurality of memory addresses of the memory. The processing unit selects a reading address from those memory addresses according to the bending state. The processing unit selects a corresponding adjusted image from the memory according to the reading address and outputs the corresponding adjusted image to the display module. | 03-06-2014 |
20140062857 | SMART SIGNAGE SYSTEM - The present patent application is directed to a smart signage system. In one aspect, the smart signage system includes at least one signage device; at least one signage display connected to the at least one signage device and configured to display contents stored in the signage device; and a plurality of client devices in wireless communication with the at least one signage device. Each client device is configured to recognize a gesture from a user in correspondence with a target content displayed on the signage display, and thereby to acquire a file of information associated with the target content from the signage device. | 03-06-2014 |
20140062858 | INFORMATION SYSTEM - An information system includes a camera located at a central position in a horizontal direction in front of two users who face the camera and who are aligned side by side, a gesture recognizing unit configured to recognize a gesture of a hand of a user based on a target image captured by the camera, a determining unit that determines whether a central point in a width direction of an arm in the target image is at a right side or at a left side with respect to a center of the hand in the target image, and an operation-user determining unit configured to determine whether the performer of the gesture is the right-side user or the left-side user. | 03-06-2014 |
20140062859 | SENSING UNIT, FLEXIBLE DEVICE, AND DISPLAY DEVICE - A sensing unit measuring a bending degree of a flexible substrate includes: a first line formed on the flexible substrate; a second line adjacent to the first line; and a first controller applying a first sensing signal to the first line and measuring a change of crosstalk generated in the second line by the first sensing signal according to bending of the flexible substrate. | 03-06-2014 |
20140062860 | SMART SCREEN ROTATION BASED ON USER ORIENTATION - A method for performing smart screen rotation and an electronic device thereof are provided. A method for controlling screen rotation in an electronic device includes determining a direction of the electronic device, and determining a direction of a user, and determining whether to rotate a screen by comparing the direction of the electronic device with the direction of the user. | 03-06-2014 |
20140062861 | GESTURE RECOGNITION APPARATUS, CONTROL METHOD THEREOF, DISPLAY INSTRUMENT, AND COMPUTER READABLE MEDIUM - A gesture recognition apparatus for recognizing a gesture of a hand of a user from a moving image in which action of the hand of the user is photographed is provided, the gesture recognition apparatus comprising: a face detector configured to detect a face of the user; a shape identification part configured to identify whether the hand is a right hand or a left hand; and a performer specification part configured to specify a person, who is closest to the identified hand and whose face is located on a right side of the identified hand, as a performer of the gesture when the identified hand is the right hand, and specify a person, who is closest to the identified hand and whose face is located on a left side of the identified hand, as the performer of the gesture when the identified hand is the left hand. | 03-06-2014 |
20140062862 | GESTURE RECOGNITION APPARATUS, CONTROL METHOD THEREOF, DISPLAY INSTRUMENT, AND COMPUTER READABLE MEDIUM - A gesture recognition apparatus for recognizing a gesture of a user from a moving image in which the user is photographed is provided, the gesture recognition apparatus comprising: a sight line direction estimation part configured to estimate a sight line direction of the user; a determination part configured to determine that the user intends to start the gesture when an angle formed by a first predetermined direction and the sight line direction is less than a predetermined value in a predetermined period; and a notification part configured to notify the user that the determination is made, when the determination part determines that the user intends to start the gesture. | 03-06-2014 |
20140062863 | METHOD AND APPARATUS FOR SETTING ELECTRONIC BLACKBOARD SYSTEM - Provided are a method and apparatus for setting an electronic blackboard system. In response to a user input to a control apparatus requesting setting of the electronic blackboard, sensitivity of an IR camera is set so that visible rays are detected. A projector projects an image with a presentation region and a first guider therein for alignment. The IR camera transmits a first captured image of the presentation region, including at least a portion of the first guider, to the control apparatus. The projector is then controlled to project to the screen at least a portion of a second guider corresponding to the first guider in the first image received from the IR camera. The user may then make positional adjustments to the IR camera or projector using the second guider. | 03-06-2014 |
20140062864 | METHOD AND APPARATUS FOR EXTRACTING THREE-DIMENSIONAL DISTANCE INFORMATION FROM RECOGNITION TARGET - A method and apparatus for extracting three-dimensional distance information from a recognition target is provided, which enables a gesture input from a user to be correctly recognized using distance information from the recognition target, and at the same time makes it possible to efficiently save power required for detection of the gesture input. The method includes determining if a recognition target exists within a predetermined range; when the recognition target exists within the predetermined range, generating a 3D image for the recognition target; and calculating a distance to the recognition target by using the 3D image. | 03-06-2014 |
20140062865 | METHOD AND APPARATUS FOR SELECTIVELY PRESENTING CONTENT - A machine-implemented method includes obtaining input data and generating output data. The status of at least one contextual factor is determined and compared with a standard. If the status meets the standard, a transformation is applied to the output data. The output data is then outputted to the viewer. Through design and/or selection of contextual factors, standards, and transformations, output data may be selectively outputted to viewers in a context-suitable fashion, e.g. on a head mounted display the viewer's central vision may be left unobstructed while the viewer walks, drives, etc. An apparatus includes at least one sensor that senses a contextual factor. A processor determines the status of the contextual factor, determines if the status meets a standard, generates output data, and applies a transformation to the output data if the status meets the standard. A display outputs the output data to the viewer. | 03-06-2014 |
20140062866 | GESTURE RECOGNITION APPARATUS, CONTROL METHOD THEREOF, DISPLAY INSTRUMENT, AND COMPUTER READABLE MEDIUM - A gesture recognition apparatus for recognizing a gesture of a user from a moving image in which the user is photographed is provided, the gesture recognition apparatus comprising: a determination part configured to determine a type of the gesture; and a recognition area definition part configured to define a recognition area, which is an area where the gesture is recognized in a whole area of the moving image, based on the type of the gesture determined by the determination part. | 03-06-2014 |
20140062867 | Electrode Blinking Device - An eyelid movement detector and communication system includes a first eyelid movement sensor for placement near a first eyelid of a first eye, and a second eyelid movement sensor for placement near a second eyelid of a second eye of a subject, wherein the first and second eyelid movement sensors detect movement of the first and second eyelids, respectively, and produce a first eyelid movement signal and a second eyelid movement signal in response to the movement detected. The system further includes a reference sensor located on the subject, and an amplifier for amplifying the eyelid movement signals of the first and second eyelids, wherein the signals are directed to a computer in which they are processed and, as a result, a function of the system may be activated. | 03-06-2014 |
20140062868 | VISUAL DISPLAY WITH ILLUMINATORS FOR GAZE TRACKING - A visual display includes hidden reference illuminators adapted to emit invisible light for generating corneo-scleral reflections on an eye watching a screen surface of the display. The tracking of such reflections and the pupil center provides input to gaze tracking. A method for equipping an LCD with a reference illuminator is also provided. Also provided are a system and method for determining a gaze point of an eye watching a visual display that includes reference illuminators. The determination of the gaze point may be based on an ellipsoidal cornea model. | 03-06-2014 |
20140062869 | SINGLE USER INPUT MECHANISM FOR CONTROLLING ELECTRONIC DEVICE OPERATIONS - A unique input mechanism for controlling several operations of an electronic device is provided. Using the unique input mechanism, which may be the single input mechanism for providing user inputs to the electronic device, a user may provide different inputs or combinations of inputs to control different operations based on the current mode or capacity of the electronic device. For example, a single, short click of a button may control a media operation (e.g., play/pause) in a media mode, and the same input may control a telephony operation (e.g., initiate/terminate call) in a telephony mode. In some embodiments, different inputs may be associated with different types of operations. The unique input mechanism may include, for example, a button, a switch, a key, or an actuator. | 03-06-2014 |
20140062870 | ROTATING AN OBJECT ON A SCREEN - An apparatus ( | 03-06-2014 |
20140071036 | INVOKING A USER ENVIRONMENT BASED ON DEVICE COVER - A device cover includes a device cover identifier. A computing device includes a device cover identifier reader for reading the device cover identifier, and the computing device invokes a user environment associated with the device cover identifier. The user environment can include an arrangement of home screen program icons, executing application programs, computing device notification settings, device or application usernames and passwords, brightness levels, or display screen wallpaper. The device cover identifier can include one or more magnets, an RFID tag, color, intensity, or polarity of light passing through the cover, or textual or graphical information on the cover. | 03-13-2014 |
20140071037 | BEHAVIOR RECOGNITION SYSTEM - A system for recognizing various human and creature motion gaits and behaviors is presented. These behaviors are defined as combinations of “gestures” identified on various parts of a body in motion. For example, the leg gestures generated when a person runs are different than when a person walks. The system described here can identify such differences and categorize these behaviors. Gestures, as previously defined, are motions generated by humans, animals, or machines. Multiple gestures on a body (or bodies) are recognized simultaneously and used in determining behaviors. If multiple bodies are tracked by the system, then overall formations and behaviors (such as military goals) can be determined. | 03-13-2014 |
20140071038 | METHOD FOR GENERATING MOVEMENT POSITION COORDINATE AND HUMAN-MACHINE INTERFACE INPUT SYSTEM USING THE SAME - A method for generating a movement position coordinate adapted to be used in a human-machine interface input controller is provided. The method includes steps of: sequentially generating a plurality of position coordinates by detecting a movement of an object; reading and storing the position coordinates; performing, when the number of the stored position coordinates reaches a predetermined value, an operation on the position coordinates to obtain a movement position coordinate; and modulating the predetermined value in accordance with a change of a movement rate of the object. | 03-13-2014 |
20140071039 | Electronic Apparatus and Display Control Method - According to one embodiment, an electronic apparatus includes at least one sensor, a display processor, and a determiner. The display processor displays a first image corresponding to first content data stored in a first folder on a screen. The determiner determines whether the electronic apparatus moves, using the at least one sensor. If it is determined that the electronic apparatus moves during display of the first image, the display processor displays, on the screen, a second image corresponding to second content data stored in another folder different from the first folder. | 03-13-2014 |
20140071040 | System and method for planning or organizing items in a list using a device that supports handwritten input - A method for generating a list of images associated with items for planning or organizing the items on a device configured for receiving handwritten input is provided. The method includes (i) processing a first handwritten input including a first content associated with a first item, (ii) generating a first image that includes the first content associated with the first item, (iii) processing a second handwritten input including a second content associated with a second item, (iv) generating a second image that includes the second content associated with the second item, (v) generating a list that includes the first image, and the second image, and (vi) displaying the list that includes the first image and the second image. The first image and the second image are stored in a database. | 03-13-2014 |
20140071041 | HEAD-MOUNTED DISPLAY DEVICE, CONTROL METHOD FOR THE HEAD-MOUNTED DISPLAY DEVICE, AND AUTHENTICATION SYSTEM - A head-mounted display device that enables a user to simultaneously visually recognize a virtual image and an outside scene includes a photographing unit configured to photograph at least a part of a visual field direction of the user in a state in which the user wears the head-mounted display device and acquire a motion of the user; a track acquiring unit configured to acquire a track of the motion of the user from the motion photographed by the photographing unit; and an authentication processing unit configured to authenticate, using the track acquired by the track acquiring unit, whether the user is a proper user. | 03-13-2014 |
20140071042 | COMPUTER VISION BASED CONTROL OF A DEVICE USING MACHINE LEARNING - A method for computer vision based control of a device, the method comprising: obtaining a first frame comprising an image of an object within a field of view; identifying the object as a hand by applying computer vision algorithms; storing image related information of the identified hand; obtaining a second frame comprising an image of an object within a field of view and identifying the object in the second frame as a hand by using the stored information of the identified hand; and controlling the device based on the hand identified in the first and second frames. | 03-13-2014 |
20140071043 | METHOD OF EXECUTING APPLICATION, METHOD OF CONTROLLING CONTENT SHARING, AND DISPLAY DEVICE - Provided is a method of executing an application performed by a display device. The method includes detecting a first motion that bends the display device in a first direction to deform the display device into a first form, executing a first application, based on the first motion, detecting a second motion that bends the display device in a second direction to deform the display device into a second form, and executing a second application related to the first application, based on the second motion. | 03-13-2014 |
20140071044 | DEVICE AND METHOD FOR USER INTERFACING, AND TERMINAL USING THE SAME - There are provided a device and method for user interfacing, and a terminal using the same. The user interfacing method includes setting a reference image of an object to be used for user interfacing, recognizing the object to be used for user interfacing from input user-related images, determining depth-related movement of the object by comparing the recognized object and the reference image, and operating an application according to the depth-related movement of the object. Therefore, it is possible to control the terminal using user movement with respect to a distance between the terminal used by the user and the user. | 03-13-2014 |
20140071045 | SYSTEM AND METHOD FOR PRODUCING EDITED IMAGES USING EMBEDDED PLUG-IN - Disclosed is a cross-platform image editor configured with image editing tools, integrated with a software application, and operable on a user computing device. A low-res version of a high-res image stored in an image library is displayed, and a user selection of at least one image editing tool and at least one associated parameter is received. The low-res version is modified accordingly. Without human user intervention, the high resolution image is processed by executing at least one instruction respectively associated with each of the at least one selected image editing tools using the at least one parameter. The modified high resolution image is output. | 03-13-2014 |
20140071046 | SYSTEMS AND METHODS FOR PROCESSING MOTION SENSOR GENERATED DATA - Systems and methods for processing data from a motion sensor to detect intentional movements of a device are provided. An electronic device having a motion sensor may process motion sensor data along one or more dimensions to generate an acceleration value representative of the movement of the electronic device. The electronic device may then determine whether the acceleration value changes from less than a low threshold, to more than a high threshold, and again to less than the low threshold within a particular amount of time, reflecting an intentional movement of the electronic device by the user. In response to determining that the acceleration value is associated with an intentional movement of the electronic device, the electronic device may perform a particular event or operation. For example, in response to detecting that an electronic device has been shaken, the electronic device may shuffle a media playlist. | 03-13-2014 |
20140071047 | PORTABLE DEVICE AND METHOD FOR CONTROLLING THE SAME - A method for controlling a portable device is provided. The method includes detecting bending of the portable device and determining whether to perform motion sensing correction due to the bending; acquiring a motion sensing correction factor for performing the motion sensing correction due to the bending; performing motion sensing correction of at least one motion sensor using the motion sensing correction factor; and controlling the portable device according to the corrected motion sensing. | 03-13-2014 |
20140078037 | SYSTEM AND METHOD OF DEVICE MANAGEMENT ON EXTENSIBLE AND CONFIGURABLE DETECTION OF ELECTRONIC DEVICE INTERACTIONS - A system and method determines a management action as a function of device interactions. The method includes determining a first interaction data generated by an electronic device, the first interaction data being indicative of at least one of a device user interaction, a device environment interaction, and a device state interaction. The method includes generating, by the electronic device, a first notification data as a function of the first interaction data. The method includes transmitting the first notification data from the electronic device to a remote device management server. The method includes determining, by the device management server, a first management action data as a function of the first notification data, the first management action data being indicative of a functionality to be applied to the electronic device. | 03-20-2014 |
20140078038 | SYSTEMS AND METHODS FOR PROVIDING ACCESSORY DISPLAYS FOR ELECTRONIC DEVICES - Systems and methods herein provide for an accessorizeable display of features for a processing device, such as a smart phone, a tablet computer, a laptop computer, or other processing devices. The display may be configured as a case on the “backside” or cover of the processing device that interfaces through a communication port of the processing device. In one embodiment, a system includes a display data module operable on the processing device to provide a graphical user interface via a first display device to a user of the processing device, and to interface with a second display device coupled to the processing device to display an image on the second display device. The system also includes a remote data center operable to retrieve the image and to communicate with the display data module through a network to provide the image to the second display device via the display data module. | 03-20-2014 |
20140078039 | SYSTEMS AND METHODS FOR RECAPTURING ATTENTION OF THE USER WHEN CONTENT MEETING A CRITERION IS BEING PRESENTED - Systems and methods for recapturing attention of the user when content meeting a criterion is being presented are provided. Content is caused to be presented on a display of a user equipment device. A first state of a setting of the user equipment device is stored. A determination is made as to whether the content being presented meets a criterion. A facial feature of the user is processed to determine whether the user is gazing towards the display when the content being presented meets the criterion. In response to determining the user is not gazing towards the display when the content being presented meets the criterion, the first state of the setting of the user equipment device is adjusted to a second state to alert the user about the content. | 03-20-2014 |
20140078040 | DUAL-MODE REMOTE CONTROL METHOD - A dual-mode remote control method, adapted to an electronic apparatus having an image sensor, is provided. In the method, the image sensor is used to successively capture a plurality of images. Then, whether to perform infrared ray (IR) detection and/or gesture detection is determined according to at least one characteristic of the images, and the selected detection is performed on the images, so as to obtain an action of at least one target in the images. Finally, a control command corresponding to the action is executed. | 03-20-2014 |
20140078041 | TRANSMITTANCE BASED SENSOR - A device may include an image sensor configured to capture an image of an object in front of the image sensor, an image analyzer configured to analyze the captured image to calculate a transmittance of a light that is transmitted from outside of the object to the image sensor via the object, a command mapper configured to translate the transmittance into a command, and an executor configured to execute the command. | 03-20-2014 |
20140078042 | ELECTRONIC DEVICE AND THE CONTROLLING METHOD THEREOF - The present invention discloses an electronic device and the controlling method thereof. More specifically, the device and method of the present invention are capable of providing a pre-determined function in a pre-determined situation by inputting and comparing acceleration data generated by an acceleration sensor. Thus, the present invention allows the user to define a three-dimensional action path as the trigger requirement of a pre-determined function. Accordingly, the user is capable of using the electronic device with more flexibility and fewer limitations. | 03-20-2014 |
20140078043 | APPARATUS AND METHOD OF PROVIDING USER INTERFACE ON HEAD MOUNTED DISPLAY AND HEAD MOUNTED DISPLAY THEREOF - An apparatus and method of providing a user interface (UI) on head mounted display and the head mounted display (HMD) thereof are discussed. The apparatus comprises a sensor unit detecting whether an object exists in the proximity of the HMD and if the object is detected, the sensor unit senses a distance between the object and the HMD. The apparatus further comprises a processor controlling a User Interface (UI) of the HMD based on a result of the sensor unit. A physical User Interface (UI) mode is applied if the detected object is within a predetermined distance from the HMD and a non-physical User Interface (UI) mode is applied if the object is not detected or is not within the predetermined distance from the HMD. | 03-20-2014 |
20140078044 | INPUT DEVICE - An input device includes an operation panel that is installed inside a vehicle and is operated by an operating part; a CCD camera that is disposed inside the vehicle to image at least a front side of the operation panel; and a control unit that predicts an action of the operating part based on image information of the CCD camera and performs operation assistance on the operation panel. | 03-20-2014 |
20140078045 | Display Apparatus And Terminal - The embodiments of the present invention provide a display apparatus and a terminal. The display apparatus comprises a display panel ( | 03-20-2014 |
20140078046 | FLEXIBLE DISPLAY APPARATUS AND CONTROL METHOD THEREOF - A flexible display apparatus is provided. The flexible display apparatus includes: a display that is bendable, a sensor configured to sense deformation of the display, and a controller configured to perform an operation corresponding to the sensed shape deformation in response to the sensed deformation being shape deformation in which the display is alternately bent in opposing directions within a predetermined time. | 03-20-2014 |
20140078047 | FLEXIBLE DISPLAY APPARATUS AND DISPLAY METHOD THEREOF - A flexible display apparatus configured to sense deformation of the flexible display apparatus, control display of an object displayed on the flexible display apparatus based on the deformation, and execute operations based on the displayed object. | 03-20-2014 |
20140078048 | METHOD OF RECOGNIZING CONTACTLESS USER INTERFACE MOTION AND SYSTEM THERE-OF - A contactless user-interface (UI) motion recognizing device and method of controlling the same are provided. The method includes: obtaining a left-eye image and a right-eye image; determining an object position of an object in the obtained left-eye image and the obtained right-eye image; determining an object brightness of the object; determining depth information of the object using the determined object brightness; determining a three-dimensional (3D) object position of the object using the determined object position and the determined depth information; determining an object moving velocity based on the determined 3D object position and a previous 3D object position; and determining a UI pattern based on the determined 3D object position and the determined object moving velocity, and executing an operation according to the determined UI pattern. | 03-20-2014 |
20140078049 | MULTIPURPOSE CONTROLLERS AND METHODS - A method and apparatus are disclosed for a user to communicate with an electronic device. A processor receives user intention actions comprising facial expression (FE) information indicative of facial expressions and body information indicative of motion or position of one or more body parts of the user. When the FE or body information crosses a first level, the processor starts generating first signals based on the FE or body information to communicate with the electronic device. When the FE or body information crosses a second level, the processor can end generation of the first signals or modify the first signals. An image processing or eye gaze tracking system can provide some FE information or body information. The signals can modify attributes of an object of interest. | 03-20-2014 |
20140078050 | METHOD AND SYSTEM FOR TRIGGERING AND CONTROLLING HUMAN-COMPUTER INTERACTION OPERATING INSTRUCTIONS - A method and system for triggering and controlling human-computer interaction operating instructions are described. The method pre-stores a mapping relationship between a display mode of a light source in a photograph image frame and a human-computer interaction instruction. While triggering and controlling the system, the method includes the steps of: acquiring the photograph image frame generated by a photographing device; detecting the light source in the photograph image frame and analyzing the display mode of the detected light source; and determining the human-computer interaction instruction corresponding to the display mode of the detected light source, based on the pre-stored mapping relationship, and triggering that instruction. The system comprises a mapping relationship database, a photograph-acquiring module, a light source detecting module, an analyzing module and an instruction-identifying module. The present invention allows the user to implement non-contact human-computer interactions with high precision. | 03-20-2014 |
20140078051 | IMAGE PROCESSING APPARATUS AND IMAGE PROCESSING METHOD - According to this invention, image processes are classified into a plurality of categories, and the result of processes belonging to each category is written in an intermediate buffer prepared for the category. The order of categories is determined in advance, and an output from a process of a preceding category serves as an input to a process of a succeeding category. The preceding category includes image processes such as automatic correction requiring no quick response. The succeeding category includes image processes such as manual correction requiring a quick response. | 03-20-2014 |
20140078052 | Detecting User Input Provided to a Projected User Interface - Apparatuses and methods are provided for determining whether a user intends to provide an input using an image of a control appearing on a surface. An apparatus may include first and second cameras to capture first and second sets of two or more images of the surface, and a unit to determine whether various conditions are true. A first condition is that a particular number of skin-color pixels are present within one cell of the two or more images. A cell location substantially coincides with the image of the control. A second condition is that the skin-color pixels persist for at least a particular time period. A third condition is that a region of skin-color pixels has a substantially rectangular shape. A fourth condition is that corresponding edges of skin-color pixels in the first and second sets of images are within a predefined offset distance. Additional conditions are disclosed. | 03-20-2014 |
20140078053 | CONTROLLER DEVICE, INFORMATION PROCESSING SYSTEM, AND INFORMATION PROCESSING METHOD - A controller device usable for a wider variety of information processes is provided. The controller device is capable of wirelessly communicating with an information processing device. The controller device includes a communication unit, a display unit, and a program executing unit. The communication unit transmits, to the information processing device, operation data obtained based on an operation performed on the controller device, and receives, from the information processing device, image data generated in the information processing device through a process performed based on the operation data. The display unit displays an image represented by the image data received from the information processing device. When a predetermined operation on the controller device is performed, the program executing unit executes a predetermined program and displays at least the image resulting from the execution of the program on the display unit. The communication unit transmits, to the information processing device, data indicating that the predetermined program is being executed. | 03-20-2014 |
20140085177 | METHOD AND APPARATUS FOR RESPONDING TO INPUT BASED UPON RELATIVE FINGER POSITION - A method, apparatus and computer program product are provided to facilitate user input based upon the relative position of the user's fingers. In the context of a method, sensor information is received that is indicative of the position of a first finger relative to a second finger. The first finger may be a thumb such that the sensor information is indicative of the position of the thumb relative to the second finger. In conjunction with the receipt of sensor information, the method receives sensor information from a sensor that is offset from an interface between the first and second fingers so as not to be positioned between the first and second fingers. The method also determines, with a processor, the relative position of the first and second fingers based upon the sensor information and causes performance of an operation in response to the relative position of the first and second fingers. | 03-27-2014 |
20140085178 | METHOD AND APPARATUS FOR CONTROLLING INFORMATION DISPLAY AREAS IN A MIRROR DISPLAY - A mirror display with controllable information display areas. The mirror display includes a reflective viewer side to function as a mirror and an LCD device located on a non-viewer side to provide information display areas, which are viewable through the mirror display on the viewer side. A processor is electronically connected with the LCD device for controlling a position, such as the location and size, of the information display areas on the viewer side of the mirror. The position of the information display areas can be controlled via user input or a signal received from a sensor associated with the mirror. | 03-27-2014 |
20140085179 | DEVICES, METHODS, AND SYSTEMS FOR PROVIDING INTERACTIVITY WITH DIGITAL SIGNS - A device, method, and system for providing interactivity with a digital sign includes an interactive digital sign configured to display information in response to interactions by a viewer. The viewer may interact with the digital sign using sensors of the digital sign or via use of a mobile communication device carried by the viewer. User data may be used by the interactive digital sign to select the information, which may include advertisements, coupons, directions, and other information. The information may be transmitted to the viewer's mobile communication device. | 03-27-2014 |
20140085180 | SELECTING AN INTERACTION SCENARIO BASED ON AN OBJECT - Techniques for selecting an interaction scenario based on an object are described in various implementations. A method that implements the techniques may include receiving, at a computer system and from an image capture device, an image that depicts a viewing area proximate to a presentation device. The method may also include processing the image, using the computer system, to detect a user in the viewing area presenting an object in a manner that indicates desired interaction with the presentation device. The method may also include selecting, using the computer system, an interaction scenario for presentation on the presentation device based on the object. | 03-27-2014 |
20140085181 | MOOD-ACTUATED DEVICE - This document describes techniques and apparatuses for implementing a mood-actuated device. In various embodiments, mood information corresponding to a current mood of a user is received. An emotional state of the user is determined based on the mood information, and a mood-actuated device is controlled to react based on the emotional state of the user. In some embodiments, the mood-actuated device includes a flexible material that is configured to react by changing to a shape based on the emotional state of the user. | 03-27-2014 |
20140085182 | AUTOSTEREOSCOPIC DISPLAY SYSTEM AND CONTROL METHOD THEREOF - At least one characteristic of an object is captured at a first instant and the at least one characteristic of the object is then captured at a second instant. A moving direction and a moving speed of the object are calculated according to the at least one characteristic of the object captured respectively at the first instant and the second instant. A left view image and a right view image are projected to the object and if the moving speed of the object is greater than a threshold, a center point of the left view image and the right view image deviates from a center line of the object. | 03-27-2014 |
20140085183 | HEAD-MOUNTED DISPLAY APPARATUS AND CONTROL METHOD THEREOF - A head-mounted display apparatus and control method are provided. The apparatus includes a display unit, a signal processing unit processing an image signal to display an image on the display unit, a shutter unit performing a shutter operation to allow selective transmission of external light for both eyes of a user, a camera detecting and taking an image of an external environment of the head-mounted display apparatus, and a controller maintaining transmission of the shutter unit transmitting external light at preset first transmittance when the image is displayed on the display unit, and adjusting the transmission of the shutter unit to second transmittance higher than the first transmittance if the camera detects an external device while the image is displayed on the display unit. | 03-27-2014 |
20140085184 | Interaction Techniques for Flexible Displays - The invention relates to a set of interaction techniques for obtaining input to a computer system based on methods and apparatus for detecting properties of the shape, location and orientation of flexible display surfaces, as determined through manual or gestural interactions of a user with said display surfaces. Such input may be used to alter graphical content and functionality displayed on said surfaces or some other display or computing system. | 03-27-2014 |
20140085185 | MEDICAL IMAGE VIEWING AND MANIPULATION CONTACTLESS GESTURE-RESPONSIVE SYSTEM AND METHOD - Viewing and manipulation systems and methods for medical images shown on a display. The method includes the steps of observing a multiple-person medical environment using a camera having a field of view, and sending field-of-view data of the multiple-person medical environment from the camera to a processor. The processor performs the steps of (i) analyzing the field-of-view data to identify a target practitioner and define a target practitioner-based, non-uniform coordinate frame connected to the target practitioner; (ii) monitoring a time-series of the field-of-view data to identify at least one input communicated by a gesture performed by the target practitioner in the target practitioner-based, non-uniform coordinate frame; and (iii) manipulating a medical image shown by the display in response to identifying the at least one input. | 03-27-2014 |
20140085186 | METHOD FOR CONTROLLING MOBILE APPLICATIONS - A method for controlling mobile applications by using a positioning system. In the method the allowed directions of movement are limited to predetermined directions. The method is configured to select one of the predetermined directions based on a measured direction and then a measured speed of movement is adapted to the selected direction and the mobile application is controlled accordingly. | 03-27-2014 |
20140085187 | DISPLAY APPARATUS AND CONTROL METHOD THEREOF - Disclosed are a display apparatus and a control method thereof. The display apparatus includes: an image processor which processes an image of a content including a plurality of scenes in order to display the image; a display which displays an image of the content thereon; a voice input which inputs a user's voice; and a controller which displays a first scene of the plurality of scenes of the content, and displays a second scene falling under a next scene of the first scene, out of the plurality of scenes of the content, in response to a determination that the user's voice, which has been input while the first scene is displayed, corresponds to the first scene. | 03-27-2014 |
20140085188 | APPARATUS AND METHOD FOR PROCESSING SPLIT VIEW IN PORTABLE DEVICE - An apparatus and a method for processing a split view in a portable device. The method of processing a split view in a portable device, includes: displaying a plurality of applications as a split view corresponding to split display regions of a display unit; displaying an input panel on a display region of an application of the plurality of applications that did not call for the input panel when the input panel is called; and processing data input through the input panel by an application calling the input panel. | 03-27-2014 |
20140085189 | LINE-OF-SIGHT DETECTION APPARATUS, LINE-OF-SIGHT DETECTION METHOD, AND PROGRAM THEREFOR - A line-of-sight detection apparatus that specifies a point-of-regard of a subject within an object includes: a photographing unit outputting a photographed image of the subject and a zoom value; a cornea determination unit discriminating a cornea image of the subject from the image; a reference point specifying unit specifying an eyeball center of the subject based on the cornea image and specifying a reference point based on the eyeball center; a distance measurement unit specifying a zoom value indicating a predetermined size of the cornea image and specifying a distance from the cornea to the object based on the zoom value; a line-of-sight movement amount specifying unit specifying a cornea movement amount based on a cornea image movement amount and specifying a line-of-sight movement amount based on the cornea movement amount; and a point-of-regard specifying unit specifying the point-of-regard based on the reference point and line-of-sight movement amount. | 03-27-2014 |
20140085190 | Display, Imaging System and Controller for Eyewear Display Device - Several embodiments of a personal display system are disclosed that comprises modular and extensible features to affect a range of user/wearer/viewer experiences. In one embodiment, the personal display system comprises a frame; a processor capable of sending image data signals and control signals; a display; an optic system, said optic system optically coupled to said at least one display; and a set of actuators, said actuators coupled to said frame and in communication with the optic system, such that said set of actuators are capable of moving the optic system, according to control signals sent from said processor. In another embodiment, a method for pre-warping input image data under processor control and according to ambient conditions, such as light and temperature, is disclosed. | 03-27-2014 |
20140085191 | PERSONAL COMPUTING DEVICE CONTROL USING FACE DETECTION AND RECOGNITION - Systems and methods are provided for control of a personal computing device based on user face detection and recognition techniques. | 03-27-2014 |
20140085192 | IMAGE PROJECTION AND CONTROL APPARATUS AND METHODS - A projected still or video image is controlled with light generated by a laser pointer, for example. A device with a projector and a display controller projects an image onto a surface, and an image sensor views the projected image. A handheld device outputs a beam of light. A controller receives information from the image sensor regarding the presence or movement of light from the handheld device interacting with the projected image on the surface, and a control function is implemented if the light from the handheld device interacts with the projected image in accordance with stored predetermined visual interactions such as encircling, “scratching,” or other movements. The controller may be programmed to recognize light of a specific laser wavelength in conjunction with a control operation. | 03-27-2014 |
20140085193 | PROTOCOL AND FORMAT FOR COMMUNICATING AN IMAGE FROM A CAMERA TO A COMPUTING ENVIRONMENT - A media feed interface may be provided that may be used to extract a media frame from a media feed. The media feed interface may access a capture device, a file, and/or a network resource. Upon accessing the capture device, file, and/or network resource, the media feed interface may populate buffers with data and then may create a media feed from the buffers. Upon request, the media feed interface may isolate a media frame within the media feed. For example, the media feed interface may analyze media frames in the media feed to determine whether a media frame includes information associated with, for example, the request. If the media frame includes the requested information, the media feed interface may isolate the media frame associated with the information and may provide access to the isolated media frame. | 03-27-2014 |
20140085194 | IMAGE-BASED OBJECT TRACKING SYSTEM IN 3D SPACE USING CONTROLLER HAVING MULTIPLE COLOR CLUSTERS - An image-based object tracking system includes at least a controller with two or more color clusters, an input button, a processing unit with a camera, an object tracking algorithm and a display. The camera is configured to capture images of the controller, the processing unit is connected to the display to display processed image contents, and the controller directly interacts with the displayed processed image content. The controller can have two or three color clusters located on a side surface thereof and two color clusters having concentric circular areas located at a top surface thereof; the color of the first color cluster can be the same as or different from the color of the third color cluster. An object tracking method with or without scale calibration is also provided, which includes color learning and color relearning, image capturing, separating and splitting of the controller and the background, and object pairing procedure steps on the controller. | 03-27-2014 |
20140085195 | MOBILE TERMINAL AND OPERATION CONTROL METHOD THEREOF - Discussed are a mobile terminal and an operation control method thereof in which a delay time of the screen lock execution is controlled according to the user's gaze information. The mobile terminal according to an embodiment of the present disclosure may include an input unit configured to receive a user input; an execution controller configured to execute screen lock if the user input is not received for a predetermined time T1; and a change controller configured to change the predetermined time T1 based on the user's gaze information. | 03-27-2014 |
20140085196 | Method and System for Secondary Content Distribution - A secondary content distribution system and method is described, the system and method including a receiver for receiving a plurality of differing versions of secondary content from a provider, each one of the differing versions of the secondary content being associated with at least one of a reading mode, and a connection mode, a processor operative to determine a reading mode of a user of a client device, a selector for selecting one of the differing versions of the secondary content for display on the client device display, the selection being a function, at least in part, of matching the determined reading mode with the reading mode associated with the one of the differing versions of the secondary content and the connection mode of the client device, and a display for displaying the selected one of the differing versions of the secondary content on the client device display. Related methods, systems, and apparatus are also described. | 03-27-2014 |
20140092002 | Movement Based Image Transformation - In some implementations, an image can be presented in an image editing interface and the image can be edited based on movement of the mobile device. In some implementations, the mobile device can be configured to provide a plurality of types of edits (e.g., filters, effects, etc.) that can be applied to the image, a selection of one or more edit types can be received, movement sensor data can be obtained describing movement of the mobile device, and the selected edits can be applied to the image based on the sensor data. | 04-03-2014 |
20140092003 | DIRECT HAPTIC FEEDBACK - An electronic device comprises an input device and logic to register one or more input events and one or more haptic effects associated with the one or more input events for an application on an electronic device, receive an input event, retrieve one or more haptic effects, and pass the one or more haptic effects associated with the input event to a haptics actuator. Other embodiments may be described. | 04-03-2014 |
20140092004 | AUDIO INFORMATION AND/OR CONTROL VIA AN INTERMEDIARY DEVICE - The present disclosure is directed to systems and methods related to audio information and/or control via an intermediary device. For example, a system may comprise a monitor, a peripheral device and a mobile device. The monitor may be configured to present multimedia information based on remote control information received in the monitor over, for example, a wired or wireless HDMI connection. Remote control information may include commands for controlling operation of the monitor when presenting the multimedia information. The peripheral device may be configured to reproduce sound associated with the multimedia information based on audio information received via, for example, wireless communication such as Bluetooth or WLAN. The mobile device may be configured to provide the remote control information to the monitor (e.g., via the wired or wireless HDMI connection) and to provide the audio information to the peripheral device (e.g., via the Bluetooth or WLAN wireless communication). | 04-03-2014 |
20140092005 | IMPLEMENTATION OF AN AUGMENTED REALITY ELEMENT - Systems and methods may provide for an implementation of an augmented reality element. A logic architecture may be employed to coordinate the implementation of the augmented reality element based on an input and in response to when a user is to view the augmented reality element. The logic architecture may also be employed to perform an association between the input and the implementation of an augmented reality element, wherein the association is to be defined by the user. Additionally, the logic architecture may be employed to include a guide input that guides the implementation (for example, the conduct) of the augmented reality element. | 04-03-2014 |
20140092006 | DEVICE AND METHOD FOR MODIFYING RENDERING BASED ON VIEWER FOCUS AREA FROM EYE TRACKING - Devices and methods for modifying content rendered on the display of a computing device as a function of eye focus area include receiving sensor data from one or more eye tracking sensors, determining an eye focus area on the display screen as a function of the sensor data, and adjusting one or more visual characteristics of the rendered content as a function of the eye focus area. Perceived quality of the rendered content may be improved by improving the visual characteristics of the content displayed within the eye focus area. Rendering efficiency may be improved by degrading the visual characteristics of the content displayed outside of the eye focus area. Adjustable visual characteristics include the level of detail used to render the content, the color saturation or brightness of the content, and rendering effects such as anti-aliasing, shading, anisotropic filtering, focusing, blurring, lighting, and/or shadowing. | 04-03-2014 |
20140092007 | ELECTRONIC DEVICE, SERVER AND CONTROL METHOD THEREOF - Provided are a display apparatus, a control method thereof, a server, and a control method thereof. The display apparatus includes: a processor which processes a signal; a display which displays an image based on the processed signal; a command receiver which receives a voice command; a communicator which communicates with a first server; a storage; and a controller which receives, from the first server, a voice recognition command list comprising a voice recognition command and control command information corresponding to the voice recognition command, and stores the received voice recognition command list in the storage, the voice recognition command being among user's voice commands which have successfully been recognized a predetermined number of times or more, determines whether the voice command corresponds to the voice recognition command included in the voice recognition command list, and if so, controls the processor to operate based on the control command information, and if not, transmits the voice command to the first server, receives corresponding control command information from the first server, and controls the processor to operate based on the received control command information. | 04-03-2014 |
20140092008 | ELECTRONIC DEVICE - An electronic device includes a portable computing device and a peripheral device. The portable computing device includes a control center and a first sensor, whereas the peripheral device includes a base, a frame body and a lock set. An input module is located in the base, and the frame body, having a closed position and an uncovered position with respect to the base, is pivotally connected to the base. The lock set, having a released position and a locked position, is located in the frame body and includes a second sensor. The control center detects a relative location between the first sensor and the second sensor and uses the relative location for controlling the statuses of the input module and the portable computing device. | 04-03-2014 |
20140092009 | Methods and Systems for Dynamic Calibration of Movable Game Controllers - A video gaming system includes a wireless controller that senses linear and angular acceleration to calculate paths of controller movement over a broad range of controller motion. The system also includes an electromagnetic alignment element, such as a set of LEDs. The controller includes an additional sensor to sense light from the LEDs over a relatively restricted range of controller motion, and uses this sensed light to dynamically calibrate the controller when the controller passes through the restricted range of motion over which the sensor senses the light. | 04-03-2014 |
20140092010 | IMAGING DISPLAY APPARATUS AND METHOD - An imaging display apparatus, includes: display means for image display; first image signal generation means for generating a display image signal based on a captured image signal captured by an imaging section with a field of view direction of a user being a direction of an object; second image signal generation means for generating a display image signal of an image different from an image of the display image signal generated by the first image signal generation means; and control means for allowing, simultaneously on the display means, display of the image of the display image signal generated by the first image signal generation means and display of the image of the display image signal generated by the second image signal generation means. | 04-03-2014 |
20140098018 | WEARABLE SENSOR FOR TRACKING ARTICULATED BODY-PARTS - A wearable sensor for tracking articulated body parts is described, such as a wrist-worn device which enables 3D tracking of fingers, and optionally also the arm and hand, without the need to wear a glove or markers on the hand. In an embodiment a camera captures images of an articulated part of a body of a wearer of the device and an articulated model of the body part is tracked in real time to enable gesture-based control of a separate computing device such as a smart phone, laptop computer or other computing device. In examples the device has a structured illumination source and a diffuse illumination source for illuminating the articulated body part. In some examples an inertial measurement unit is also included in the sensor to enable tracking of the arm and hand. | 04-10-2014 |
20140098019 | DEVICE DISPLAY LABEL - Devices and/or accessories for devices, and/or packaging associated with the same comprise a display label configured to display different types of information and content relating to the device and/or accessory to a user or viewer. | 04-10-2014 |
20140098020 | MID-GESTURE CHART SCALING - A system may include presentation of a visualization indicating a first plurality of dimension values and a respective function value for each of the first plurality of dimension values, the function values positioned in accordance with an initial scale of a function value axis, and the first plurality of dimension values positioned in accordance with an initial scale of a dimension value axis, detection of an input gesture to change the indicated first plurality of dimension values, and, before completion of the input gesture, determination of a second plurality of dimension values to indicate in the visualization based on the input gesture, determination of an updated scale of the function value axis based on the respective function values for each dimension value of the second plurality of dimension values, and display of the respective function values for each dimension value of the second plurality of dimension values positioned in accordance with the updated scale. | 04-10-2014 |
20140098021 | OPTICAL NAVIGATING APPARATUS AND COMPUTER READABLE MEDIA CAN PERFORM OPTICAL NAVIGATING METHOD - An optical navigating apparatus, which comprises: a light source, for illuminating a surface to generate an image; an image sensor, for catching pictures of the image; and a controller, for computing a first estimating speed of the optical navigating apparatus according to a first picture of the pictures and a second picture after the first picture. The controller controls at least one of the following parameters according to the first estimating speed: a non-illuminating frequency that the light source does not illuminate pictures after the second picture; a non-catching frequency that the image sensor does not catch pictures after the second picture; a computing frequency that the controller computes pictures after the second picture, which are caught by the image sensor; and a searching range for pictures after the second picture. | 04-10-2014 |
20140104156 | Hand-held wireless electronic device with accelerometer for interacting with a display - A device and method for interacting with a display using an accelerometer sensitive to tilt about two perpendicular axes and a third signal having a magnitude responsive to user input. The device and method allow a user to change a characteristic of an object on the display by converting pitch information, roll information, and a linear dimension into Cartesian coordinates for use by the display. | 04-17-2014 |
20140104157 | TRANSPARENT ANTENNAS ON A DISPLAY DEVICE - This disclosure provides systems, methods and apparatus for a display device with at least one transparent antenna. In one aspect, the transparent antenna is formed on a surface of a transparent substrate and may be electrically reinforced with one or more electrically conductive traces. The transparent antenna can avoid substantial interference with images produced by the display device. | 04-17-2014 |
20140104158 | METHOD AND DEVICE FOR NAVIGATING TIME AND TIMESCALE USING MOVEMENTS - A portable electronic device including a calendar application operable to display a calendar application at a first time and first timescale as well as a second time and second timescale different from the first time and first timescale, and an input unit operable to detect a movement of the portable electronic device, wherein the calendar application proceeds to the second time or second timescale from the first time or first timescale upon detection of the movement. | 04-17-2014 |
20140104159 | INPUT DEVICE AND RELATED METHOD - An input device for transmitting a signal between a terminal device and an external device is disclosed in the present invention. The input device includes an input unit, a terminal communication unit, a wireless transmission unit and a processor. The input unit generates an operation signal. The terminal communication unit establishes a connection to the terminal device. The wireless transmission unit establishes a connection to the external device for receiving a first datum signal outputted from the external device. The processor is electrically connected to the input unit, the terminal communication unit and the wireless transmission unit. The processor respectively transmits the operation signal and the first datum signal to the terminal device via the terminal communication unit. | 04-17-2014 |
20140104160 | REMOVABLE PROTECTIVE COVER WITH EMBEDDED PROXIMITY SENSORS - A removable cover for a handheld electronic device, including a protective cover that at least partially covers rear and side surfaces of a handheld electronic device, a plurality of proximity sensors mounted in the cover for detecting user gestures performed outside of the electronic device, a battery, wireless communication circuitry, and a processor configured to operate the proximity sensors, and to operate the wireless communication circuitry to transmit commands to the electronic device based on gestures detected by the proximity sensors. | 04-17-2014 |
20140104161 | GESTURE CONTROL DEVICE AND METHOD FOR SETTING AND CANCELLING GESTURE OPERATING REGION IN GESTURE CONTROL DEVICE - A method for setting and cancelling a gesture operating region in a gesture control device includes steps of capturing at least one image; detecting whether there is a palm in the at least one image; if there is a palm in the at least one image, detecting whether there is a face in the at least one image; if there is a face in the at least one image, setting the gesture operating region according to the palm and the face; and cancelling the gesture operating region when the palm is at rest over a first time period or the palm disappears from the gesture operating region over a second time period. | 04-17-2014 |
20140104162 | Systems And Methods For Transmitting Haptic Messages - Systems and methods for transmitting haptic messages are disclosed. For example, one disclosed method includes the steps of: receiving at least one sensor signal from at least one sensor of a mobile device, the at least one sensor signal associated with a movement of the mobile device, determining a message to be displayed in a user interface based at least in part on the at least one sensor signal, and causing the message to be displayed. | 04-17-2014 |
20140104163 | INFORMATION INPUT APPARATUS - The present invention is directed to the provision of an information input apparatus that can change the projection position of an input image by tracking a target object and can detect an information input to the input image. More specifically, the invention provides an information input apparatus that includes a projection unit which projects an input image, a projection position changing unit which changes the projection position of the input image, a detection sensor which detects the position of a detection target, and an information detection unit which causes the projection position changing unit to change the projection position of the input image by tracking the position of the detection target detected by the detection sensor, and which detects an information input to the input image based on data supplied from the detection sensor. | 04-17-2014 |
20140104164 | Method and Electronic Apparatus for Realizing Virtual Handwriting Input - The embodiments of the present invention provide a method and an electronic apparatus for realizing virtual handwriting input. The method for realizing virtual handwriting input includes: obtaining, by an electronic apparatus, position coordinates of a handwriting apparatus in a virtual handwriting area through monitoring in a real-time manner, where the virtual handwriting area is an area in which a distance-measuring detector in the electronic apparatus may perform effective distance measurement; obtaining a track formed by the position coordinates, and obtaining a handwriting track of an entire handwriting input process when it is detected that the current handwriting input ends; and identifying a character corresponding to the obtained handwriting track and displaying the identified character. With the embodiments of the present invention, handwriting input may be realized simply and conveniently, and may be implemented at low cost. | 04-17-2014 |
20140104165 | SYSTEM AND METHOD FOR DISPLAY OF MULTIPLE DATA CHANNELS ON A SINGLE HAPTIC DISPLAY - A system that produces a haptic effect and generates a drive signal that includes at least two haptic effect signals each having a priority level. The haptic effect is a combination of the haptic effect signals and priority levels. The haptic effect may optionally be a combination of the two haptic effect signals if the priority levels are the same, otherwise only the haptic effect signal with the highest priority is used. The frequency of haptic notifications may also be used to generate a drive signal using foreground and background haptic effect channels depending on whether the frequency ratio exceeds a foreground haptic effect threshold. | 04-17-2014 |
20140104166 | FLEXIBLE PORTABLE DEVICE - A flexible portable device including a display unit for displaying an image, a communication unit for performing communication with an external device, a sensor unit for sensing user input or an environment of the flexible portable device, and a control unit for controlling the flexible portable device and the units of the flexible portable device. Further, the sensor unit includes a motion sensor unit for sensing motion of the flexible portable device and/or motion with respect to the flexible portable device, the flexible portable device has at least one flexible area which is bendable, and the motion sensor unit is located at a first area at which influence on the motion sensor unit when the flexible area is bent is avoided or minimized. | 04-17-2014 |
20140104167 | METHOD AND APPARATUS FOR CONTROLLING AN OUTPUT DEVICE OF A PORTABLE ELECTRONIC DEVICE - According to embodiments described in the specification, a method and apparatus are provided for controlling an output device of a portable electronic device comprising a processor, a first motion sensor, a second motion sensor and an output device. The method comprises: receiving at the processor, from the first motion sensor, first motion data representing movement of an external object relative to the portable electronic device; receiving at the processor, from the second motion sensor, second motion data representing movement of the portable electronic device; generating, at the processor, third motion data based on the first and second motion data, the third motion data representing movement of the external object; and, controlling the output device based on the third motion data. | 04-17-2014 |
20140111414 | Instrumented Apparel for the Collection of Kinematic Motion - Apparatus and methods for repetitive and consistent collection, processing, feedback, storage, communication and use of data generated by one or more sensors and feedback devices embedded in apparel or orthotics worn by a user. | 04-24-2014 |
20140111415 | COMPUTING DEVICE WITH FORCE-TRIGGERED NON-VISUAL RESPONSES - In one example, a method includes receiving, by a computing device, an indication of a detected force applied to the computing device. The method further comprises determining, by the computing device, that the detected force matches a corresponding input that the computing device associates with a corresponding function that is executable by the computing device. The method further comprises generating, by the computing device and in response to determining that the detected force matches the corresponding input, a non-visual output based on the corresponding function. | 04-24-2014 |
20140111416 | ELECTRONIC APPARATUS AND HANDWRITTEN DOCUMENT PROCESSING METHOD - According to one embodiment, an electronic apparatus includes a display processor, a generator and a recognition module. The display processor displays a plurality of strokes with a first line width, the plurality of strokes being input by handwriting. The generator generates a plurality of stroke data corresponding to the plurality of strokes. The recognition module recognizes a character corresponding to one or more strokes of the plurality of strokes using the plurality of stroke data. The display processor displays the one or more strokes with a line width according to a size of the recognized character. | 04-24-2014 |
20140111417 | FLEXIBLE DISPLAY APPARATUS, APPARATUS TO CAPTURE IMAGE USING THE SAME, AND METHOD OF EDITING IMAGE - The present general inventive concept discloses a flexible display apparatus, an image capturing apparatus using the same, and an image editing method. The flexible display apparatus according to the present general inventive concept includes a receiver which receives an image, a displayer whose shape is changeable, a sensor for sensing a change in the shape of the displayer, and a controller which converts an image displayed on the displayer according to the change in the shape of the displayer when such a change is sensed. | 04-24-2014 |
20140111418 | METHOD FOR RECOGNIZING USER CONTEXT USING MULTIMODAL SENSORS - There is provided a method for recognizing a user context using multimodal sensors, and the method includes classifying accelerometer data by extracting candidates for movement feature from the accelerometer data collected from an accelerometer, selecting one or more movement features from the extracted candidates for movement feature based on relevance and redundancy thereof, and then inferring a user's movement type based on the selected movement features using a first time-series probability model; classifying audio data by extracting surrounding features from the audio data collected from an audio sensor and inferring the user's surrounding type based on the extracted surrounding features; and recognizing a user context by recognizing the user context based on either of the movement type or the surrounding type. | 04-24-2014 |
20140111419 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND COMPUTER PROGRAM PRODUCT - According to one embodiment, an information processing apparatus includes: a display comprising a screen; a detector configured to detect a first state of the screen when the screen is faced down; and a controller configured to initiate a security locked mode when the first state is detected. | 04-24-2014 |
20140111420 | DISPLAY APPARATUS AND CONTROL METHOD THEREOF - A display apparatus including a recognition unit to recognize a gaze of a user, a controller to determine whether the recognized gaze is within a predetermined recognition region and to control entry into an interactive mode upon determining that the recognized gaze is within the predetermined recognition region, and a display unit to display an image corresponding to the interactive mode. A user's gaze is tracked to perform entry into an interactive mode, thereby easily achieving entry into the interactive mode and performing more intuitive interaction. In addition, a multi-modal interactive mode including a combination of face recognition, voice recognition, and gaze recognition is performed, thereby performing a more extended interactive mode and accurately determining a user command. As a result, functions are correctly performed, thereby improving user convenience. | 04-24-2014 |
20140111421 | ELECTRICAL DEVICE, IN PARTICULAR A TELECOMMUNICATION DEVICE HAVING A PROJECTION UNIT AND METHOD FOR OPERATING AN ELECTRICAL DEVICE - A telecommunication device has: a projection unit for projecting image information on a projection surface situated outside the telecommunications device, the image information being projected onto the projection surface (i) with the aid of at least one optical element in a projection beam path for sequentially constructing an image, and (ii) for momentarily spatially limiting the projection relative to the total area of the image information on the projection surface; and a detection unit. During projection a movement of the projection surface, e.g., a displacement or a rotation, takes place relative to the projection unit. The detection unit detects the movement of the projection surface with the aid of a reflection signal passing along a detection beam path, the projection beam path and the detection beam path coinciding at least in the area of the at least one optical element. | 04-24-2014 |
20140111422 | CONFIGURED INPUT DISPLAY FOR COMMUNICATING TO COMPUTATIONAL APPARATUS - According to various embodiments, an input device is provided for receiving one of a plurality of commands via the manipulation of one or more fingers of a user and sending output commands to a separate device based on the nature of the manipulation. According to one embodiment, the input device is a hand-held tablet and the separate device is a computer. In one embodiment, the system may be used for editing electronic video or audio content. | 04-24-2014 |
20140111423 | MOBILE SYSTEMS INCLUDING IMAGE SENSORS, METHODS OF OPERATING IMAGE SENSORS, AND METHODS OF OPERATING MOBILE SYSTEMS - A mobile system may comprise a three-dimensional (3D) image sensor on a first surface of the mobile system configured to perform a first sensing to detect proximity of a subject and a second sensing to recognize a gesture of the subject by acquiring distance information for the subject; and/or a display device on the first surface of the mobile system to display results of the first sensing and the second sensing. A mobile system may comprise a light source unit; a plurality of depth pixels; and/or a plurality of color pixels. The light source unit, the plurality of depth pixels, or the plurality of color pixels may be activated based on an operation mode of the mobile system. | 04-24-2014 |
20140111424 | Locomotion System and Apparatus - A locomotion system for use with a virtual environment technology includes a platform configured to support a user, a harness support assembly coupled to the platform and extending upwardly from the platform, and a safety harness configured to be worn by the user. The harness support assembly includes a support halo positioned above the platform and extending about a vertical central axis. The safety harness includes an interface structure moveably coupled to the support halo. | 04-24-2014 |
20140111425 | PORTABLE TERMINAL, CONTROL METHOD, AND PROGRAM - A portable terminal includes: an opening and closing portion that opens and closes; an open and close detecting unit that detects open and close of the opening and closing portion; a first processing unit that executes a first process while the open and close detecting unit is detecting that the opening and closing portion is open; and a second processing unit that executes a second process when the open and close detecting unit detects that the opening and closing portion has been closed, and the second process is a process with respect to an execution result of the first process. | 04-24-2014 |
20140111426 | INFORMATION PROCESSING APPARATUS, INPUT TERMINAL SELECTION METHOD, PROGRAM, AND SYSTEM - Provided is an information processing apparatus including an acquisition unit configured to acquire device information on a plurality of connected input terminals, and a selection unit configured to determine an importance degree of the input terminals based on information concerning detection units of the input terminals included in the device information to select the input terminal to be used depending on the importance degree. | 04-24-2014 |
20140111427 | LifeBoard - Series Of Home Pages For Head Mounted Displays (HMD) That Respond to Head Tracking - To assist with hands-free computing, the Head Mounted Display or Headset Computer utilizes a series of user configurable Home Pages that contain the shortcuts and widgets the user wants. This allows the user to design a user interface environment which gives him the information he wants, in the order he wants. | 04-24-2014 |
20140118239 | VISUAL-SYMBOLIC CONTROL OF REMOTE DEVICES HAVING DISPLAY-BASED USER INTERFACES - Traditional, programmatic automated remote control of computerized devices requires extensive tailoring for each device type and operating system. A visual-symbolic control method enables largely device-agnostic control of any target device with access to display and a means of user input (keyboard, mouse, touchpad, touch-screen, etc). An image-processing daemon analyzes the displayed image and recognizes its component visual entities (windows, icons, buttons, etc.), creates symbolic entities from extracted attributes of the visual entities, and organizes the symbolic entities into a symbolic object model instance. The functional relationships and hierarchies of the visual entities are captured in the arrangement of symbolic entities in the symbolic object model instance. Visual-symbolic control commands act on the symbolic entities, and, where appropriate, the commands are transmitted to the target device as user-like target-input. | 05-01-2014 |
20140118240 | Systems and Methods for Configuring the Display Resolution of an Electronic Device Based on Distance - Systems and methods dynamically configure a display ( | 05-01-2014 |
20140118241 | INPUT LOCATION CORRECTION TABLES FOR INPUT PANELS - One or more input location correction tables are used to compensate for interference introduced into input panels and generate a corrected location based on a sensed location of the input panel. The one or more input location correction tables can include a coarse table and a fine table that stores mappings of intermediate locations mapped to by the coarse table having an accuracy that fails to satisfy a threshold coordinate accuracy. Different environments in which computing device can be situated can result in different interference being introduced, and the one or more input location correction tables can be updated based on the current environment to compensate for the interference introduced in the current environment. | 05-01-2014 |
20140118242 | ELECTRONIC DEVICE AND HANDWRITTEN DOCUMENT DISPLAY METHOD - According to one embodiment, an electronic device includes a storage, a first display processor, an input module, and a second display processor. The storage is configured to store stroke information corresponding to a plurality of strokes. The first display processor is configured to display the plurality of strokes. The input module is configured to input a retrieve key for retrieval of the plurality of strokes. The second display processor is configured to display an enlarged or reduced display area including a stroke part, which is retrieved from the stroke information and corresponds to the retrieve key, and strokes other than the stroke part. | 05-01-2014 |
20140118243 | DISPLAY SECTION DETERMINATION - Glasses may include a pupil movement sensor configured to detect a movement of a pupil within an eye, a processor configured to determine a display section based, at least in part, on the movement of the pupil, and a display configured to display the determined display section. | 05-01-2014 |
20140118244 | CONTROL OF A DEVICE BY MOVEMENT PATH OF A HAND - In the invention, two consecutive movement paths are detected, and displayed content may be manipulated based on the detection of the first movement path of the hand; however, once a first movement path and a second movement path are detected, manipulation of the displayed content by the second movement is prevented. Thus, detection of a pair of movement paths prevents unintentional activation of a device. | 05-01-2014 |
20140118245 | INFORMATION DISPLAYING APPARATUS AND INFORMATION DISPLAYING METHOD - Provided is an information displaying apparatus in which, even when an image movement operation is received, it is possible to move an image while excluding an image desired by a user. The apparatus is able to receive a predetermined selection movement operation for selecting a part or the whole of the currently displayed image as an image to be moved, and a predetermined selection operation for selecting a part of the currently displayed image as a movement prohibition image. When the predetermined selection movement operation is received in a state where the movement prohibition image has been selected by the predetermined selection operation, the apparatus executes display control to move the image to be moved while the movement prohibition image remains displayed at its current display position. | 05-01-2014 |
20140118246 | GESTURE RECOGNITION USING AN ELECTRONIC DEVICE INCLUDING A PHOTO SENSOR - An electronic device including a photo sensor and a gesture recognition module, and method for the electronic device are provided. The photo sensor emits infrared light, receives reflected light from an external object to which the infrared light has been emitted, and generates reflected light data. Then, the gesture recognition module determines a motion or a cover state of an external object as a predefined gesture input type using the reflected light data. The gesture input type includes two or more types of motions or cover inputs indicating different cover states of different durations. | 05-01-2014 |
20140118247 | CONTROL APPARATUS, CONTROL METHOD, PROGRAM, INPUT SIGNAL RECEIVING APPARATUS, OPERATION INPUT APPARATUS, AND INPUT SYSTEM - There is provided a control apparatus including an input signal receiving unit that receives a first input signal generated in accordance with an operation on a first operation receiving unit provided on a first surface of an operation input apparatus and a second input signal generated in accordance with an operation on a second operation receiving unit provided on a second surface, which is different from the first surface, of the operation input apparatus, an opposing information acquisition unit that acquires opposing information indicating which surface of the first surface and the second surface is opposed to an operator of the operation input apparatus, and a processing controller that changes one of first processing corresponding to the first input signal and second processing corresponding to the second input signal based on the acquired opposing information. | 05-01-2014 |
20140118248 | REAL-TIME MOTION RECOGNITION SYSTEM AND METHOD - A system and method that may sense and recognize a motion of a user is provided. The system and method may recognize a variety of motions of the user based on sensing data received from a remote controller. | 05-01-2014 |
20140118249 | ENHANCED CAMERA-BASED INPUT - Enhanced camera-based input, in which a detection region surrounding a user is defined in an image of the user within a scene, and a position of an object (such as a hand) within the detection region is detected. Additionally, a control (such as a key of a virtual keyboard) in a user interface is interacted with based on the detected position of the object. | 05-01-2014 |
20140125574 | USER AUTHENTICATION ON DISPLAY DEVICE - Embodiments are disclosed that relate to authenticating a user of a display device. For example, one disclosed embodiment includes displaying one or more virtual images on the display device, wherein the one or more virtual images include a set of augmented reality features. The method further includes identifying one or more movements of the user via data received from a sensor of the display device, and comparing the identified movements of the user to a predefined set of authentication information for the user that links user authentication to a predefined order of the augmented reality features. If the identified movements indicate that the user selected the augmented reality features in the predefined order, then the user is authenticated, and if the identified movements indicate that the user did not select the augmented reality features in the predefined order, then the user is not authenticated. | 05-08-2014 |
20140125575 | TECHNIQUES FOR UTILIZING A COMPUTER INPUT DEVICE WITH MULTIPLE COMPUTERS - A method includes analyzing data associated with an image captured by a camera to determine a visual orientation and/or eye gaze movement of a user of a computer. Upon detecting that the visual orientation indicates that the user is visually oriented toward a display device associated with the computer, a wireless connection from the computer to a computer input device and from the computer input device to the computer is established or maintained. Upon detecting that the visual orientation indicates that the user is not visually oriented toward the display device associated with the computer, the wireless connection is terminated. Another method accomplishes similar objectives using proximity sensors. | 05-08-2014 |
20140125576 | STORAGE MEDIUM, INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING SYSTEM AND INFORMATION PROCESSING METHOD - A non-limiting example game system includes a game apparatus to which a television functioning as a stationary-type display device is connected. The game apparatus performs game processing basically according to operation data from a controller. An input terminal device functioning as a portable-type display device is kept at a user's hand, for example. A course selecting screen is displayed on a screen of the television and an information presenting screen corresponding to the course selecting screen is displayed on a screen of the input terminal device. In the information presenting screen, a comment input by another user is displayed near a course mark, and a designating object which designates the presence of the comment is displayed at a corresponding position in the course selecting screen. | 05-08-2014 |
20140125577 | DISTANCE BASED MODELLING AND MANIPULATION METHODS FOR AUGMENTED REALITY SYSTEMS USING ULTRASONIC GLOVES - User input gloves and input methods are described that are well suited to provide input to computer modeling (e.g., CAD) and augmented reality (AR) systems, including wearable AR and spatial AR. Each glove comprises palm mounted ultrasonic transducers, accelerometers, finger based pinch inputs and a wireless communication module. The gloves can be used to measure distances over the natural range of distances that hands can be placed, as well as their orientation, with sufficient resolution to facilitate a range of gesture based input methods to be developed and utilized, including distance-based modeling by measurement. Further, the gloves are lightweight, allow fast input of modeling measurements, are easy to use, and reduce fatigue compared to existing glove based input systems. The user input gloves and associated input techniques can be used to measure small and body sized objects using one or two hands, and large objects can be measured using single handed measurements. Further, models for both small and large objects can be generated and manipulated through the use of a numeric input technique to obtain an amplification factor to magnify the effective distances measured. | 05-08-2014 |
20140125578 | Display Panel And Operation Control Method Thereof, And Display Device - A display panel, an operation control method thereof, and a display device for improving users' experience of display products, wherein the display panel comprises a flexible substrate ( | 05-08-2014 |
20140125579 | HEAD MOUNTED DISPLAY, MOTION DETECTOR, MOTION DETECTION METHOD, IMAGE PRESENTATION SYSTEM AND PROGRAM - Disclosed herein is a head mounted display including: an enclosure accommodating a presentation section adapted to present a three-dimensional image and located in front of the eyes of a viewer when worn on the head of the viewer; and an imaging element provided in the enclosure and adapted to turn light external to the enclosure into an image, in which the imaging element images light that is vertically downward relative to the enclosure and forward in the direction of line of sight of the viewer when the enclosure is worn on the head of the viewer. | 05-08-2014 |
20140125580 | METHOD AND DEVICE FOR PROVIDING INFORMATION REGARDING AN OBJECT - A method of providing information related to an object is provided. The method includes sensing a gesture related to the object located in a certain area via a first device, obtaining content related to the object according to the sensed gesture, and transmitting the obtained content to a second device that is connected with the first device in a network, wherein the content is output via the second device. | 05-08-2014 |
20140125581 | Individual Task Refocus Device - The subject matter disclosed herein provides methods for detecting a user's loss of focus while reading content from an electronic device, paper, or other medium and prompting the user to refocus his/her concentration. This method can maintain a user profile that includes alert parameters. The alert parameters can include a prompt timer period and a type of prompt to be generated. The prompt timer period can specify a maximum allowable gaze time before one or more prompts are generated. The method can determine whether data received from a device tracking a movement by one or both eyes of a user is representative of active reading of content viewed by the user. One or more prompts can be generated based on the determining. Related apparatus, systems, techniques, and articles are also described. | 05-08-2014 |
20140125582 | Gesture Recognition Apparatus and Method of Gesture Recognition - One embodiment of the invention discloses a gesture recognition apparatus including a left source configured to generate a left basic signal, a right source configured to generate a right basic signal, a detector configured to detect the left basic signal, the right basic signal and Doppler shift signals of the left and right basic signals after reflection by a hand, the detector disposed between the left source and the right source, and a processor configured to process signals from the detector and recognize a gesture. A method of gesture recognition is also disclosed. | 05-08-2014 |
20140125583 | VEHICULAR DISPLAY SYSTEM - A vehicular display system includes a mobile device having a mobile display part and adapted to be disposed in a predetermined position in a vehicle. The mobile display part is configured to display contents such that at least one of a display size and a display position of each of the contents is changed in accordance with positions of eyes of a driver sitting on a driver's seat in the vehicle with respect to a reference position in the vehicle. Preferably, the mobile device has an imaging unit for picking up an image and recognizes the positions of the driver's eyes and the reference position from the image picked up by the imaging unit. The vehicular display system may further include a vehicle on-board unit fixedly mounted in the vehicle. | 05-08-2014 |
20140125584 | SYSTEM AND METHOD FOR HUMAN COMPUTER INTERACTION - A system and method for human computer interaction (HCI) may acquire image data, determine an interaction intended to be conducted by a user based on various gestures and poses of the user detected from the image data, and perform an operation and/or function which is displayed on a display of a display unit, in response to a result of the interaction. | 05-08-2014 |
20140125585 | COMPUTER DEVICE OPERABLE WITH USER'S EYE MOVEMENT AND METHOD FOR OPERATING THE COMPUTER DEVICE - The present invention relates to a method for operating a computer device with user's eye movement. The method comprises the steps of detecting the user's eye movement, analyzing the user's eye movement to specify an eye movement pattern in the detected user's eye movement and a time period for completing the eye movement pattern, determining a command associated with a combination of the eye movement pattern and the time period and operating the device according to the command. | 05-08-2014 |
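The entry above maps a combination of an eye-movement pattern and the time taken to complete it onto a device command. A minimal sketch of such a lookup, with hypothetical pattern names, time buckets, and commands (the abstract does not specify any of these):

```python
# Illustrative sketch only: the pattern labels, the 1-second time
# threshold, and the command names are assumptions, not the patented
# implementation.

COMMANDS = {
    ("left-right-left", "fast"): "next_page",
    ("left-right-left", "slow"): "scroll_down",
    ("circle", "fast"): "zoom_in",
}

def time_bucket(seconds, threshold=1.0):
    """Classify how quickly the eye-movement pattern was completed."""
    return "fast" if seconds < threshold else "slow"

def command_for(pattern, seconds):
    """Return the command for (pattern, completion time), or None."""
    return COMMANDS.get((pattern, time_bucket(seconds)))
```

The same pattern traced quickly or slowly resolves to different commands, which is the distinguishing element the abstract claims.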
20140125586 | Haptic Automated Communication System - A haptic communication system having a range of sensors embedded in an operator's attire. Data collected by the sensors is processed by a computing device local to the operator and is communicated via a haptic modality in real-time to other team members and robotic assets in the system. | 05-08-2014 |
20140125587 | APPARATUSES AND METHODS FOR PROVIDING A 3D MAN-MACHINE INTERFACE (MMI) - An electronic apparatus having at least two camera devices and a processing device. The processing device: (1) determines a first length between an object positioned at a first time and a surface formed by the two camera devices, (2) determines a second length between the object positioned at a second time and the surface, (3) determines a third length between the object positioned at a third time and the surface, and (4) determines a depth in a virtual space corresponding to the object positioned at the third time according to the first length, the second length, and the third length. In operation, the third time is later than the first time and the second time, and the third length is longer than the first length and shorter than the second length. | 05-08-2014 |
20140132497 | TOUCHABLE MOBILE REMOTE CONTROL WITHOUT DISPLAY - A mobile remote control is provided. The mobile remote control includes a touch surface without a display and is adapted to an electronic device with a display. Each touch event on the touch surface is reflected on the display directly relative to the position and nature of the event on the touch surface. The size of the touch surface is small enough so that the user's fingers can wrap around it. The sensitivity of the palm makes it possible for the user to know the relative position of his finger on the touch surface without gazing at it. | 05-15-2014 |
20140132498 | REMOTE CONTROL USING DEPTH CAMERA - Embodiments for using a depth camera to emit remote control signals are provided. In one example embodiment, a depth camera includes an infrared light to emit infrared light at a physical space, an infrared camera to receive infrared light from the physical space, an imaging interface to output a depth map derived from the infrared light, and a control interface to receive an instruction indicating a remote control signal to emit from the infrared light. | 05-15-2014 |
20140132499 | DYNAMIC ADJUSTMENT OF USER INTERFACE - Embodiments related to dynamically adjusting a user interface based upon depth information are disclosed. For example, one disclosed embodiment provides a method including receiving depth information of a physical space from a depth camera, locating a user within the physical space from the depth information, determining a distance between the user and a display device from the depth information, and adjusting one or more features of a user interface displayed on the display device based on the distance. | 05-15-2014 |
20140132500 | METHOD AND APPARATUS FOR RECOGNIZING LOCATION OF MOVING OBJECT IN REAL TIME - Provided is an apparatus for recognizing a location of a moving object in real time. The apparatus includes a lighting portion including a lighting unit for emitting a plurality of optical signals spatially separated in different patterns; an electronic tattoo portion that is attached on the moving object, wherein the electronic tattoo portion includes at least one electronic tattoo unit including a solar cell for detecting each of the plurality of optical signals emitted from the lighting portion, a wireless antenna for wirelessly transmitting the plurality of optical signals detected by the solar cell, and a controller for controlling the transmission of the plurality of optical signals; and a location recognition portion for recognizing the location of the moving object from the plurality of optical signals wirelessly transmitted from the electronic tattoo portion. | 05-15-2014 |
20140132501 | METHOD AND APPARATUS FOR PROJECTING PATTERNS USING STRUCTURED LIGHT METHOD - This specification provides a method of projecting patterns using a structured light method and a stereo vision apparatus using the same. The apparatus includes a Laser Diode (LD), an optical splitter configured to receive a light source from the LD and copy a plurality of light sources having identical characteristics with the light source of the LD, and a plurality of diffusers configured to receive the plurality of light sources copied by the optical splitter and optically project patterns. | 05-15-2014 |
20140132502 | METHOD FOR QUICKLY SELECTING A FUNCTION FROM A SMART DEVICE - A method for quickly selecting a function from a smart device having a CPU and motion sensor. The method includes the steps: detecting an action mode of the smart device by the motion sensor and transmitting an action signal corresponding to the action mode to the CPU; matching the action signal to a predetermined function corresponding to the action signal by the CPU; and executing the predetermined function by the CPU according to an execution signal. In the method, the action data corresponding to the action modes of the smart device are built in an action database, and the predetermined function corresponding to the action mode of the smart device is executed in accordance with the action data by means of the motion sensor of the smart device which detects the action mode of the smart device so that the predetermined function can be quickly switched to and then executed. | 05-15-2014 |
20140132503 | Optical Control of Display Screens - A display can be modified to be activated remotely with optical signals. A laser diode of one frequency can be used for tracking and targeting, with another frequency used to trigger the equivalent of a touch, click, drag, or other conventional input commands. Signal processing correlates trigger signals of specific commands, which can be expanded to include multi-click, multi-touch, gestures, and even custom commands specific to individual users. Modern capacitive touch screens can be upgraded with a transparent film overlay to provide these capabilities, which has never before been possible. Related systems, apparatus, methods, and articles are also described. | 05-15-2014 |
20140132504 | METHOD FOR CONTROLLING AND DISPLAYING ELECTRONIC APPARATUS - Methods and systems for enabling a first device to present a user interface on a second device are described. The first and second devices are able to communicate via a network. A change is detected in an operational state of the first device. Responsive to the detected change, a message is sent from the first device. The message identifies computer readable instructions stored on the first device. The message is received at second device and, responsive to the receiving, the computer readable instructions are retrieved from the first device. The second device then uses the computer readable instructions whereby to present a user interface at the second device, at least a part of the user interface being determined by the computer readable instructions retrieved from the first device. | 05-15-2014 |
20140132505 | Multimodal interactions based on body postures - In one example, a method for multimodal human-machine interaction includes sensing a body posture of a participant using a camera. | 05-15-2014 |
20140132506 | BENDING THRESHOLD AND RELEASE FOR A FLEXIBLE DISPLAY DEVICE - A flexible display device and method for detecting a bend on the device are discussed. The method includes detecting a first bend on the flexible display device and measuring a first degree of the first bend; recognizing the first bend as a first valid flex input when the measured first degree of bend surpasses a first error threshold value; and recognizing an end of the first valid flex input when the measured first degree of bend falls below a first release threshold value, wherein a maximum degree of bend measured during the first bend is recognized as a first peak value, wherein the first error threshold value is determined based on a degree before the flexible display device is bent and the first release threshold value is determined based on the first peak value, and wherein the first release threshold value is different from the first error threshold value. | 05-15-2014 |
20140132507 | BLUETOOTH OR OTHER WIRELESS INTERFACE WITH POWER MANAGEMENT FOR HEAD MOUNTED DISPLAY - A headset computer that includes a wireless front end that interprets spoken commands and/or hand motions and/or body gestures to selectively activate subsystem components only as needed to carry out specific commands. | 05-15-2014 |
20140132508 | Electronic Devices With Gaze Detection Capabilities - An electronic device may have gaze detection capabilities that allow the device to detect when a user is looking at the device. The electronic device may implement a power management scheme using the results of gaze detection operations. When the device detects that the user has looked away from the device, the device may dim a display screen and may perform other suitable actions. The device may pause a video playback operation when the device detects that the user has looked away from the device. The device may resume the video playback operation when the device detects that the user is looking towards the device. Gaze detector circuitry may be powered down when sensor data indicates that gaze detection readings will not be reliable or are not needed. | 05-15-2014 |
20140132509 | VIRTUAL INPUT SYSTEM - For a user having a user input actuator, a virtual interface device, such as for a gaming machine, for determining actuation of a virtual input by the input actuator is disclosed. The device comprises a position sensing device for determining a location of the user input actuator and a controller coupled to the position sensing device, the controller determining whether a portion of the user input actuator is within a virtual input location in space defining the virtual input. | 05-15-2014 |
20140139420 | HUMAN INTERACTION SYSTEM BASED UPON REAL-TIME INTENTION DETECTION - A system for human interaction based upon intention detection. The system includes a sensor for providing information relating to a posture of a person detected by the sensor, a processor, and a display device. The processor is configured to receive the information from the sensor and process the received information in order to determine if an event occurred. This processing includes determining whether the posture of the person indicates a particular intention, such as attempting to take a photo. If the event occurred, the processor is configured to provide an interaction with the person via the display device such as displaying a message or the address of a web site. | 05-22-2014 |
20140139421 | DEVICE HAVING VARIABLE-INPUT SELECTOR FOR ELECTRONIC BOOK CONTROL - This document describes techniques and apparatuses concerning e-book-reading devices having integral variable-input manual selectors through which selection to control an electronic book is enabled. One example of this control includes navigating pages of an electronic book through flipping pages. Control of this page flipping can be made through a single implement of the variable-input manual selector, such as to slow, speed up, stop, or reverse the page flipping. | 05-22-2014 |
20140139422 | User Gesture Input to Wearable Electronic Device Involving Outward-Facing Sensor of Device - In one embodiment, a wearable apparatus includes a sensor, a processor coupled to the sensor, and a memory coupled to the processor that includes instructions executable by the processor. When executing the instructions, the processor detects by the sensor movement of at least a portion of an arm of a user; detects, based at least in part on the movement, a gesture made by the user; and processes the gesture as input to the wearable apparatus. | 05-22-2014 |
20140139423 | Gesture Recognition Module and Gesture Recognition Method - A gesture recognition module for recognizing a gesture of a user is disclosed. The gesture recognition module comprises an image capturing unit, for capturing a first pixel value, a second pixel value, a third pixel value and a fourth pixel value sequentially of images of the gesture of the user; a computing unit, coupled to the image capturing unit, for determining a first minimum surrounding shape comprising a first pixel difference between the first pixel value and the second pixel value, and determining a second minimum surrounding shape comprising a second pixel difference between the third pixel value and the fourth pixel value; and a determining unit, coupled to the computing unit, for determining the gesture according to a relation between the first minimum surrounding shape and the second minimum surrounding shape. | 05-22-2014 |
20140139424 | FACIAL EXPRESSION CONTROL SYSTEM, FACIAL EXPRESSION CONTROL METHOD, AND COMPUTER SYSTEM THEREOF - A facial expression control system, a facial expression control method, and a computer system thereof are disclosed. The facial expression control system includes a face detection module, a database, and a processing module. The face detection module is used for determining whether a facial expression feature is detected in a captured image. The database is used for storing a control parameter table, wherein the control parameter table is corresponding to the facial expression feature. If the face detection module detects the facial expression feature, the control parameter table of the database is accessed according to the facial expression feature to get an encode signal and transmitted to the processing module. The processing module is used for generating a control signal according to the encode signal. | 05-22-2014 |
20140139425 | IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, IMAGE CAPTURE APPARATUS AND COMPUTER PROGRAM - An image processing apparatus includes a usage state determination portion determining a usage state of a camera portion; and an object detection portion detecting an object from an image captured by the camera portion using a plurality of methods, in which the object detection portion detects the object from the captured image by prioritizing the plurality of methods based on the usage state. | 05-22-2014 |
20140139426 | SmartLight Interaction System - The conference room automation apparatus employs a processor-based integrated movement sensor, lights, cameras, and display device, such as a projector, that senses and interprets human movement within the room to control the projector in response to that movement and that captures events occurring in the room. Preferably packaged in a common integrated package, the apparatus employs a layered software/hardware architecture that may be readily extended as a platform to support additional third-party functionality. | 05-22-2014 |
20140139427 | DISPLAY DEVICE - A portable display device includes a display unit, a first capturing unit, and a second capturing unit. The display unit includes a rectangular display screen for displaying an image. The first capturing unit is configured to capture an image of an object. The first capturing unit is arranged in a region, corresponding to a first side of the display screen, which is a part of a peripheral region of the display unit other than the display screen. The second capturing unit is configured to capture an image of the object. The second capturing unit is arranged in a region, corresponding to a second side adjacent to the first side, which is a part of the peripheral region. | 05-22-2014 |
20140139428 | DEVICE AND METHOD FOR DETECTING POSITION CODE - A position code consists of a plurality of cyclic main number sequences that are each shifted by a predetermined amount and arranged in a one-dimensional direction, each of the cyclic main number sequences consisting of cycles of a main number sequence that has a base number b and a length K and having K types of characteristic partial sequences with a predetermined length. A detection section is configured to detect, from each of at least two adjacent cyclic main number sequences among the plurality of cyclic main number sequences, a partial sequence having a length N and satisfying “K+K*(b−1)*N≦b | 05-22-2014 |
20140139429 | SYSTEM AND METHOD FOR COMPUTER VISION BASED HAND GESTURE IDENTIFICATION - The invention relates to a method for computer vision based hand gesture device control, which includes receiving 2D and 3D image information of a field of view which includes at least one user. An area of the user's hand is determined based on the 3D information and a shape of the user's hand is determined based on the 2D information. The detected shape of the hand and the position of the hand are then used to control a device. | 05-22-2014 |
20140145925 | DEVICE AND METHOD FOR PERFORMING A FUNCTIONALITY - A device and method performs a predefined functionality. The method includes generating a first view including selectable elements corresponding to a predefined functionality with first position data and first orientation data shown on a display device. The method includes determining a movement of the display device relative to a user. The method includes generating a second view including at least one of the selectable elements with second position data and second orientation data based on the first image data and the movement data, the second position and orientation data being the same as the first position and orientation data, respectively, relative to the user. The method includes selecting one of the at least one of the selectable elements of the second view as a function of the second position data and the second orientation data and executing the predefined functionality corresponding to the selected one of the selectable elements. | 05-29-2014 |
20140145926 | DISPLAY APPARATUS WITH STICTION REDUCTION FEATURES - A display apparatus includes a plurality of electromechanical systems (EMS) devices disposed on a first surface defined by a face of a substrate. Each EMS device includes a component which is movable in a plane that is substantially parallel to the first surface. The apparatus also includes a second surface positioned proximate to the substrate such that the plurality of EMS devices are located between the first surface and the second surface. In addition, each EMS device includes at least one anti-stiction projection positioned between the movable component and the second surface. | 05-29-2014 |
20140145927 | Method for Providing Identification Information and Related Electronic Device - A method for providing identification information of an electronic device having a screen comprises demonstrating a plurality of selecting objects via the screen when a first button of the electronic device is detected to be activated for a predefined period, and displaying the identification information of the electronic device via the screen when a first selecting object of the plurality of selecting objects is triggered. | 05-29-2014 |
20140145928 | ELECTRONIC APPARATUS AND DATA PROCESSING METHOD - According to one embodiment, an electronic apparatus includes a display processor, a determiner, and a presentation module. The display processor displays a plurality of strokes on a screen. The determiner determines whether a stroke for which a first process is executable exists in at least a part of a plurality of first strokes designated by a selected range on the screen. The presentation module presents to a user that the first process is executable, if it is determined that the stroke for which the first process is executable exists in at least the part of the plurality of first strokes. | 05-29-2014 |
20140145929 | CROSS-USER HAND TRACKING AND SHAPE RECOGNITION USER INTERFACE - Embodiments include vision-based interfaces performing hand or object tracking and shape recognition. The vision-based interface receives data from a sensor, and the data corresponds to an object detected by the sensor. The interface generates images from each frame of the data, and the images represent numerous resolutions. The interface detects blobs in the images and tracks the object by associating the blobs with tracks of the object. The interface detects a pose of the object by classifying each blob as corresponding to one of a number of object shapes. The interface controls a gestural interface in response to the pose and the tracks. | 05-29-2014 |
20140145930 | Computer Interface Employing a Manipulated Object with Absolute Pose Detection Component and a Display - A system with a remote control or wand equipped with a relative motion sensor that outputs data indicative of a change in position. The system has one or more light sources and a photodetector that detects their light and outputs data indicative of the detected light. One or more controllers are used to determine the absolute position of the wand based on the data output by the relative motion sensor and by the photodetector. The wand's absolute pose is determined from the data and includes the absolute position of a reference point on the wand and the wand's absolute orientation. A display is used to show an image defined by two orthogonal axes, e.g., those of world coordinates. | 05-29-2014 |
20140145931 | APPARATUS AND METHOD FOR CONTROLLING MULTI-MODAL HUMAN-MACHINE INTERFACE (HMI) - Provided are an apparatus and method for controlling a multi-modal human-machine interface (HMI) by generating a multi-modal control signal based on voice information and gesture information of a user, selecting an object from among at least one object recognized in a direction of a line of sight (LOS) of the user based on the multi-modal control signal, and displaying object-related information associated with the selected object based on the multi-modal control signal. | 05-29-2014 |
20140145932 | CONTROL SYSTEM FOR NAVIGATING A PRINCIPAL DIMENSION OF A DATA SPACE - Systems and methods are described for navigating through a data space. The navigating comprises detecting a gesture of a body from gesture data received via a detector. The gesture data is absolute three-space location data of an instantaneous state of the body at a point in time and physical space. The detecting comprises identifying the gesture using the gesture data. The navigating comprises translating the gesture to a gesture signal, and navigating through the data space in response to the gesture signal. The data space is a data-representational space comprising a dataset represented in the physical space. | 05-29-2014 |
20140145933 | DISPLAY AND METHOD CAPABLE OF MOVING IMAGE - A display capable of moving an image determines whether an input signal is coordinate information according to a gesture of a hand or an external input signal. The display calculates, if the input signal is coordinate information, the coordinate that is moved according to the gesture of the hand. After determining whether the hand gesture is a lateral movement, the display generates, if the hand gesture is a lateral movement, a control signal to move a second image mover to the left or right side. A rotator that is included in the display moves the second image mover according to the control signal and displays the image at a moved position through the moved second image mover. | 05-29-2014 |
20140145934 | PERCEPTUAL REACTION ANALYZER, AND METHOD AND PROGRAM THEREOF - A perceptual reaction analyzer transmits content to the terminals, receives the perceptual reaction information, generates perceptual reaction change information, estimates the presence/absence of interest of the users based on the perceptual reaction change information to classify the users into groups corresponding to the presence/absence of the interest, generates a certainty level which indicates a degree of certainty of the presence/absence of interest, and tries, for a low certainty user, an operation on the content corresponding to the perceptual reaction by which the same presence/absence of interest of the low certainty user is estimated again, based on the perceptual reaction information of a user of which presence/absence of interest is the same as the low certainty user. The perceptual reaction information receiving, the perceptual reaction change information generating and the user grouping are performed after the trial processing, so as to re-estimate the presence/absence of interest of the low certainty user. | 05-29-2014 |
20140145935 | SYSTEMS AND METHODS OF EYE TRACKING CONTROL ON MOBILE DEVICE - Methods and systems to facilitate eye tracking control on mobile devices are provided. An image of a portion of a user is received at an eye tracking device, where the image includes reflections caused by light emitted on the user from one or more light sources located within the eye tracking device. One or more eye features associated with an eye of the user is detected using the reflections. Point of regard information is determined using the one or more eye features, where the point of regard information indicates a location on a display of a computing device coupled to the eye tracking device at which the user was looking when the image of the portion of the user was taken. The point of regard information is sent to an application capable of performing a subsequent operation using the point of regard information. | 05-29-2014 |
20140145936 | METHOD AND SYSTEM FOR 3D GESTURE BEHAVIOR RECOGNITION - A method for 3D gesture behavior recognition is disclosed, which includes detecting a behavior change of one or more attendees at a meeting and/or conference; classifying the behavior change; and performing an action based on the behavior change of the one or more attendees. Another method, system and computer readable medium for 3D gesture behavior recognition as disclosed, includes obtaining temporal segmentation of human motion sequences for one or more attendees; determining a probability density function of the temporal segmentations of the human motion sequences using a Parzen window density estimation model; computing a bandwidth for determination of a median absolute deviation; updating the Parzen window to adapt for changes in the motion sequences for the one or more attendees; and detecting actions. | 05-29-2014 |
20140145937 | METHOD AND DEVICE FOR AUTOMATIC CONTROL - A method for automatic control is provided, which is applied to an electronic apparatus including a first controller, a second controller, a first input apparatus and a first display unit. The first input apparatus is rotatablely connected to the first display unit. When the electronic apparatus is in a sleeping state, the second controller is in a sleeping state and the first controller is in an operating state. The method includes: obtaining a first angle between the first input apparatus and the first display unit in the case that the electronic apparatus is in the sleeping state; determining whether the first angle satisfies a first preset condition; and controlling the first input apparatus to be in a turn-off state by the first controller in the case that the first angle satisfies a first preset condition. | 05-29-2014 |
20140145938 | ELECTRONIC DEVICE AND INFORMATION PROCESSING METHOD THEREOF - An electronic device and an information processing method thereof are provided. When a state of a relative position between a first body and a second body is an open state and a sensor is started, an initial angle between the first body and the second body that is formed through support of the connector is collected by the sensor, a variation between a first position of the first body relative to the second body and a second position to which the first body moves relative to the second body under an external force is collected by the sensor, and finally a display parameter is controlled according to the variation. Therefore, the solutions provided by the invention do not need an additional auxiliary apparatus such as a keypad/pedal thereby making the operation easier and more convenient for the operator. | 05-29-2014 |
20140145939 | BIDIRECTIONAL DISPLAY AND TRIGGERING THEREOF - The present invention relates to a bidirectional display having a two-dimensional display array comprising a plurality of light-generating pixels and a two-dimensional camera array comprising a plurality of light-detecting elements, wherein the two arrays can each be electrically triggered line by line and are preferably interleaved at least in some sections, featuring electrical triggering of the display array and of the camera array wherein, during light generation in a line of the display array, light detection with that line of the camera array which is closest to said line is deactivated, i.e. line-sequential electrical triggering of the bidirectional display. | 05-29-2014 |
20140145940 | Display dimming in response to user - In some embodiments a detector is to detect a body of a user. A controller is to determine an area of focus of the user in response to the detector, and to dim a portion of a display that is not in the area of focus. Other embodiments are described and claimed. | 05-29-2014 |
20140145941 | COMPUTER VISION GESTURE BASED CONTROL OF A DEVICE - A system and method are provided for controlling a device based on computer vision. Embodiments of the system and method of the invention are based on receiving a sequence of images of a field of view; detecting movement of at least one object in the images; applying a shape recognition algorithm on the at least one moving object; confirming that the object is a user hand by combining information from at least two images of the object; and tracking the object to control the device. | 05-29-2014 |
20140145942 | VIDEO REPRODUCTION APPARATUS AND VIDEO REPRODUCTION METHOD - According to one embodiment, a video reproduction apparatus includes an image generator, a motion recognizer, a marker generator and image synthesizer. The image generator is configured to generate a first pair of images with a difference in visual field for an operational object. The motion recognizer is configured to recognize three-dimensional gesture of a user. The marker generator is configured to identify three-dimensional designated coordinates based on the gesture recognized by the motion recognizer, and generate a second pair of images with a difference in visual field for a marker corresponding to the designated coordinates. The image synthesizer is configured to synthesize the first pair of images with the second pair of images to generate a third pair of images. | 05-29-2014 |
20140145943 | SENSOR-BASED USER INTERFACE CONTROL - Methods and devices for sensor-based user interface control are disclosed. In one embodiment, a method for determining a characteristic of handedness includes sensing a rotation of a mobile device, determining a direction of rotation based at least in part on accessing information indicative of a first position state prior to sensing the rotation and accessing information indicative of a second position state subsequent to sensing the rotation, and determining the characteristic of handedness based at least in part on the direction of rotation, the first position state, and the second position state. The characteristic of handedness includes one of a left handedness or right handedness. The method further includes determining a user interface mode based on the characteristic of handedness determined, and controlling the mobile device in accordance with the user interface mode determined. | 05-29-2014 |
20140152537 | METHOD AND DEVICE FOR IDENTIFYING CONTACTLESS GESTURES - Methods and devices for identifying contactless gestures are described. In one example, the present disclosure describes a method of detecting a contactless gesture on an electronic device. The electronic device has an electromagnetic radiation transmitter and an electromagnetic radiation receiver. The electromagnetic radiation receiver is configured for receiving electromagnetic radiation emitted from the electromagnetic radiation transmitter and reflected by an object. The method includes: monitoring an amplitude of received electromagnetic radiation at the electromagnetic radiation receiver; detecting a proximity event by comparing the amplitude to a predetermined proximity threshold; after detecting the proximity event, continuing to monitor the amplitude of the received electromagnetic radiation at the electromagnetic radiation receiver; and, in response to detecting the proximity event, performing an analysis on the received electromagnetic radiation to determine whether the received electromagnetic radiation indicates a predetermined gesture. | 06-05-2014 |
20140152538 | View Detection Based Device Operation - Methods and apparatuses for peripheral device operation are disclosed. In one example, a user viewing direction is detected corresponding to the user viewing a first display or a second display. Responsive to the user viewing direction, a peripheral device is operated with a first device or a second device. | 06-05-2014 |
20140152539 | APPARATUS AND METHOD FOR AN INFRARED CONTACTLESS GESTURE SYSTEM - An apparatus, a method, and a computer program product are provided. The apparatus generates at least one signal in response to a contactless gesture performed proximate to an infrared proximity sensor situated on the optical device, the contactless gesture corresponding to a command, identifies the command using the at least one signal, and executes the command. | 06-05-2014 |
20140152540 | GESTURE-BASED COMPUTER CONTROL - Instead of controlling a computer by means of a hardware controller (e.g., keyboard, mouse, remote control, etc.), the present application provides systems and methods for a user to control a computer by performing certain body movements or postures (herein referred to as “gestures”) that are recognized by the computer and translated into computer commands. | 06-05-2014 |
20140152541 | Card-Stack Interface - In one embodiment, a method includes displaying one of a number of graphical user interfaces (GUIs) of one or more applications as a card on top of a card stack. One or more of the cards in the card stack corresponds to a GUI of a home screen of the computing device. Each of one or more of the cards in the card stack corresponds to one of the GUIs of an application. The applications control presentation of their GUIs as cards in the card stack. The method also includes receiving user input to display another one of the GUIs as the card on top of the card stack; and, in response to the user input, displaying the other one of the GUIs as the card on top of the card stack. | 06-05-2014 |
20140152542 | APPARATUS, SYSTEM AND METHOD FOR FLEXIBLE TACTILE COMPUTER INPUT - An apparatus, system, and method are disclosed for flexible tactile computer input. A surface module may define a pressure-sensitive surface, any physical surface, or a virtual surface of a flexible input device, the surface having a default shape, reshaping the surface in response to physical proximity to one or more external objects, including one or more bodily appendages of a user, and restoring the default shape when the surface is not in physical proximity to the objects. A sensing module may sense manipulation of the flexible input device by the bodily appendages in proximity to the surface. An analysis module may interpret the manipulation of the flexible input device as input to a computer. | 06-05-2014 |
20140152543 | SYSTEM, DATA PROVIDING METHOD AND ELECTRONIC APPARATUS - According to one embodiment, a system includes a receiver, a controller, and a transmitter. The receiver receives a plurality of stroke data, which correspond to a plurality of strokes handwritten on first contents in a plurality of terminals, from the plurality of terminals. The controller stores the plurality of stroke data in a storage medium, the plurality of stroke data associated with an identifier of a terminal which is a sender. The transmitter transmits to a first terminal of the plurality of terminals one or more first stroke data of the plurality of stroke data. The one or more first stroke data are stroke data associated with an identifier of a terminal other than the first terminal. | 06-05-2014 |
20140152544 | DISPLAYING SYSTEM, DISPLAY CONTROLLER, STORAGE MEDIUM AND METHOD - An exemplary display system includes: a first display device; a second display device; and a display control unit that controls one of the first display device and the second display device to display a map image corresponding to a given position, and controls the other of the first display device and the second display device to display a partial image which is a part of a panoramic image corresponding to the position. | 06-05-2014 |
20140152545 | DISPLAY DEVICE AND NOTIFICATION METHOD - According to one embodiment, a display device includes: a display; an audio output module; a recognition module configured to recognize a time-series motion of a user; an issuing module configured to issue an operating instruction when the recognized time-series motion is a predetermined time-series motion; and a notification module configured to, when the time-series motion is recognized, notify the user of the recognized time-series motion by at least one of outputting of audio and displaying of an image on the display, regardless of whether the operating instruction is issued based on the recognized time-series motion. | 06-05-2014 |
20140152546 | CONTROL SYSTEM FOR NAVIGATING A PRINCIPAL DIMENSION OF A DATA SPACE - Systems and methods are described for navigating through a data space. The navigating comprises detecting a gesture of a body from gesture data received via a detector. The gesture data is absolute three-space location data of an instantaneous state of the body at a point in time and physical space. The detecting comprises identifying the gesture using the gesture data. The navigating comprises translating the gesture to a gesture signal, and navigating through the data space in response to the gesture signal. The data space is a data-representational space comprising a dataset represented in the physical space. | 06-05-2014 |
20140152547 | USER ACCESS CONTROL BASED ON HANDHELD DEVICE ORIENTATION - Disclosed is a novel system, computer program product, and method for allowing access to an application on a handheld device, also known as logging on or password entry. The method begins by detecting a change in at least one of orientation and position of a handheld device relative to a given plane. Input from at least one of a keyboard, a touch screen, a gesture, and a voice recognition engine is received. When the combination of the orientation and/or position of the handheld device and the received user input matches a previously stored value, access to an application running on the handheld device is unlocked. The detecting of the change in orientation or position, or both, can occur simultaneously with the user input, previous to the user input, or after the user input. | 06-05-2014 |
20140152548 | DATA PROCESSING APPARATUS AND RECORDING MEDIUM - A data processing apparatus includes an operation definition management unit that manages definition data in which a pattern of a predetermined operation input including a specification of an object to which an effect is generated is corresponded with an effect; an operation input accepting unit that accepts an operation input by a user; an operation input recognizing processing unit that recognizes the operation input accepted by the operation input accepting unit by referring to the definition data; and an effect control unit that performs the effect of the definition data corresponding to the operation input recognized by the operation input recognizing processing unit. | 06-05-2014 |
20140152549 | SYSTEM AND METHOD FOR PROVIDING USER INTERFACE USING HAND SHAPE TRACE RECOGNITION IN VEHICLE - A system and method of manipulating a user interface using hand shape trace recognition within a vehicle includes receiving, by a controller, an input of a passenger image and recognizing the hand shape trace information from the passenger image. In addition, the controller is configured to select a vehicle device manipulation that corresponds to the recognized hand shape trace. Therefore, while a passenger manipulates the steering wheel with one hand and looks ahead, various electronic devices within the vehicle can be controlled with a motion of one hand. | 06-05-2014 |
20140152550 | PRECISION POSITION TRACKING DEVICE - Embodiments of a lightweight and compact wireless precision position tracking device and a precision position tracking motion capture system are described. Optionally, the wireless precision position tracking device is configured to be worn by a user. The wireless precision position tracking device may be configured to emit optical light from two or more respective markers, where the light from one of the markers is distinguishable from light from another of the markers. | 06-05-2014 |
20140152551 | VEHICLE GESTURE RECOGNITION SYSTEM AND METHOD - Embodiments of vehicle gesture recognition systems and methods are disclosed. An example vehicle gesture recognition system comprises a data interface configured for receiving 2d image data from a 2d sensor and/or from a portable device camera via a portable device interface. Additionally or alternatively, the data interface is configured for receiving gesture data indicating a gesture. A vehicle processing unit is configured for controlling user interfacing with a user interface based on the gestures recognized from the 2d image data and/or as indicated by the gesture data. | 06-05-2014 |
20140152552 | DETECTING USER INTENT TO REMOVE A PLUGGABLE PERIPHERAL DEVICE - A method includes receiving, from a three-dimensional (3D) sensing device coupled to a computer, a sequence of 3D maps including at least part of a hand of a user positioned in proximity to the computer. In embodiments of the present invention, the computer is coupled to one or more peripheral devices, and upon identifying, in the sequence of 3D maps, a movement of the hand toward a given peripheral device, an action preparatory to disengaging the given peripheral device is initiated. | 06-05-2014 |
20140152553 | METHOD OF DISPLAYING CONTENT AND ELECTRONIC DEVICE FOR PROCESSING THE SAME - A method in an electronic device comprises, when a display unit is deformed, determining a display area visible to a user, and displaying content on the determined display area. An apparatus for displaying content on an electronic device comprises a processing circuit configured to detect that a flexible display unit is bent, recognize at least a part of a user's body, determine which display area is visible to the user based on the user position, and move at least part of the content to the visible display area. | 06-05-2014 |
20140152554 | PORTABLE DEVICE AND METHOD FOR CONTROLLING THE SAME - A method for controlling a portable device is provided. The method includes detecting bending of the portable device and determining whether to perform motion sensing correction due to the bending; acquiring a motion sensing correction factor for performing the motion sensing correction due to the bending; performing motion sensing correction of at least one motion sensor using the motion sensing correction factor; and controlling the portable device according to the corrected motion sensing. | 06-05-2014 |
20140152555 | PORTABLE DEVICE AND METHOD FOR CONTROLLING THE SAME - A method for controlling a portable device is provided. The method includes detecting bending of the portable device and determining whether to perform motion sensing correction due to the bending; acquiring a motion sensing correction factor for performing the motion sensing correction due to the bending; performing motion sensing correction of at least one motion sensor using the motion sensing correction factor; and controlling the portable device according to the corrected motion sensing. | 06-05-2014 |
20140152556 | STEREOSCOPIC IMAGE DISPLAY APPARATUS - To enable a viewer to recognize a stereoscopic image even when the posture of the viewer changes in front of a display device, a stereoscopic image display apparatus is provided, including: a lens sheet which is adjacent to a display surface of a display device, and is configured by arranging a plurality of plano-convex lenses along an arrangement direction of display elements, consecutively side by side while varying a distance from the display surface; a storage unit storing stereoscopic display images corresponding to parallax points with respect to a display object at a plurality of viewing points; a sensing unit which senses a posture changing amount of a visual organ of a human body; a selection unit selecting the stereoscopic display images for the respective viewing points according to the sensed posture changing amount; and a display controller causing the stereoscopic display images to be displayed on the display device. | 06-05-2014 |
20140152557 | GESTURE RECOGNITION DEVICE, ELECTRONIC APPARATUS, GESTURE RECOGNITION DEVICE CONTROL METHOD, CONTROL PROGRAM, AND RECORDING MEDIUM - A gesture recognition device that recognizes a gesture, which is of a motion or a shape of a gesture recognition target region of a person, from an image captured by a camera and outputs information to an electronic apparatus in order to control the electronic apparatus based on the recognized gesture, has a recognition target region determination section that determines whether the gesture recognition target region is included in the image, and an output section that outputs target region out-of-view angle notification instruction information issuing an instruction to make a notification that an image of the gesture recognition target region is not captured when the recognition target region determination section determines that the gesture recognition target region is not included in the image. | 06-05-2014 |
20140160001 | MIXED REALITY PRESENTATION - Embodiments that relate to presenting a mixed reality environment via a mixed reality display device are disclosed. For example, one disclosed embodiment provides a method for presenting a mixed reality environment via a head-mounted display device. The method includes using head pose data to generally identify one or more gross selectable targets within a sub-region of a spatial region occupied by the mixed reality environment. The method further includes specifically identifying a fine selectable target from among the gross selectable targets based on eye-tracking data. Gesture data is then used to identify a gesture, and an operation associated with the identified gesture is performed on the fine selectable target. | 06-12-2014 |
20140160002 | MOBILE DEVICE, SYSTEM AND METHOD FOR CONTROLLING A HEADS-UP DISPLAY - A mobile device, system and method for controlling a heads-up display device is provided. A mobile device is in communication with a heads-up display (HUD) device. The mobile device is enabled to: transmit a portion of display data to the HUD device for display thereupon, rather than provide the portion to a display of the mobile device; and display a remaining portion of the display data at the display. The HUD device is enabled to: receive from the mobile device the display data for display at the HUD; display the data at the HUD. | 06-12-2014 |
20140160003 | Accelerometer-Based Biometric Data - A user can provide input to a device in a 2-dimensional manner (such as by entering the data using a stylus or finger to apply pressure to a touchscreen) or in a 3-dimensional manner (such as by moving the device in 3-dimensional space). While the user input is being received, biometric data in the form of accelerometer data is collected. The accelerometer data indicates an amount of force applied in each of one or more directions over time. The collected accelerometer data thus provides an indication of the manner in which the device was moved, whether intentionally or unintentionally, while the user input was being received. The collected accelerometer data can be used in various manners, such as to verify the user input, as a signature of the user, as an authentication value to access a service or device, and so forth. | 06-12-2014 |
20140160004 | USE OF PHYSICIAN EYE TRACKING DURING A PROCEDURE - A method for displaying information, including presenting on one or more monitors a plurality of distinct elements in a first spatial relationship with one another, the plurality of distinct elements being illustrative of a state of a medical procedure. The method also includes measuring gaze directions, towards the plurality of the elements, of an implementer of the medical procedure while the procedure is being performed. In response to the measured gaze directions, the plurality of distinct elements are rearranged on the one or more monitors into a second spatial relationship with one another. | 06-12-2014 |
20140160005 | APPARATUS AND METHOD FOR CONTROLLING GAZE TRACKING - An apparatus and a method that control gaze tracking are provided. The apparatus includes an imaging device that has an image sensor divided into a plurality of zones and a gaze tracking controller. The gaze tracking controller is configured to determine a lens PSF and an image sensor PSF of the camera, respectively, using images for each zone acquired from image elements disposed in each zone of the image sensor. In addition, the gaze tracking controller is configured to estimate illumination reflection points using an imaging device PSF determined from the determined lens and image sensor PSFs at the time of detecting a user gaze using the imaging device. | 06-12-2014 |
20140160006 | PROJECTION SYSTEM AND METHOD FOR CONTROLLING THE SAME - A projection system includes an indicating device and a projection device, and the indicating device includes a button. When the button is pressed for the first time, the indicating device is configured to obtain a corresponding coordinate value and output a coordinate signal based on the coordinate value, and the projection device is configured to receive and process the coordinate signal to obtain and store the coordinate value. When the button is pressed for the second time in a first predetermined time, the projection device is configured to output a simulating double click signal to a computer. A method for controlling the projection system is also disclosed herein. | 06-12-2014 |
20140160007 | ELECTRONIC APPARATUS, METHOD OF CONTROLLING THE SAME, AND COMPUTER-READABLE RECORDING MEDIUM - An electronic apparatus includes a physical button to be manipulated by a predetermined manipulation, a sensor to measure ambient illuminance when an operation mode of the electronic apparatus is a power saving mode or a power off mode and the physical button is manipulated, and a controller to perform an operation corresponding to the predetermined manipulation of the physical button when it is determined that the ambient illuminance measured by the sensor is greater than or equal to a predetermined illuminance value. | 06-12-2014 |
20140160008 | OPTICAL INPUT DEVICE AND OPERATING METHOD THEREOF - An optical input device includes a light source module, a scan unit, a detection unit and a processing unit. The light source module generates a detection beam. The scan unit drives the detection beam to scan multiple frames in an input detection space. When scanning the frames, the detection unit detects a reflected detection beam and outputs a corresponding detection signal. The processing unit generates a process result corresponding to each of the frames according to the detection signal, generates an operation instruction according to the process results, and outputs the operation instruction to a peripheral device to execute the operation instruction. | 06-12-2014 |
20140160009 | AUDIO BOOK FOR PEN-BASED COMPUTER - A system for producing audio output from interaction with printed material and a pen-based computer system. The system includes a printed page including a substantially invisible position code and a human viewable image. The system further includes a pen-based computer system for determining a position of the human viewable image in response to interactions there between, e.g., based upon the pen tip's proximity to the image. The pen-based computer is operable to produce a human-audible output corresponding to the human viewable image contemporaneously with the interaction. The human viewable image may be produced independent of the pen-based computer and may include a textual word and/or an image. A plurality of pages may be provided in book form allowing a plurality of audio recordings to be rendered in connection with a plurality of images, thus providing an interactive audio book experience for the user. | 06-12-2014 |
20140160010 | MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME - A mobile terminal including a wireless communication unit configured to wirelessly communicate with at least one other terminal; a body provided with a front surface, a rear surface and a lateral surface; a display unit disposed on at least the front surface and configured to display screen information; a squeeze sensing unit configured to sense a pressure applied to the body to detect a squeeze operation; and a controller configured to receive a control command for controlling a preset first operation of the mobile terminal, perform the first preset operation based on receiving only the control command and not detecting the squeeze operation, and perform a second preset operation different than the first preset operation based on receiving both the received control command and the detected squeeze operation. | 06-12-2014 |
20140160011 | APPARATUS AND METHOD FOR CHECKING GAZE OBJECT - An apparatus for checking a gaze object includes an eye detector configured to detect eyes from a captured image of a user who stares at objects while the objects displayed on a display move at certain frequencies, a frequency converter configured to check a motion of the detected eyes and convert the motion of the eyes into an eye moving frequency, and a discriminating unit configured to compare the eye moving frequency and moving frequencies of the objects and discriminate an object at which the user stares depending upon whether or not the eye moving frequency is identical with the moving frequencies of the objects. | 06-12-2014 |
20140160012 | AUTOMATIC CORRECTION DEVICE OF VEHICLE DISPLAY SYSTEM AND METHOD THEREOF - An automatic correction device of a vehicle display system and method thereof, comprising the following steps: first, transforming an image of the road ahead into projection information to calculate a coordinate model; then detecting facial features of a driver to calculate the driver's face rotation angle and the three-dimensional positions of the facial features, in order to estimate the field-of-view position of the driver looking ahead. The field-of-view position is then substituted into an image overlap projection transformation formula to generate an overlap error correction parameter, which is used to correct and update the coordinate model. Finally, a display unit projects the corrected coordinate model in front of the driver, so that the projection position of the coordinate model overlaps the field-of-view position of the driver. | 06-12-2014 |
20140160013 | SWITCHING DEVICE - There is provided a switching device including an image sensor, a processing unit and a transmission unit. The image sensor successively captures image frames. The processing unit is configured to generate a control signal when sequentially recognizing an open hand gesture and a closed fist gesture or sequentially recognizing a closed fist gesture and an open hand gesture according to the image frames within a predetermined time interval. The transmission unit is configured to transmit the control signal to an electronic device. | 06-12-2014 |
20140160014 | DISPLAY SYSTEM CONTAINING AN ADAPTIVE SEMI-TRANSPARENT DISPLAY DEVICE AND MEANS FOR DETECTING THE LANDSCAPE VIEWED BY THE USER - The general field of the invention is that of display systems comprising an electronic calculator containing first image generation means and an associated semi-transparent display device, said display device intended to be arranged in front of an outside landscape. The display device according to the invention consists of two overlaid semi-transparent flat display screens, a photo-sensitive sensor and a position detection system. The first display screen is passive, its transmission rate being controlled by the first image generation means, the second display screen is active, its light emission being controlled by the first image generation means. The image generated is displayed either on the first display screen, or on the second display screen as a function of the outside landscape luminance information output by the photo-sensitive sensor and the position detection. | 06-12-2014 |
20140160015 | MANIPULATION INPUT DEVICE AND MANIPULATOR SYSTEM HAVING THE SAME - A manipulation input device includes a master grip having a grip part and manipulation handles movably supported on the grip part, a spring configured to generate manipulation resistance in response to a displacement amount of the manipulation handles when the manipulation handles are manipulated, and a force magnitude adjusting unit configured to adjust the force magnitude of the manipulation resistance relative to the displacement amount. | 06-12-2014 |
20140160016 | PORTABLE ELECTRONIC EQUIPMENT WITH AUTOMATIC CONTROL TO KEEP DISPLAY TURNED ON AND METHOD - Apparatus and method provide a hold-on function in a mobile phone or other portable electronic equipment that keeps the display turned on, in spite of a screensaver or other power-saving feature, while a user is reading or watching the display, and allows activation of the power-saving facilities only when the user is no longer reading or watching the display. Also provided is a computer program product stored in a storage medium, including a computer program with face recognition software to recognize whether an input image represents a human face, and a control program to control operation of the portable electronic equipment depending on whether or not an input image represents a human face. | 06-12-2014 |
20140168053 | TEST AND MEASUREMENT INSTRUMENT USER INTERFACE WITH MOVE MODE - An apparatus, system, and method are described for providing an intuitive user interface on a test and measurement instrument. The test and measurement instrument can include container logic, which provides a work mode in which interactions with objects within a container on a display are allowed, and a move mode in which interactions with the objects within the container on the display are temporarily prevented. When in the move mode, the container logic can detect a dragging gesture associated with the container. In response to the dragging gesture, a preview container arrangement is provided overlaying the container arrangement. The container logic can detect a dropping indication, thereby causing the arrangement to snap to the preview container arrangement. Various other user interface controls are provided while in the move mode. In multi-user environments, customized container arrangements may be saved and then later recalled. Containers may be moved among multiple different displays. | 06-19-2014 |
20140168054 | AUTOMATIC PAGE TURNING OF ELECTRONICALLY DISPLAYED CONTENT BASED ON CAPTURED EYE POSITION DATA - A method of controlling page turning operations for content displayed on a display element of an electronic device is presented here. The method begins by displaying a current page of content on the display element. The method continues by capturing eye position data that indicates position of an eye of a user of the electronic device, analyzing the captured eye position data to detect an eye-related condition corresponding to a page turning command, and executing the page turning command to display a new page of content. | 06-19-2014 |
20140168055 | Method and system for the display of virtual image edits - Embodiments for a method and system for the display of virtual image edits are disclosed. Source/linked image(s) ( | 06-19-2014 |
20140168056 | ENABLING AUGMENTED REALITY USING EYE GAZE TRACKING - Methods and apparatus relating to enabling augmented reality applications using eye gaze tracking are disclosed. An exemplary method according to the disclosure includes displaying an image to a user of a scene viewable by the user, receiving information indicative of an eye gaze of the user, determining an area of interest within the image based on the eye gaze information, determining an image segment based on the area of interest, initiating an object recognition process on the image segment, and displaying results of the object recognition process. | 06-19-2014 |
20140168057 | GYRO AIDED TAP GESTURE DETECTION - Methods, systems, computer-readable media, and apparatuses for tap detection in a mobile device are presented. In some embodiments, the method may comprise storing, by a mobile device, a first data sample from an accelerometer sensor and a second data sample from a gyroscope sensor. Additionally, the method may comprise processing a plurality of data samples. The plurality of data samples can include the first data sample or the second data sample. Optionally, in one embodiment, the method may comprise suppressing a tap that has been classified as a false detection based on at least one of the plurality of data samples. Subsequently, the method may comprise determining an occurrence of a tap at a mobile device based on the results of the processing. | 06-19-2014 |
20140168058 | APPARATUS AND METHOD FOR RECOGNIZING INSTRUCTION USING VOICE AND GESTURE - An apparatus and method are provided that recognize an instruction using voice and gesture, decreasing the time spent on sound-model and language-model recognition by recognizing the initial sound of each syllable of an instruction using gesture recognition technology and recognizing the voice for the instruction based on the recognized initial sound of each syllable. | 06-19-2014 |
20140168059 | METHOD AND SYSTEM FOR RECOGNIZING GESTURE - A method and system for recognizing a gesture include capturing a hand image using an imaging device. In addition, a controller is configured to produce a template image which is symmetric with a primary captured hand image, and to match and compare each frame of the captured hand image to the template image while capturing the hand image. Using the matching information, the controller is further configured to recognize a motion of the hand gesture. | 06-19-2014 |
20140168060 | WRISTWATCH AND METHOD OF LIGHTING SCREEN THEREOF - A wristwatch includes a screen, a backlight module to light the screen, a power source, a breakover module, a storage module, a gesture identifying module, and a processor. The breakover module connects to the power source and the backlight module and controls the supply of power from the power source to the backlight module. The storage module stores relationships between gestures and functions corresponding to the gestures. The functions include turning on the screen and turning off the screen. The gesture identifying module identifies a hand gesture of a user and determines the function of the hand gesture. The processor connects to the breakover module, and controls the breakover module to break over according to the function of turning on the screen and to break off according to the function of turning off the screen. A method of lighting a screen of a wristwatch is also provided. | 06-19-2014 |
20140168061 | SYSTEM AND METHOD FOR MANIPULATING USER INTERFACE IN VEHICLE USING FINGER VALLEYS - A method for manipulating a user interface within a vehicle using finger valleys includes receiving, by a controller, a captured image and detecting finger valleys from the captured image. In addition, a hand gesture is recognized by the controller using the finger valleys, and a vehicle equipment operation that corresponds to the recognized hand gesture is selected. Accordingly, the passenger is able to manipulate a vehicle steering wheel with one hand and look forward while operating many in-vehicle electronic devices with simple motions of the other hand, thereby improving passengers' convenience and driving safety. | 06-19-2014 |
20140168062 | SYSTEMS AND METHODS FOR TRIGGERING ACTIONS BASED ON TOUCH-FREE GESTURE DETECTION - Systems, methods and non-transitory computer-readable media for triggering actions based on touch-free gesture detection are disclosed. The disclosed systems may include at least one processor. A processor may be configured to receive image information from an image sensor, detect in the image information a gesture performed by a user, detect a location of the gesture in the image information, access information associated with at least one control boundary, the control boundary relating to a physical dimension of a device in a field of view of the user, or a physical dimension of a body of the user as perceived by the image sensor, and cause an action associated with the detected gesture, the detected gesture location, and a relationship between the detected gesture location and the control boundary. | 06-19-2014 |
20140168063 | INFORMATION DISPLAY DEVICE, INFORMATION DISPLAY SYSTEM, AND NON-TRANSITORY COMPUTER-READABLE STORAGE MEDIUM - An information display device includes a device body, a display section that displays information, a motion detection sensor provided in the device body, an information acquiring section that acquires a plurality of types of information, an action detection section that detects a predetermined action of a user from output data of the sensor, and a display control section that selectively displays, in the display section, the plurality of types of information acquired by the information acquiring section in a set order every time the action detection section detects the predetermined action of the user. | 06-19-2014 |
20140168064 | SYSTEM AND METHOD FOR MANIPULATING USER INTERFACE BY 2D CAMERA - A method for manipulating a user interface by a 2D camera within a vehicle includes: receiving a captured image of a passenger and a captured image of the reflection of the passenger on a reflector; extracting real and virtual images of the passenger from the captured images, and recognizing a gesture by a correlation between the real and virtual images; and selecting a vehicle equipment operation corresponding to the recognized gesture. Accordingly, the passenger is able to manipulate the steering wheel with one hand and keep their eyes forward while controlling multiple in-vehicle electronic devices with simple motions with the other hand, thereby improving passengers' convenience and driving safety. | 06-19-2014 |
20140168065 | MOTION DETECTION SYSTEM - The present invention provides a motion detecting system, which includes a light source module, a plurality of image sensors and a control unit. The light source module illuminates at least one object. The image sensors respectively detect the object under the light emitted by the light source module to generate a plurality of detection results. The control unit is coupled to the image sensors, and generates a control command according to the detection results. | 06-19-2014 |
20140168066 | METHOD AND ELECTRONIC DEVICE FOR CONTROLLING DATA TRANSMISSION - A method and an electronic device for controlling data transmission are disclosed in the embodiment of the present invention. Data transmission may be controlled by detecting a first input operation and a second input operation; generating a first coordinate of an input point corresponding to the first input operation and a second coordinate of an input point corresponding to the second input operation; identifying the first coordinate and the second coordinate, and determining the coordinate corresponding to the first display region and the coordinate corresponding to the second display region; obtaining address information which corresponds to the coordinate corresponding to the first display region; determining file address information which corresponds to the coordinate corresponding to the second display region according to a correspondence between coordinate information and the file address information of the second electronic device; generating and sending a data transmission control instruction carrying the file address information. | 06-19-2014 |
20140168067 | ELECTRONIC DEVICE AND METHOD FOR CHARACTER INPUT - A method for character input using an electronic device includes recording movement displacements of the electronic device using a sensor when a character input function of the electronic device is invoked. The method determines movement coordinates based on a predetermined coordinate system according to the movement displacements. When the electronic device stops moving for a predetermined time duration, it is determined that one character has been inputted. A movement path is determined according to the movement coordinates and a recording sequence of the movement displacements, and a character corresponding to the movement path is generated and displayed. | 06-19-2014 |
20140168068 | SYSTEM AND METHOD FOR MANIPULATING USER INTERFACE USING WRIST ANGLE IN VEHICLE - A method and system of manipulating a user interface using a wrist angle that include receiving, by a controller, an image captured by an image photographing unit and detecting shapes of arms and hands of the passenger from the captured image to calculate the wrist angle. In addition, the method includes recognizing, by the controller, wrist gesture information that corresponds to a change in the calculated wrist angle and selecting a vehicle device manipulation that corresponds to the recognized wrist gesture information. | 06-19-2014 |
20140168069 | ELECTRONIC DEVICE AND LIGHT PAINTING METHOD FOR CHARACTER INPUT - A light painting method for character input using an electronic device includes adjusting an image capturing device according to preset capturing parameters. The image capture device is enabled to capture a light painting image of a lighting device at each predetermined time interval. The light painting image is converted to a reflection image when the light painting image is a mirror image. The method recognizes a movement path of light in the reflection image, and generates a character corresponding to the movement path of the light. | 06-19-2014 |
20140168070 | METHOD AND APPARATUS FOR DISPLAYING DATA - According to various embodiments, an apparatus is provided, including: a first display and a second display and a control circuit operatively coupled to the first display and the second display. The control circuit is configured to perform operations including determining a relative position between the first display and the second display, and displaying information for presenting to a user via at least one of the first display or the second display, based on the relative position. Other embodiments are possible. | 06-19-2014 |
20140168071 | Mobile Presence Detection - Novel tools and techniques for collecting and using presence information. In accordance with some techniques, a presence detection device (“PDD”) at a customer premises and/or another computer (such as a control server) can identify and/or authenticate a user. Once identified and/or authenticated, the user's profiles and/or media content may be sent to the PDD, and/or access to the user's profiles and/or media content may be provided to the PDD. | 06-19-2014 |
20140168072 | PROCESSING METHOD AND ELECTRONIC DEVICE - A processing method and an electronic device are provided. The processing method includes: acquiring a first image which contains an instruction object by the image acquisition unit, wherein the instruction object is directional; analyzing the first image and acquiring first parameter information represented by the instruction object, where the first parameter information indicates a direction of the instruction object; generating a first instruction based on the first parameter information; and executing, in response to the first instruction, an operation corresponding to the instruction object. | 06-19-2014 |
20140168073 | Methods and Systems for Haptic Rendering and Creating Virtual Fixtures from Point Clouds - Methods, articles of manufacture, and devices related to generating haptic feedback for point clouds are provided. A computing device receives depth data about an environment. The computing device generates a point cloud from the depth data. The computing device determines a haptic interface point (HIP). The computing device determines a force vector between the HIP and the point cloud. The computing device sends an indication of haptic feedback based on the force vector. | 06-19-2014 |
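One simple way to compute a force vector between a haptic interface point and a point cloud — an assumed nearest-point spring model for illustration, not the patent's actual rendering algorithm — is to push the HIP away from its nearest cloud point whenever it falls inside an interaction radius:

```python
import math

def force_vector(hip, cloud, radius=1.0, stiffness=10.0):
    """Repulsive spring force from the nearest cloud point within `radius`.
    Returns an (fx, fy, fz) tuple; zero force if no point is close enough.
    (Assumed model: force grows linearly with penetration depth.)"""
    nearest, best = None, float("inf")
    for p in cloud:
        d = math.dist(hip, p)  # Euclidean distance (Python 3.8+)
        if d < best:
            nearest, best = p, d
    if nearest is None or best >= radius or best == 0.0:
        return (0.0, 0.0, 0.0)
    # Direction from the nearest point toward the HIP, scaled by penetration.
    scale = stiffness * (radius - best) / best
    return tuple(scale * (h - n) for h, n in zip(hip, nearest))

print(force_vector((0.5, 0.0, 0.0), [(0.0, 0.0, 0.0), (3.0, 0.0, 0.0)]))
```

In practice a haptic loop would run this at around 1 kHz and would handle the degenerate case of the HIP coinciding with a cloud point more carefully than the zero-force fallback used here.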
20140168074 | METHOD AND TERMINAL DEVICE FOR CONTROLLING CONTENT BY SENSING HEAD GESTURE AND HAND GESTURE, AND COMPUTER-READABLE RECORDING MEDIUM - The present invention relates to a method, a terminal, and a computer-readable medium for controlling content by detecting head and hand gestures. The method for controlling a content by detecting hand gestures and head gestures, includes steps of: (a) detecting a head region and a hand region of a user by analyzing an input image with an object detection technology; (b) tracking the gestures in the head region and the hand region of the user by using a computer vision technology; and (c) allowing the content to be controlled by referring to gestures-combining information including data on the tracked gestures in the head region and the hand region. | 06-19-2014 |
20140168075 | Method to Control Perspective for a Camera-Controlled Computer - Systems, methods and computer readable media are disclosed for controlling perspective of a camera-controlled computer. A capture device captures user gestures and sends corresponding data to a recognizer engine. The recognizer engine analyzes the data with a plurality of filters, each filter corresponding to a gesture. Based on the output of those filters, a perspective control is determined, and a display device displays a new perspective corresponding to the perspective control. | 06-19-2014 |
20140176415 | DYNAMICALLY GENERATING HAPTIC EFFECTS FROM AUDIO DATA - Haptic effects are dynamically generated for content presentation on a device through analysis of the content. During content playback, audio data for the content may be analyzed to determine low frequency audio data. The low frequency audio data is mapped from a low frequency range to a haptic control frequency range of one or more haptic actuators included in the device. This mapping may be used to generate a control signal to drive the one or more haptic actuators. The haptic effects and the content may be synchronized to one another during the presentation of the content on the device. The haptic actuator control signal may be amplified proportionally to the amplitude of the low frequency audio data. | 06-26-2014 |
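The mapping step in the entry above — taking low-frequency audio content into a haptic actuator's control frequency range — could be sketched as a linear rescaling between bands. Both band ranges below are assumptions for illustration; the patent does not specify them:

```python
def map_to_haptic(freq_hz, audio_band=(20.0, 120.0), haptic_band=(50.0, 200.0)):
    """Linearly map a low-frequency audio component into an actuator's
    control frequency range; input outside the audio band is clamped.
    Band edges are assumed values, not taken from the patent."""
    a_lo, a_hi = audio_band
    h_lo, h_hi = haptic_band
    f = min(max(freq_hz, a_lo), a_hi)        # clamp to the audio band
    t = (f - a_lo) / (a_hi - a_lo)           # normalized position in band
    return h_lo + t * (h_hi - h_lo)

print(map_to_haptic(70.0))  # midpoint of audio band -> midpoint of haptic band
```

The abstract's amplitude-proportional drive signal would be a second, independent scaling applied to the actuator control signal's magnitude.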
20140176416 | CAPTURING PHOTOS WITHOUT A CAMERA - Implementations generally relate to capturing photos without a camera. In some implementations, a method includes determining a gaze of a user using a device that is operable to track the gaze. The method also includes receiving an indication from the user that the user desires a photo corresponding to the gaze. The method also includes providing the photo to the user based on the indication. | 06-26-2014 |
20140176417 | WEARABLE PROJECTOR FOR PORTABLE DISPLAY - Described herein are technologies related to a wearable projector to project images, information, multimedia, etc. in a portable display. More particularly, the wearable projector includes a system on chip (SOC) microprocessor that is configured to project the images, information, multimedia, etc. to different types of portable display such as flexible transparent plastic, glass, paper, and the like. | 06-26-2014 |
20140176418 | DISPLAY OF SEPARATE COMPUTER VISION BASED POSE AND INERTIAL SENSOR BASED POSE - A mobile device determines a vision based pose using images captured by a camera and determines a sensor based pose using data from inertial sensors, such as accelerometers and gyroscopes. The vision based pose and sensor based pose are used separately in a visualization application, which displays separate graphics for the different poses. For example, the visualization application may be used to calibrate the inertial sensors, where the visualization application displays a graphic based on the vision based pose and a graphic based on the sensor based pose and prompts a user to move the mobile device in a specific direction with the displayed graphics to accelerate convergence of the calibration of the inertial sensors. Alternatively, the visualization application may be a motion based game or a photography application that displays separate graphics using the vision based pose and the sensor based pose. | 06-26-2014 |
20140176419 | METHOD AND APPARATUS FOR SHARING CONTENT - A method, apparatus, and computer program product are provided to facilitate sharing of content between various computing devices. In the context of a method, an image of an external apparatus is received that includes information presented upon a display of the external apparatus. The method may also cause a content request to be provided that requests content upon which the information presented upon the display of the external apparatus is based. In response to the request, the method may receive the content upon which the information presented upon the display of the external apparatus is based. | 06-26-2014 |
20140176420 | Laser Beam Based Gesture Control Interface for Mobile Devices - A method of receiving control instructions for a device, comprising receiving two or more video frames, analyzing the video frames for at least one foreground image and at least one background image, evaluating each video frame for a two dimensional aspect comprising a first area having a relatively higher channel intensity value in one or more channels than the channel intensity values of a second area, recording coordinates associated with the aspect in each of the video frames, evaluating the coordinates to determine a motion trajectory, and matching the motion trajectory to a prerecorded motion trajectory in a matching table, wherein the matching table contains associations of motion trajectories and control instructions, and obtaining the control instruction associated with the prerecorded motion trajectory. | 06-26-2014 |
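The matching-table step described above could be sketched as comparing a recorded motion trajectory against prerecorded templates. The distance metric, table contents, and threshold below are all assumptions for illustration; the patent does not specify them:

```python
import math

# Assumed matching table: prerecorded trajectories -> control instructions.
MATCHING_TABLE = {
    ((0, 0), (1, 0), (2, 0)): "next_page",
    ((0, 0), (0, 1), (0, 2)): "volume_up",
}

def trajectory_distance(a, b):
    """Mean point-to-point distance between equal-length trajectories."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def match_instruction(trajectory, max_dist=0.5):
    """Return the instruction for the closest template, or None if no
    template is within `max_dist` (threshold is an assumed value)."""
    best_instr, best_d = None, float("inf")
    for template, instr in MATCHING_TABLE.items():
        d = trajectory_distance(trajectory, template)
        if d < best_d:
            best_instr, best_d = instr, d
    return best_instr if best_d <= max_dist else None

print(match_instruction([(0.1, 0.0), (1.0, 0.1), (2.0, 0.0)]))
```

A production matcher would first resample trajectories to a common length (e.g. with dynamic time warping) rather than assuming equal-length point sequences as this sketch does.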
20140176421 | DISPLAYING METHOD FOR FLEXIBLE DISPLAY DEVICE AND FLEXIBLE DISPLAY DEVICE USING THE SAME - A displaying method of a flexible display device and the flexible display device using the method are provided. The displaying method includes: displaying at least one object on a displaying area of the flexible display device; detecting a display mode of the flexible display device and generating display mode information; determining if at least one second object in the at least one object is displayed on a separation line according to the display mode information; and rearranging the at least one object in the displaying area according to the display mode information if the at least one second object is displayed on the separation line. Accordingly, the arranging of the at least one object fits the needs of users. | 06-26-2014 |
20140176422 | BIOMETRIC MONITORING DEVICE WITH WRIST-MOTION TRIGGERED DISPLAY - A biometric monitoring device with a display is provided. The display may, in response to receiving page advance requests from a user, advance through a plurality of different data display pages, at least some of which show aspects of biometric data recorded by the device. The biometric monitoring device may also, based on the biometric data, modify the sequential display order of the data display pages. In some implementations, a biometric monitoring device integrated into a wristband may be configured to turn a display of the biometric monitoring device on and display the time in response to biometric sensors of the biometric monitoring device detecting motion of the wearer's forearm consistent with moving the forearm into a watch-viewing position. | 06-26-2014 |
20140176423 | SEAT LAYOUT DISPLAY APPARATUS, SEAT LAYOUT DISPLAY METHOD, AND PROGRAM THEREOF - A display apparatus includes: a location acquisition unit that acquires an image location of the user symbol designated by an operation of the user in the planar image; a direction setting unit that specifies, from among a plurality of areas into which the planar image is divided and with which a predetermined direction is associated in advance, the area to which the image location of the user symbol belongs, and sets, as a direction of eyes of the user, the direction associated in advance with the specified area; and a display control unit that displays the planar image in which the seat symbols and the user symbol are arranged so as to be associated with a seat layout in a room that is visually recognized when the user views in the direction of eyes. | 06-26-2014 |
20140176424 | ELECTRONIC DEVICE AND METHOD FOR ADJUSTING DISPLAY SCREEN - In a method for adjusting a display screen of an electronic device, the method obtains a current image and a previous image of a user captured by an image capturing device, and detects a first face area from the current image and a second face area from the previous image. The method further rotates the display screen according to the movements of the user's face, and stops rotating the display screen when rotation angles of the display screen are equal to movement angles of the user's face. | 06-26-2014 |
20140176425 | SYSTEM AND METHOD FOR IDENTIFYING POSITION OF HEAD-UP DISPLAY AREA - A system and method for identifying the position of a head-up display (HUD) area are provided. The system in which a HUD image is displayed on a front glass of a vehicle includes a controller that is configured to determine a direction in which the HUD image moves in response to an input signal input by a driver. In addition identification information used to identify the position of the HUD area is processed according to the movement of the HUD image and the identification information is displayed on a display unit. | 06-26-2014 |
20140176426 | DISPLAY TERMINAL APPARATUS, INFORMATION DISPLAY SYSTEM, INFORMATION DISPLAY CONTROL METHOD AND STORAGE MEDIUM STORING PROGRAM THEREOF - Disclosed is a display terminal apparatus including a wireless communication unit for performing wireless communication with a set external device, a display unit, a display information obtaining unit which obtains display information including a display content to be displayed in the display unit, a display period setting of the display content and a changing condition for changing the display period through the wireless communication unit, a display control unit which controls the display period of the display content in the display unit on the basis of the display period setting, a determination unit which determines whether a state corresponds to the changing condition, and a display period changing unit which changes the display period on the basis of a determination result obtained in the determination unit. | 06-26-2014 |
20140176427 | ELECTRONIC DEVICE AND METHOD FOR ADJUSTING DISPLAY SCREEN - In a method for adjusting a display screen of an electronic device, the method obtains a current image of a user captured by an image capturing device installed on the display screen, detects a gesture of the user from the current image, and determines a gesture template matched with the detected gesture. The method further rotates the display screen according to a preset operation corresponding to the matched gesture template, and stops rotating the display screen when a preset gesture for stopping rotating the display screen is detected from the current image. | 06-26-2014 |
20140176428 | FLEXIBLE ELECTRONIC DEVICE AND METHOD FOR CONTROLLING FLEXIBLE ELECTRONIC DEVICE - In a method for controlling a data page displayed on a flexible display of a flexible electronic device, a plurality of gravity sensors located at each side of the flexible electronic device detects current position data of corresponding sides of the flexible electronic device. A curved angle and a curved direction of each side of the flexible electronic device is calculated based on the current position and a preset position of each side. An operation on the flexible electronic device is determined according to the curved angle and the curved direction of each side of the flexible electronic device. An instruction associated with each operation is executed to control movement of the data page displayed on the flexible display. | 06-26-2014 |
20140176429 | FLEXIBLE ELECTRONIC DEVICE AND METHOD FOR CONTROLLING FLEXIBLE ELECTRONIC DEVICE - In a method for controlling a flexible electronic device to sleep or restart, current position data of two gravity sensors located at opposite sides of a flexible display of the flexible electronic device is received. A distance between the two gravity sensors is detected by a proximity sensor located at one of the opposite sides of the flexible display. A control unit of the flexible electronic device controls the flexible electronic device to sleep or restart according to the current position data and the distance. | 06-26-2014 |
20140176430 | INTELLIGENT SWITCHING SYSTEM, ELECTRONIC DEVICE THEREOF, AND METHOD THEREOF - A method for intelligent switching is applied to an electronic device. The electronic device includes a camera, a storage unit, a processor, and a wireless communication unit. The camera captures a picture each time a predetermined time interval elapses. The storage unit stores a predetermined viewing range of values of viewing angles of the eyes of a user relating to the user viewing a display screen of the electronic device. The method includes, first, constructing a picture according to the input signals. Second, a connection signal is generated when the constructed picture is determined to contain a facial image of the user and the viewing angle of the eyes of the user in the facial image falls within the predetermined range of values. Lastly, a connection is established between the electronic device and the wireless device. An electronic device and a system for intelligent switching are also provided. | 06-26-2014 |
20140176431 | REPRODUCER, DIGITAL CAMERA, SLIDE SHOW REPRODUCTION METHOD, PROGRAM, IMAGE DISPLAY APPARATUS, IMAGE DISPLAY METHOD, IMAGE REPRODUCTION METHOD, AND IMAGE DISPLAY PROGRAM - A reproducer is provided with members shown below. Namely, the reproducer is provided with a storage part for storing images, a vibration detection part for detecting a vibration operation by a user, a switching instruction part for instructing switching of reproduction content in slide show reproduction based on a detection result from the vibration detection part, a synthesis processing part for performing creation processing of slide show images, which are images for the slide show reproduction, based on instructions of the switching instruction part, and a slide show control part for performing the slide show reproduction of the images stored in the storage part and the slide show images. | 06-26-2014 |
20140176432 | APPARATUS AND METHOD FOR PROVIDING TACTILE SENSATION IN COOPERATION WITH DISPLAY DEVICE - An apparatus and a method for providing a tactile sensation in cooperation with a display device are disclosed. The disclosed apparatus includes: a display-device communication unit configured to receive information on a virtual image from the display device; a determiner unit configured to determine whether or not a user attempts an interaction with the virtual image by using an image of a user body; and a tactile-sensation provider unit configured to provide a tactile sensation to a preset body part of the user based on a determination result of the determiner unit. The disclosed apparatus can enable the user to interact with a virtual image with verisimilitude. | 06-26-2014 |
20140184486 | MOBILE TERMINAL AND CONTROLLING METHOD THEREOF - Disclosed are a mobile terminal and a controlling method thereof, which are able to simply input a real-time external image to an input field as data. Specifically, a user may input a character or picture captured from the real-time image inputted from the outside via the camera, and the real-time image may be captured without switching away from the screen displaying the input field, so as to allow the user to intuitively recognize which input field an object provided in the image is inputted to. | 07-03-2014 |
20140184487 | MANIPULATING VIRTUAL OBJECTS DISPLAYED BY A DISPLAY DEVICE THROUGH A PORTABLE DEVICE - A virtual object in a virtual scene displayed on a display device disposed in a structure is capable of being manipulated by a portable device including a wireless communication unit, an image sensing unit, a display unit, and an input unit. The image sensing unit produces snapshot information corresponding to the virtual scene. The display unit displays snapshot image(s) according to the snapshot information, and displays manipulation option(s) according to manipulation item information of the virtual object provided by the display device. The input unit produces selection parameter(s) in response to a selection operation corresponding to the snapshot image(s), and produces manipulation parameter(s) in response to user input with respect to the manipulation option(s). The wireless communication unit transmits the manipulation parameter(s) to the display device to enable the display device to manipulate the virtual object according to the manipulation parameter(s). | 07-03-2014 |
20140184488 | APPARATUS AND METHOD FOR INTERLOCKING DISPLAY - A display interlocking system and method that control a screen output to a display via interlocking between an interlocking apparatus and a display control apparatus. The interlocking apparatus forms an output screen and transmits the output screen to the display control apparatus. In addition, the interlocking apparatus inserts a code corresponding to an event into the output screen when the event is generated, and transmits the output screen to the display control apparatus. When the code is inserted into the output screen, the display control apparatus causes the output screen to be output on the display connected thereto, or to be interrupted, based on a driver's attention dispersion state. | 07-03-2014 |
20140184489 | OPEN ANGLE DETECTION AND PROCESSING APPARATUS AND METHOD - Embodiments of techniques and apparatus for open angle detection and processing are described. In embodiments, an apparatus may comprise a first panel having a display, a second panel movably coupled with the first panel. The first and second panels variably define an angle between these two panels. One or more sensors may be disposed in the apparatus and configured to detect an angle change event of the variable angle between two panels. The angle changing information may be used by an application to vary output of the application onto the display. Other embodiments may be described and claimed. | 07-03-2014 |
20140184490 | METHOD FOR INTERACTING WITH FLEXIBLE DEVICE AND USER TERMINAL THEREOF - A method for interacting with a flexible device and a user terminal thereof includes detecting a physical transformation of the flexible device, and performing an interaction, in association with the flexible device, corresponding to the physical transformation. | 07-03-2014 |
20140184491 | SYSTEM AND METHOD FOR PROVIDING USER INTERFACE USING AN OPTICAL SCANNING - A system provides a user interface using an optical scan, and the system includes a scan light and an optical sensor that detects whether the light radiated to an object in a vehicle from the scan light is dispersed. A processor controls the scan light to radiate light for a scan to a predetermined position at a predetermined time, and, when the optical sensor detects dispersion of the light, estimates the position of the dispersed light and outputs a corresponding signal by comparing the detection time of the light with the predetermined time and the radiation position of the scan light. The processor recognizes the shape or the motion of the object in the vehicle based on the signal, outputs a corresponding signal, and operates the devices in the vehicle based on the signal. | 07-03-2014 |
20140184492 | FEATURE CALCULATION DEVICE, FEATURE CALCULATION METHOD, AND COMPUTER PROGRAM PRODUCT - According to an embodiment, a feature calculation device includes an obtaining unit, a first calculator, and a second calculator. The obtaining unit is configured to obtain a point sequence group comprising a set of point sequences in which a sequence of a plurality of points is fixed. The first calculator is configured to, for each of the point sequences in the point sequence group, calculate a plurality of values related to a curvature of a shape of the point sequence, with a reference point of the point sequence as a reference, and calculate feature quantities based on the plurality of calculated values. The second calculator is configured to calculate a histogram representing a distribution of the feature quantities of each of the point sequences. | 07-03-2014 |
20140184493 | ELECTRONIC DEVICE AND GESTURE CONTROL METHOD FOR ELECTRONIC DEVICE - A control method is applied to an electronic device to control the electronic device by a user's gestures in real time. When the user's gestures change from a first predetermined gesture to a second predetermined gesture, the control method controls the electronic device to perform a corresponding function. | 07-03-2014 |
20140184494 | User Centric Interface for Interaction with Visual Display that Recognizes User Intentions - Systems, methods, means and computer program products for identifying graphical objects are disclosed. Certain systems, methods, means and computer program products may identify a graphical object based on geometric relationships between two or more user body parts and the graphical object. Certain systems, methods, means and computer program products identify a graphical object based on analysis of pictorial images depicting positions of user body parts relative to each other. | 07-03-2014 |
20140184495 | Portable Device Input by Configurable Patterns of Motion - A user-defined motion data pattern is used as input to a handheld computing device. The user-defined motion data pattern can be selectively associated with a corresponding input to the handheld computing device. The input may correspond to a particular command to perform a selected function that is executable by the portable computer device. The handheld computer device can acknowledge successful input of a motion and/or motion data pattern by providing tactile, audible and/or visual feedback. | 07-03-2014 |
20140184496 | EXTRAMISSIVE SPATIAL IMAGING DIGITAL EYE GLASS APPARATUSES, METHODS AND SYSTEMS FOR VIRTUAL OR AUGMEDIATED VISION, MANIPULATION, CREATION, OR INTERACTION WITH OBJECTS, MATERIALS, OR OTHER ENTITIES - A sensing and display apparatus, comprising: a first phenomenon interface configured to operatively interface with a first augmediated-reality space, and a second phenomenon interface configured to operatively interface with a second augmediated-reality space, is implemented as an extramissive spatial imaging digital eye glass. | 07-03-2014 |
20140184497 | INTERACTIVITY MODEL FOR SHARED FEEDBACK ON MOBILE DEVICES - A system that produces a dynamic haptic effect and generates a drive signal that includes a gesture signal and a real or virtual device sensor signal. The haptic effect is modified dynamically based on both the gesture signal and the real or virtual device sensor signal such as from an accelerometer or gyroscope, or by a signal created from processing data such as still images, video or sound. The haptic effect may optionally be modified dynamically by using the gesture signal and the real or virtual device sensor signal and a physical model, or may optionally be applied concurrently to multiple devices which are connected via a communication link. The haptic effect may optionally be encoded into a data file on a first device. The data file is then communicated to a second device and the haptic effect is read from the data file and applied to the second device. | 07-03-2014 |
20140184498 | METHOD AND APPARATUS FOR MOTION RECOGNITION - A motion recognizing apparatus and method are provided. According to an aspect, a motion recognizing apparatus may include: an optical sensor configured to sense at least a portion of a subject where a motion occurs and to output one or more events in response thereto; a motion tracing unit configured to trace a motion locus of the portion where the motion occurs based on the one or more outputted events; and a motion pattern determining unit configured to determine a motion pattern of the portion where the motion occurs based on the traced motion locus. | 07-03-2014 |
20140184499 | REMOTE MANIPULATION DEVICE AND METHOD USING A VIRTUAL TOUCH OF A THREE-DIMENSIONALLY MODELED ELECTRONIC DEVICE - Disclosed is a remote control apparatus and method using a virtual touch of a three-dimensionally modeled electronic device. The remote control apparatus includes a 3D coordinate calculation unit, an electronic device detecting unit, a user detecting unit, an electronic device driving control unit, and an electronic device information database. The remote control apparatus enables the user to remotely control an operation of an electronic device using a virtual touch through a motion or gesture of the user's finger. | 07-03-2014 |
20140191938 | VIRTUAL SENSOR SYSTEMS AND METHODS - Methods and systems for detecting activation of a virtual sensor in a scene are proposed, wherein the virtual sensor comprises data representing a geometric form, data representing a position in the scene and data representing one or several trigger conditions. A data representation of the scene is captured, and a determination of whether a virtual sensor trigger condition is fulfilled is made, based on an analysis of the captured data representation in an area corresponding to the geometric form and position of the virtual sensor. | 07-10-2014 |
20140191939 | USING NONVERBAL COMMUNICATION IN DETERMINING ACTIONS - Nonverbal communication is used when determining an action to perform in response to received user input. The received input includes direct input (e.g. speech, text, gestures) and indirect input (e.g. nonverbal communication). The nonverbal communication includes cues such as body language, facial expressions, breathing rate, heart rate, as well as vocal cues (e.g. prosodic and acoustic cues) and the like. Different nonverbal communication cues are monitored such that performed actions are personalized. A direct input specifying an action to perform (e.g. “perform action 1”) may be adjusted based on one or more indirect inputs (e.g. nonverbal cues) received. Another action may also be performed in response to the indirect inputs. A profile may be associated with the user such that the responses provided by the system are determined using nonverbal cues that are associated with the user. | 07-10-2014 |
20140191940 | VEHICLE DISPLAY ARRANGEMENT AND VEHICLE COMPRISING A VEHICLE DISPLAY ARRANGEMENT - A display arrangement for a vehicle is provided. The arrangement may include a display for displaying information to a vehicle operator, and a sensor for monitoring a head position of a vehicle operator and generating input signals indicative of the vehicle operator head position. The arrangement may also include a control unit arranged to automatically initiate, based on an interpretation of the input signals, at least one of a dimension change of graphical display objects or a change of the relation between the graphical display objects following a monitored predefined change of the vehicle operator head position in a first direction, such that graphical display objects previously displayed as partly hidden are arranged to be displayed as visible. | 07-10-2014 |
20140191941 | PORTABLE USER INPUT SYSTEM OPERABLE IN TWO MODES - A portable user interface device receives user inputs in two modes. The user interface device includes a central portion attached to two side portions. Relative rotation between the central portion and the side portions converts the user interface device between the two modes. In the first mode, the user interface device is suspended on the front of a user's torso. In the second mode, the user interface device may be placed on a substantially flat horizontal surface. | 07-10-2014 |
20140191942 | DISPLAY DEVICE AND CONTROL METHOD THEREOF - A head-mounted display device includes: a display section that outputs an image light to overlap with an outside light and causes an image to be viewed; and an input detecting section that detects an input operation in a case where a predetermined light is determined in a range corresponding to the image. | 07-10-2014 |
20140191943 | ELECTRONIC APPARATUS AND METHOD FOR CONTROLLING ELECTRONIC APPARATUS THEREOF - An electronic apparatus is provided. The electronic apparatus includes a motion input unit configured to receive a user motion, a display configured to display an object controlled by the user motion received by the motion input unit, and a controller configured to change a display state of the object in response to the input user motion satisfying a predetermined condition with respect to a motion recognition scope. | 07-10-2014 |
20140191944 | LIVING BODY INFORMATION DETECTION APPARATUS AND LIVING BODY INFORMATION DETECTION PROGRAM - Provided are a living body information detection apparatus and a living body information detection program capable of allowing a user to easily recognize information based on acquired living body information. The living body information detection apparatus includes an acquisition unit that acquires living body information obtained by detecting a living body signal and a control unit that moves a display position of an image associated with the living body information on a display unit based on the magnitude of a value of the living body information obtained by the acquisition unit. | 07-10-2014 |
20140191945 | ELECTRONIC DEVICE AND METHOD FOR ADJUSTING DISPLAY SCREEN - A method for adjusting a display screen of an electronic device receives analog audio signals of a user, transforms the analog audio signals into digital audio signals, detects a first control command including first rotation directions and first rotation angles of the display screen from the digital audio signals, and rotates the display screen according to the first rotation directions and the first rotation angles. The method further obtains second rotation angles of the display screen detected by a gravity sensor of the electronic device, and stops rotating the display screen when the second rotation angles are equal to the first rotation angles. | 07-10-2014 |
20140191946 | HEAD MOUNTED DISPLAY PROVIDING EYE GAZE CALIBRATION AND CONTROL METHOD THEREOF - A control method of a head mounted display (HMD) is disclosed. The control method of the HMD includes detecting a first route of a first moving object and a second route of a second moving object in front of the HMD, detecting a third route along which user's eye gaze moves, setting the first moving object as a reference object if the detected first route is substantially identical with the detected third route, setting the second moving object as the reference object if the detected second route is substantially identical with the detected third route, and performing an eye gaze calibration based on a route of the set reference object and the detected third route. | 07-10-2014 |
20140191947 | Using Natural Movements of a Hand-Held Device to Manipulate Digital Content - A mobile device, such as a smart phone, is provided with a camera. Digital content displayed on the display screen of the mobile device may be manipulated in response to natural movements of the mobile device by a user. Motion of the mobile device is detected relative to a nearby textured surface by analyzing images of the textured surface. The displayed digital content is manipulated in response to the detected motion of the mobile device. | 07-10-2014 |
20140191948 | APPARATUS AND METHOD FOR PROVIDING CONTROL SERVICE USING HEAD TRACKING TECHNOLOGY IN ELECTRONIC DEVICE - An apparatus and a method for controlling an electronic device using head tracking technology are provided. The method includes detecting face motion from an image captured through a camera of the electronic device and executing a control service corresponding to the detected face motion information. | 07-10-2014 |
20140191949 | DISPLAY APPARATUS AND METHOD OF CONTROLLING A DISPLAY APPARATUS IN A VOICE RECOGNITION SYSTEM - A display method and apparatus for controlling a voice recognition system are provided. When a user's voice for controlling the display apparatus is input, the method transmits the user's voice to the interactive server while determining whether the user's voice is a pre-stored command in the display apparatus. In response to the user's voice not being a pre-stored command in the display apparatus, and to the interactive server transmitting both control information corresponding to the user's voice and first guide information guiding a pre-stored command capable of performing the same function as the user's voice, the method performs the function of the display apparatus according to the control information transmitted from the interactive server and displays the first guide information. | 07-10-2014 |
20140191950 | DISPLAY DEVICE FOR NETWORK PAIRING AND NETWORK PAIRING METHOD - A flexible display device for network pairing and a network pairing method are provided. The flexible display device includes an input sensor configured to sense two or more input events generated by an input means, and a controller configured to analyze device information about two or more flexible display devices in response to the sensed input events and, if determining that the sensed input events are for network pairing, to perform network pairing between the two or more flexible display devices. | 07-10-2014 |
20140191951 | Image-Based Object Tracking System and Image-Based Object Tracking Method - An image-based object tracking system and an image-based object tracking method are provided. The image-based object tracking system includes an object, a camera, a computing device, and a display device. A color stripe is disposed on the surface of the object. The color stripe divides the surface of the object into a first section and a second section. The camera is configured to capture real-time images of the object. Further, the object tracking algorithm is stored in the computing device. The display device includes a display screen and is electrically connected to the computing device, and the display screen is configured to display the real-time images of the object. By using the image-based object tracking system provided in the invention, the object can be tracked accurately and efficiently without interference with the image of the background. | 07-10-2014 |
20140191952 | ELECTRONIC DEVICE WITH WRAP AROUND DISPLAY - A consumer electronic product includes at least a transparent housing and a flexible display assembly enclosed within the transparent housing. In the described embodiment, the flexible display assembly is configured to present visual content at any portion of the transparent housing. | 07-10-2014 |
20140191953 | INFORMATION PROCESSING APPARATUS, CONTROL METHOD FOR USE THEREIN, AND COMPUTER PROGRAM - Disclosed herein is an information processing apparatus including: operation accepting means for accepting an operation input; image storing means for storing a plurality of images; management information storing means for storing management information corresponding to the plurality of images stored in the image storing means; image drawing means for drawing, in a display area including at least an image display area, at least one of the images stored in the image storing means onto the image display area in a predetermined sequence; and controlling means for, when an operation input commanding selection of an image included in the image display area has been accepted by the operation accepting means, controlling recording of the selection of the image to management information stored in the management information storing means in correspondence with the image. | 07-10-2014 |
20140191954 | Gesture Based User Interface Supporting Preexisting Symbols - A motion controlled handheld device includes a display having a viewable surface and operable to generate an image and a gesture database maintaining a plurality of gestures. Each gesture is defined by a motion of the device with respect to a first position of the device. The gestures comprise symbol gestures each corresponding to a character from a preexisting character set. The device includes an application database maintaining at least one application and a gesture mapping database comprising a gesture input map for the application. The gesture input map comprises mappings of the symbol gestures to corresponding inputs for the application. The device includes a motion detection module operable to detect motion of the handheld device within three dimensions and to identify components of the motion in relation to the viewable surface. The device also includes a control module operable to load the application, to track movement of the handheld device using the motion detection module, to compare the tracked movement against the symbol gestures to identify a matching symbol gesture, to identify, using the gesture input map, the corresponding input mapped to the matching symbol gesture, and to provide the corresponding input to the application. | 07-10-2014 |
20140191955 | EFFICIENT GESTURE PROCESSING - Embodiments of the invention describe a system to efficiently execute gesture recognition algorithms. Embodiments of the invention describe a power efficient staged gesture recognition pipeline including multimodal interaction detection, context based optimized recognition, and context based optimized training and continuous learning. Embodiments of the invention further describe a system to accommodate many types of algorithms depending on the type of gesture that is needed in any particular situation. Examples of recognition algorithms include but are not limited to, HMM for complex dynamic gestures (e.g. write a number in the air), Decision Trees (DT) for static poses, peak detection for coarse shake/whack gestures or inertial methods (INS) for pitch/roll detection. | 07-10-2014 |
20140191956 | FLEXIBLE DISPLAY SCREEN AND METHOD AND APPARATUS FOR CONTROLLING THE SAME - A method for controlling a flexible display screen having a first curvature is provided. The method comprises: detecting target information to obtain a detection result ( | 07-10-2014 |
20140198024 | SYSTEM AND METHOD FOR DETECTING THREE DIMENSIONAL GESTURES TO INITIATE AND COMPLETE THE TRANSFER OF APPLICATION DATA BETWEEN NETWORKED DEVICES - An apparatus and method for detecting a three-dimensional gesture are provided. The method includes detecting, by at least one three dimensional motion sensing input device embedded in a network having a plurality of interconnected hardware, a three-dimensional gesture of a user, selecting, based on the detected gesture, application data corresponding to an application being executed, stored or displayed on a first device in the network to be transmitted to a second device in the network, transmitting the selected application data to hardware and software associated with the second device, and performing at least one of executing, storing or displaying the selected application data on the second device, wherein the at least one three dimensional motion sensing input device comprises gesture detection hardware and software. | 07-17-2014 |
20140198025 | SYSTEM AND METHOD FOR DISCERNING COMPLEX GESTURES USING AN ARRAY OF OPTICAL SENSORS - A method for gesture determination (e.g., discerning complex gestures) via an electronic system (e.g., a gesture sensing system) including an array of optical sensors is described herein. The method includes detecting a plurality of sub-gestures (e.g., simple gestures provided by a target located proximate to the system) via the array of optical sensors. The sensors generate signals based upon the detected (e.g., received) sub-gestures and transmit the signals to a processor of the gesture sensing system. The processor processes the signals to obtain data associated with the sub-gestures and analyzes the sub-gesture data to determine if the sub-gestures collectively constitute a gesture (e.g., complex gesture). When the analyzing indicates that the sub-gestures collectively constitute a complex gesture, the gesture sensing system detects the complex gesture. | 07-17-2014 |
20140198026 | DEVICE FOR TRANSMITTING AND RECEIVING DATA USING EARPHONE AND METHOD FOR CONTROLLING THE SAME - According to an embodiment, a method for controlling a portable device connected to an earphone having a coil includes detecting removal of one of left and right units of the earphone worn by a user, generating an electromagnetic pattern corresponding to an Identifier(ID) of the portable device at the removed unit of the earphone, and receiving data from the external device in correspondence with the ID of the portable device. According to an embodiment, a method for controlling a display device includes displaying a Graphic User Interface on a display unit, sensing an earphone within a detection area of the display unit, sensing an electromagnetic pattern of the earphone within a detection area of the display unit, acquiring an ID of an external device connected to the earphone from the sensed electromagnetic pattern, and transmitting data to the external device identified by the acquired ID of the external device. | 07-17-2014 |
20140198027 | Media Distribution System - Systems, devices, and methods for delivering and managing media whereby a first media element contains multiple media components and a combination of user activity and time are necessary to unlock a subset of the multiple media components. In one embodiment, the user activities include serving as a peer leader, purchasing a key that unlocks at least one of the multiple media components, and other activities having value to the system. The system may also update the media components individually, or in parallel. In addition, the requirements for unlocking one or more of the media components may vary dynamically, or the media components may vary based on: known individual characteristics of a user in a group of users, group characteristics of a subset of users within a group of users or other criteria. | 07-17-2014 |
20140198028 | DISPLAY PANEL DRIVER, METHOD OF DRIVING DISPLAY PANEL USING THE SAME AND DISPLAY APPARATUS HAVING THE SAME - A display panel driver including a moving image determining part, a sensing part and a dithering part. The moving image determining part determines whether an input image data represents a moving image or a static image. The sensing part senses a movement of a display apparatus or a user of the display apparatus. The dithering part performs a dithering operation on the input image data when the input image data represents the moving image, or when at least one of the display apparatus and the user moves while the input image data represents the static image. | 07-17-2014 |
20140198029 | TECHNIQUES FOR GESTURE RECOGNITION - A method and system may provide gesture recognition techniques. An energy aggregation value may be determined for data corresponding to a gesture using principal component analysis. Based at least in part on the energy aggregation value, it may be determined whether the gesture is a one or two dimensional gesture. | 07-17-2014 |
20140198030 | IMAGE PROJECTION DEVICE, IMAGE PROJECTION SYSTEM, AND CONTROL METHOD - An image projection device for projecting an image on a projection object includes a detecting unit configured to detect a detection object present within an object space corresponding to the image projected by the image projection device; a recognizing unit configured to recognize an instruction motion of a user based on detection of the detection object by the detecting unit; and an output unit configured to generate an output corresponding to the instruction motion recognized within the object space by the recognizing unit from the instruction motion and the image projected by the image projection device. | 07-17-2014 |
20140198031 | PALM GESTURE RECOGNITION METHOD AND DEVICE AS WELL AS HUMAN-MACHINE INTERACTION METHOD AND APPARATUS - Disclosed is a palm gesture recognition method comprising a step of obtaining plural images according to an order of time; a step of acquiring plural palm shaped images from the plural images; a step of extracting plural features describing an open or closed palm gesture from each of the plural palm shaped images; a step of calculating a maximum feature difference vector formed by a maximum difference of each of the plural features; and a step of determining, on the basis of the maximum feature difference vector, whether or not the open or closed palm gesture is present in the plural images. | 07-17-2014 |
20140198032 | METHOD AND APPARATUS FOR DISPLAYING SCREEN WITH EYE TRACKING IN PORTABLE TERMINAL - A method and an apparatus for displaying a screen using eye tracking in a portable terminal are provided. The method includes displaying a message on the display unit, photographing a user's eyeball through the camera unit when the message is displayed to determine a position of the user's eyeball, determining whether the message is read by comparing eye tracking information gathered by the photographing of the user's eyeball with a message position at the time of displaying the message, and distinguishing an unread message from a read message based on the determination. | 07-17-2014 |
20140198033 | HEAD-MOUNTED DISPLAY DEVICE, CONTROL METHOD FOR HEAD-MOUNTED DISPLAY DEVICE, AND IMAGE DISPLAY SYSTEM - A head-mounted display device of a transmission type includes an operation unit, a control unit configured to transmit image data, and an image display unit configured to cause a user to visually recognize a generated image generated on the basis of the transmitted image data from the control unit. The control unit has, as an operation mode of the image display unit, a first mode for causing the user to visually recognize a plurality of selectable images in positions other than the center in an image generable region, which is a region where the image display unit can generate the generated image, and a second mode for causing, when one selectable image is selected out of the plurality of selectable images, the user to visually recognize an associated image associated with the one selectable image as an image larger than the one selectable image in the first mode. | 07-17-2014 |
20140198034 | MUSCLE INTERFACE DEVICE AND METHOD FOR INTERACTING WITH CONTENT DISPLAYED ON WEARABLE HEAD MOUNTED DISPLAYS - There is disclosed a muscle interface device and method for interacting with content displayed on wearable head mounted displays. In an embodiment, the muscle interface device comprises a sensor worn on the forearm of a user, and the sensor is adapted to recognize a plurality of gestures made by a user's hand and or wrist to interact with content displayed on the wearable head mounted display. The muscle interface device utilizes a plurality of sensors, including one or more of capacitive EMG, MMG, and accelerometer sensors, to detect gestures made by a user. The detected user gestures from the sensors are processed into a control signal for allowing the user to interact with content displayed on the wearable head mounted display in a discreet manner. | 07-17-2014 |
20140198035 | WEARABLE MUSCLE INTERFACE SYSTEMS, DEVICES AND METHODS THAT INTERACT WITH CONTENT DISPLAYED ON AN ELECTRONIC DISPLAY - Systems, devices and methods that enable a user to access and interact with content displayed on a portable electronic display in an inconspicuous, hands-free manner are described. There is disclosed a completely wearable system comprising a wearable muscle interface device and a wearable head-mounted display, as well as methods for using the wearable system to effect interactions between the user and content displayed on the wearable head-mounted display. The wearable muscle interface device includes muscle activity sensors worn on an arm of the user to detect muscle activity generated when the user performs a physical gesture. The wearable system is adapted to recognize a plurality of gestures made by the user and, in response to each recognized gesture, to effect one or more interaction(s) with content displayed on the wearable head-mounted display. | 07-17-2014 |
20140198036 | METHOD FOR CONTROLLING A PORTABLE APPARATUS INCLUDING A FLEXIBLE DISPLAY AND THE PORTABLE APPARATUS - An apparatus and method for controlling a portable apparatus including a flexible display are provided. The method includes detecting a bent portion of the flexible display; displaying a User Interface (UI) on the detected bent portion of the flexible display; detecting an input through the UI displayed on the detected bent portion of the display; and performing an operation corresponding to the input detected through the UI displayed on the detected bent portion of the display. | 07-17-2014 |
20140198037 | DEVICE, SYSTEM, AND METHOD FOR LOGGING NEAR FIELD COMMUNICATIONS TAG INTERACTIONS - An apparatus, method and system for categorizing, parsing, grouping and displaying Near Field Communication (NFC) tags for presentation on a user device, including storing, in a computer readable medium, a log of the tags read by or written by the user device, assessing at least one category for each of the logged tags, and displaying each of the logged tags on the user device in conjunction with at least one indicator indicative of the respective at least one category. | 07-17-2014 |
20140198038 | DISPLAY DEVICE, CONTROL METHOD, AND PROGRAM - A display device includes a plurality of displays provided on different surfaces of a body of the display device; a stationary state determination unit that determines whether or not the body is in a stationary state; a visibility determination unit that determines which of the displays is visible to a user if it is determined by the stationary state determination unit that the body is in a stationary state; and a display control unit that displays predetermined display information on at least the display that is determined by the visibility determination unit to be visible. | 07-17-2014 |
20140198039 | MOTOR DRIVE DEVICE - In a motor drive device ( | 07-17-2014 |
20140204013 | PART AND STATE DETECTION FOR GESTURE RECOGNITION - Part and state detection for gesture recognition is useful for human-computer interaction, computer gaming, and other applications where gestures are recognized in real time. In various embodiments a decision forest classifier is used to label image elements of an input image with both part and state labels where part labels identify components of a deformable object, such as finger tips, palm, wrist, lips, laptop lid and where state labels identify configurations of a deformable object such as open, closed, up, down, spread, clenched. In various embodiments the part labels are used to calculate a center of mass of the body parts and the part labels, centers of mass and state labels are used to recognize gestures in real time or near real-time. | 07-24-2014 |
20140204014 | OPTIMIZING SELECTION OF A MEDIA OBJECT TYPE IN WHICH TO PRESENT CONTENT TO A USER OF A DEVICE - A system for optimizing selection of a media object type in which to present content to a user of the device includes a display configured to reproduce visual media type objects associated with the content, a speaker configured to reproduce audio media type objects associated with the content, a detection logic configured to detect whether the user is paying attention to a portion of the display, and a processor configured to determine a media object to present to the user of the device from a selection of media objects including media objects of several different media object types based on whether the user is paying attention to the portion of the display. | 07-24-2014 |
20140204015 | GESTURE RECOGNITION MODULE AND GESTURE RECOGNITION METHOD - A gesture recognition module, for recognizing a gesture of a user, includes a detecting unit, for capturing at least one hand image of a hand of the user, so as to sequentially acquire a first coordinate and a second coordinate; a computing unit, coupled to the detecting unit for defining a first zone and a second zone according to the first coordinate and the second coordinate, respectively, and calculating a first area and a second area according to the first zone and the second zone; and a determining unit, coupled to the detecting unit and the computing unit for recognizing the gesture according to the first coordinate, the second coordinate, the first area and the second area. | 07-24-2014 |
20140204016 | DISPLAY DIMMING IN RESPONSE TO USER - In some embodiments a detector is to detect a body of a user. A controller is to determine an area of focus of the user in response to the detector, and to dim a portion of a display that is not in the area of focus. Other embodiments are described and claimed. | 07-24-2014 |
20140204017 | ELECTRONIC DEVICE AND METHOD FOR CONTROLLING ACCESS TO THE ELECTRONIC DEVICE - In a method for controlling access to an electronic device, the electronic device activates a motion sensor to detect movement parameters of a spatial moving operation of the electronic device when a display device of the electronic device awakes from a sleep mode. Once the movement parameters detected by the motion sensor match predetermined reference parameters, the electronic device is unlocked. | 07-24-2014 |
20140204018 | INPUT METHOD, INPUT DEVICE, AND STORAGE MEDIUM - An input method that is executed by a computer includes obtaining a first image of an object using an imaging device, detecting a first feature point of the object based on a shape of the object in the first image, calculating a first angle of the object with respect to a plane on which a pressing operation is performed based on the first feature point and information on a first area of the object in the first image, and selecting a first input item from a plurality of input items based on the first angle. | 07-24-2014 |
20140204019 | INFORMATION PROCESSING DEVICE, SYSTEM, AND INFORMATION PROCESSING METHOD - An information processing device includes: a first acquiring unit that acquires first information to be used for detecting whether a viewer is present around a display unit displaying an image from a first detector that detects the first information; a second acquiring unit that acquires second information to be used for recognizing an instruction action to the image displayed by the display unit from a second detector that detects the second information; and a deciding unit that decides a recognition method of the instruction action by using the second information, based on the first information acquired by the first acquiring unit. | 07-24-2014 |
20140204020 | INTERACTIVE CONTENT CONTROL DEVICE BASED ON L-SHAPED DISPLAY AND CONTROL METHOD THEREOF - Provided are an interactive content control device and an interactive content control method capable of increasing immersion in content, which include a front display unit that displays content images, a lower side display unit which is configured to form an L shape with the front display unit, displays content images, and interworks with the front display unit, a sensor unit that senses user movements and provides detected sensing information, and a controller that controls interworking of content displayed on the front display unit and the lower side display unit based on the sensing information. Accordingly, the sense of physical distance from the user may be reduced, the user may experience more immersive content, and a variety of interfaces may be utilized in content. | 07-24-2014 |
20140204021 | EYEGLASSES TYPE OPERATION DEVICE, EYEGLASSES TYPE OPERATING SYSTEM AND ELECTRONIC DEVICES - An eyeglasses type operation device comprises a frame configured to be worn on a head of a user, a sensor that outputs a detection signal by sensing, the sensor being disposed inside the frame and a connector operatively connected to the sensor to receive the detection signal, the connector being configured to detachably connect to an external electronic device, and the connector being configured to output the detection signal to the external electronic device when the external electronic device is connected to the connector. | 07-24-2014 |
20140204022 | METHOD OF OPERATING ELECTRONIC DISPLAY DEVICE AND ELECTRONIC DISPLAY DEVICE - A method for operating an electronic display device and the electronic display device are provided. The method may include configuring a plurality of modules of the electronic display device into a stand-by mode, receiving an input corresponding to a feature selected from among a plurality of features of the electronic display device, and initializing, from among the plurality of modules, one or more modules that correspond to the feature selected of the electronic display device, wherein one or more of the plurality of modules respectively correspond to one or more of the plurality of features. | 07-24-2014 |
20140204023 | TRANSPARENT DISPLAY APPARATUS AND METHOD THEREOF - A transparent display apparatus is provided. The transparent display apparatus includes a transparent display which displays information, a sensor which senses background information in a first direction, and a controller configured to modify a display state of the displayed information on the transparent display based on the sensed background information. | 07-24-2014 |
20140204024 | PORTABLE DEVICE AND METHOD FOR CONTROLLING THE SAME - A method for controlling a foldable device, includes detecting, by the foldable device, that folding of the foldable device has occurred, the foldable device including at least one motion sensor configured to detect a movement of the foldable device; determining whether motion sensing correction of the foldable device in a folded state is necessary by detecting whether the folding of the foldable device affects motion sensing of the at least one motion sensor; acquiring at least one motion sensing correction factor for performing the motion sensing correction due to the folding based on the determination result; performing motion sensing correction of the at least one motion sensor using the at least one motion sensing correction factor; and carrying out an operation in the foldable device in the folded state. | 07-24-2014 |
20140210702 | SYSTEMS AND METHODS FOR PRESENTING MESSAGES BASED ON USER ENGAGEMENT WITH A USER DEVICE - Methods and systems are described herein for presenting messages to a user based on user engagement with a user device. A message is received for presentation to a user on the user device. A value indicating an attentiveness level of the user is generated with the user device. The value indicating the attentiveness level of the user is compared with an attentiveness level threshold value. In response to determining the value indicating the attentiveness level of the user does not exceed the attentiveness level threshold value, presentation of the message is delayed until the value indicating the attentiveness level of the user exceeds the attentiveness level threshold value. | 07-31-2014 |
20140210703 | METHOD OF UNLOCKING AND SUBSEQUENT APPLICATION LAUNCH IN PORTABLE ELECTRONIC DEVICE VIA ORIENTATION SENSING - An apparatus and method for unlocking in a portable electronic device via orientation sensing are provided. The device includes a display screen and at least one orientation sensor. The method includes displaying at least one moveable object on the display screen, sensing a change in an orientation of the device, moving the moveable object in accordance with the sensed change in orientation, and unlocking the device if the moveable object moves in accordance with a predetermined movement. | 07-31-2014 |
20140210704 | GESTURE RECOGNIZING AND CONTROLLING METHOD AND DEVICE THEREOF - A gesture recognizing and controlling method and device thereof are provided. The gesture recognizing and controlling method includes the following steps. First, a pending image having depth information is captured, in which the pending image includes a human form image. The human form image is analyzed so as to obtain hand skeleton information having a first skeleton and a second skeleton. It is determined whether the first skeleton and the second skeleton have an intersection point. If yes, it is determined whether an included angle formed by the first skeleton and the second skeleton is within a predetermined angle range. When the included angle is within the predetermined angle range, a controlling signal is output accordingly. | 07-31-2014 |
20140210705 | Method and Apparatus for Controlling Screen by Tracking Head of User Through Camera Module, and Computer-Readable Recording Medium Therefor - Controlling a screen by tracking a user's head using a camera module is described, comprising: (a) when a request for displaying contents is received, displaying a plurality of content regions on a screen and tracking the head of the user looking at the screen through the camera module; and (b) when it is determined in the tracking that the head of the user moves in a specific direction in a state where information on a first content is displayed in a central content region among the plurality of content regions, displaying, in the central content region, information on a second content which was displayed in a peripheral content region, the peripheral content region being positioned in the specific direction or in a direction opposite to the specific direction from the central content region. | 07-31-2014 |
20140210706 | ELECTRONIC DEVICE EMPLOYING A FLEXIBLE DISPLAY AND OPERATING METHOD THEREOF - An electronic device to which a flexible display is applied and an operating method thereof. The content display method allows viewing simple content, such as a current time, a date, or the number of received messages, without unfolding the flexible display when the flexible display is folded for carrying. | 07-31-2014 |
20140210707 | IMAGE CAPTURE SYSTEM AND METHOD - An example of an image capture system includes a support structure and a sensor arrangement, mounted to the support structure, including an image sensor, a lens, and a drive device. The image sensor has a sensor surface with a sensor surface area. The lens forms a focused image generally on the sensor surface. The area of the focused image is larger than the sensor surface area. The drive device is operably coupled to a chosen one of the lens and the image sensor for movement of the chosen one along a path parallel to the focused image. A portion of the viewing area including the object can be imaged onto the sensor surface, and image data of the object can be created by the image sensor to determine information regarding the object. | 07-31-2014 |
20140210708 | ELECTRONIC SYSTEM WITH DISPLAY MODE MECHANISM AND METHOD OF OPERATION THEREOF - An electronic system includes: a control unit configured to: detect a motion applied to a device with a display in a mode; select another mode of the display based on the device motion; and a user interface, coupled to the control unit, configured to apply a screen motion to the another mode of the display. | 07-31-2014 |
20140210709 | BIO SIGNAL BASED MOBILE DEVICE APPLICATIONS - Techniques for providing bio signal based mobile device applications are disclosed. In some embodiments, a system for bio signal based mobile device applications includes a bio signal sensor (e.g., biosensor) or multiple biosensors (e.g., multiple biosensors for EEG detection, or multiple biosensors that can each detect different types of bio signals), a bio signal processing unit, the mobile device, and a software application(s) that utilize the bio signal information for various applications (e.g., practical applications, entertainment applications, social networking applications, and/or other applications). | 07-31-2014 |
20140210710 | METHOD FOR GENERATING AN AUGMENTED REALITY CONTENT AND TERMINAL USING THE SAME - A method for generating augmented reality content includes receiving an augmented reality mode execution request for generating augmented reality content based on electronic book content, activating a camera in response to the request to provide a preview image, loading book data of the electronic book on the preview image, detecting an event occurring in the augmented reality mode, retrieving reaction data pre-mapped to the event, and loading the reaction data on the preview image. | 07-31-2014 |
20140218278 | PORTABLE DEVICE AND OPERATION METHOD THEREFOR - A portable device and an operating method thereof are provided, wherein the portable device includes a key and a display, and the display is in a turned-off status. The operation method includes sensing an operating motion via the key, and producing a control signal to control the portable device according to the operating motion. The control signal turns on the display to show a read-only frame when the operating motion only includes a contact motion without inducing a physical displacement of the key. Alternatively, the control signal turns on the display to show an operational frame when the operating motion contains a contact motion and a physical motion inducing the physical displacement of the key. | 08-07-2014 |
20140218279 | STORAGE MEDIUM STORING INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING SYSTEM, AND INFORMATION PROCESSING METHOD - In accordance with an attitude of a portable display device calculated based on data output from a sensor and an operation made on an operation unit, in a display target image, a first display range in which a part of the display target image is to be displayed on the portable display device is set, and in the display target image, an image in the first display range is displayed on the portable display device. | 08-07-2014 |
20140218280 | PERIPHERAL DEVICE WITH MULTI-TRANSMISSION CAPABILITY - The present disclosure provides a wireless peripheral device with multi-transmission capability. The wireless peripheral device wirelessly transmits a control signal to a wireless receiver. The wireless peripheral device comprises a first wireless transmitting unit, a second transmitting unit, and a processing unit. The second transmitting unit receives a responding signal from the wireless receiver. The transmission range of the first transmitting unit is larger than the transmission range of the second transmitting unit. The processing unit is coupled to the first transmitting unit and the second transmitting unit. The processing unit transmits the control signal through the second transmitting unit to the wireless receiver when the strength of the responding signal is in a high intensity range. The processing unit transmits the control signal through the first transmitting unit to the wireless receiver when the strength of the responding signal is in a low intensity range. | 08-07-2014 |
20140218281 | SYSTEMS AND METHODS FOR EYE GAZE DETERMINATION - Devices and methods are provided for eye and gaze tracking determination. In one embodiment, a method for compensating for movement of a wearable eye tracking device relative to a user's eye is provided that includes wearing a wearable device on a user's head such that one or more endo-cameras are positioned to acquire images of one or both of the user's eyes, and an exo-camera is positioned to acquire images of the user's surroundings; calculating the location of features in a user's eye that cannot be directly observed from images of the eye acquired by an endo-camera; and spatially transforming camera coordinate systems of the exo- and endo-cameras to place calculated eye features in a known location and alignment. | 08-07-2014 |
20140218282 | Medical Cart - A medical cart includes a cart body, a top platform, and an elevating device. The elevating device is mounted on the cart body for supporting the top platform above the cart body and is for moving the top platform upwardly and downwardly. The elevating device includes a sleeve, a screw rod, a motor and a motor driving module. The screw rod extends through the sleeve and engages threadedly a threaded inner surface of the sleeve. The motor is connected to the screw rod and is for driving the screw rod to rotate and move with respect to the sleeve, such that the top platform is driven by the screw rod to move upwardly and downwardly with respect to the cart body. | 08-07-2014 |
20140218283 | METHOD OF CONTROLLING AN OPERATION OF A CAMERA APPARATUS AND A CAMERA APPARATUS - A method and apparatus for controlling an operation of a camera that allows a user to conveniently control a camera apparatus according to a gesture of a subject input through a lens of a camera, and the camera apparatus are provided. The method includes receiving an image input through a camera lens; generating an image frame; detecting a motion of a subject included in the image frame by comparing the image frame with at least one previous frame stored before the image frame is generated; determining whether the motion of the subject is a User Interface (UI) gesture; and performing, if the motion is the UI gesture, an operation corresponding to the UI gesture. | 08-07-2014 |
20140218284 | INTEGRATED FRONT LIGHT SOLUTION - An integrated illumination apparatus includes a light injection portion having a first end for receiving light. The light injection portion supports propagation of light along the length of the light injection portion. Turning microstructure disposed on a first side of the light injection portion is configured to turn light incident on the first side and to direct light out a second opposite side of the light injection portion. A slit disposed along the length of the light injection portion forms an optical interface on the second opposite side of the light injection portion. The optical interface further transmits light turned by the turning microstructure. A light distribution portion is disposed with respect to the slit to receive the light transmitted through the slit. At least one bridge mechanically connects the light injection portion to the light distribution portion. | 08-07-2014 |
20140218285 | Method and Apparatus for Measuring Audience Size for a Digital Sign - Measuring audience size for a digital sign comprises generating a plurality of paths, one for each face detected in a first sequence of video frames captured by a camera proximate the digital sign, and generating a zone in the sequence of video frames through which passes a threshold number of the paths. Motion and direction of motion within the zone is then measured in a second sequence of video frames to calculate the audience size that passes through the zone in the second sequence of video frames. | 08-07-2014 |
20140218286 | LAST SCREEN RENDERING FOR ELECTRONIC BOOK READER - A handheld dedicated electronic book (“eBook”) reader device and last screen rendering techniques for enhancing user experience are described. The eBook reader device detects certain screen conversion events, such as a timeout period, a scheduled event, or an event derived from user behavior. Upon detection of such events, the eBook reader device renders, as the last screen image to remain visible after the user ceases using the device, an image that conveys to the user some meaningful association with a content item. In the context of eBooks, the eBook reader device renders a representation of the book cover as the last screen image. A progress indicator may further be included to represent user progress through the content item. | 08-07-2014 |
20140218287 | System and Method for Controlling a Display of a Mobile Device - A method and system are provided for controlling the display of a mobile device by: capturing an image using a camera device of the mobile device, the camera device being directed in a same direction as a display of the mobile device, the image comprising one or more subjects (e.g. users or other humans seen in the image); determining a point of regard in the image for at least one of the one or more subjects, the point of regard being indicative of an area on the display at which a gaze of the corresponding subject is directed; determining, based on the point of regard, an instruction for controlling the display; and controlling the display according to the instruction, wherein controlling the display includes reducing visibility of at least one portion of what is displayed on the display. | 08-07-2014 |
20140218288 | DISPLAY DEVICE, DISPLAY CONTROL METHOD, AND PROGRAM - A display device includes a device body | 08-07-2014 |
20140232630 | Transparent Display Field of View Region Determination - A method and system for determining a field of view region on a transparent display is provided. The method includes receiving from a user facing device, a user image. User image key features of the user image are identified and user image attributes of said image key features are analyzed. An object image of objects is received and a first object is identified. Object key features of the first object are identified and object attributes of object key features are analyzed. A specified position on a transparent display for displaying a first image associated with the first object is determined. The first image is displayed at the specified position on the transparent display. | 08-21-2014 |
20140232631 | MODEL-BASED MULTI-HYPOTHESIS TARGET TRACKER - The present disclosure describes a target tracker that evaluates frames of data of one or more targets, such as a body part, body, and/or object, acquired by a depth camera. Positions of the joints of the target(s) in the previous frame and the data from a current frame are used to determine the positions of the joints of the target(s) in the current frame. To perform this task, the tracker proposes several hypotheses and then evaluates the data to validate the respective hypotheses. The hypothesis that best fits the data generated by the depth camera is selected, and the joints of the target(s) are mapped accordingly. | 08-21-2014 |
20140232632 | INTERACTIVE BADGE - The subject disclosure is directed towards a wearable interactive device, such as a wearable identity badge. When a user moves the device, such as to position a display (e.g., part) of the device a sensed distance at a sensed horizontal and vertical angle, the device outputs content that is based on the position. Context data also may be used in determining the content to output, as well as any other sensed data that may be available. | 08-21-2014 |
20140232633 | APPARATUS AND METHOD FOR AUTOMATICALLY ACTIVATING A CAMERA APPLICATION BASED ON DETECTING AN INTENT TO CAPTURE A PHOTOGRAPH OR A VIDEO - A method of automatically activating a camera application implemented in a mobile device in locked mode starts with the processor receiving a first signal from an accelerometer. The device's processor activates the camera application when the processor determines that the mobile device has remained in a stationary portrait or landscape position for a period of time based on the first signal. Activating the camera application includes signaling by the processor to the display device to display a camera screen from a locked screen. The processor may also receive a second signal from a proximity sensor that detects presence of a nearby object to the mobile device. When the processor determines that there is presence of the nearby object to the mobile device based on the second signal, the mobile device remains in locked mode and the processor does not activate the camera application. | 08-21-2014 |
20140232634 | TOUCH-BASED GESTURES MODIFIED BY GYROSCOPE AND ACCELEROMETER - A mobile device including a touchscreen display presents an image of a three-dimensional object. The display can concurrently present a user interface element that can be in the form of a virtual button. While the device's user touches and maintains fingertip contact with the virtual button via the touchscreen, the mobile device can operate in a special mode in which physical tilting of the mobile device about physical spatial axes causes the mobile device to adjust the presentation of the image of the three-dimensional object on the display, causing the object to be rendered from different viewpoints in the virtual space that the object virtually occupies. The mobile device can detect such physical tilting based on feedback from a gyroscope and accelerometer contained within the device. | 08-21-2014 |
20140232635 | MOBILE DEVICES FOR TRANSMITTING AND RECEIVING DATA USING GESTURE - A mobile device configured for data transmission to a corresponding mobile device is provided. The mobile device may include a gesture input unit configured to receive a gesture, a gesture determination unit configured to determine whether the gesture corresponds to a preset gesture associated with a command to perform data transmission to the corresponding mobile device, and a data communication unit configured to transmit a data transmission request to the corresponding mobile device based on a result of the determination, configured to receive, from the corresponding mobile device, an acceptance signal indicating an input of an acceptance gesture at the corresponding mobile device, and configured to transmit data to the corresponding mobile device in response to receiving the acceptance signal. | 08-21-2014 |
20140232636 | IMAGE PROCESSING DEVICE, IMAGE PROCESSING METHOD - An image processing device includes, a processor; and a memory which stores a plurality of instructions, which when executed by the processor, cause the processor to execute, acquiring an image including a subject recognized and a movement site of a user; recognizing the subject recognized and the movement site from the image; and controlling a display location, in the image, of an additional information image corresponding to the subject recognized, to a location other than the subject recognized and the movement site. | 08-21-2014 |
20140232637 | HEAD MOUNTED DISPLAY APPARATUS AND CONTENTS DISPLAY METHOD - Disclosed are a head-mounted display apparatus and a contents display method. The head-mounted display apparatus includes: a mobile device tracing information processing unit for receiving position information or orientation information of a mobile device and generating tracing information of the mobile device based on the received position information or orientation information of the mobile device; a gesture processing unit for receiving input information of the mobile device and generating gesture information to change an output format of contents by using the received input information; a rendering unit for generating a predetermined virtually augmented contents image based on the tracing information of the mobile device and the gesture information; and a display unit for displaying the generated virtually augmented contents image. | 08-21-2014 |
20140232638 | METHOD AND APPARATUS FOR USER INTERFACE USING GAZE INTERACTION - A method and apparatus for a user interface using a gaze interaction is disclosed. The method for the user interface using the gaze interaction may include obtaining an image including eyes of a user, estimating a gaze position of the user, using the image including the eyes of the user, and determining whether to activate a gaze adjustment function for controlling a device by a gaze of the user, based on the gaze position of the user with respect to at least one toggle area on a display. | 08-21-2014 |
20140232639 | INFORMATION PROCESSING APPARATUS AND STORAGE MEDIUM - There is provided an information processing apparatus including a detection unit configured to detect a gaze point of a user in a display image displayed on a display unit, an estimation unit configured to estimate an intention of the user based on the gaze point detected by the detection unit, an image generation unit configured to generate a varying image that subtly varies from the display image to a final display image according to the intention estimated by the estimation unit, and a display control unit configured to control the display unit in a manner that the varying image generated by the image generation unit is displayed. | 08-21-2014 |
20140232640 | DYNAMIC RANGE RESETTING - A system and method of determining that a movement of an eye, head or body part, as is captured in a series of images, is a movement intended to activate a function, by defining ranges within which the movement is deemed as not intended for function activation, and beyond which the movement is deemed as intended for function activation. The ranges may be adjusted dynamically to account for shifts in a resting position of the eye, head or body part. | 08-21-2014 |
20140232641 | INFORMATION PROCESSING DEVICE AND CONTROL METHOD FOR INFORMATION PROCESSING DEVICE - An information processing device includes an information acquisition unit that acquires at least one of specific information regarding an environment where the information processing device is used, biological information of a user who uses the information processing device, and control information for controlling the information processing device, and a control unit that controls the information processing device on the basis of at least one of the acquired specific information, biological information, and control information. | 08-21-2014 |
20140232642 | Method of Temporal Segmentation of an Instrumented Gesture, Associated Device and Terminal - Temporally segmenting an instrumented gesture executed by a user with a terminal having an inertial navigation module, which measures a vector of inertial characteristics representative of movement of the terminal. Segmenting includes, at each current instant: calculating an instantaneous power value of the vector; estimating a gesture indicator based on variation between the instantaneous power value and a mean power value estimated over a preceding time window; determining a start of gesture at a first instant, when the estimated gesture indicator is greater than or equal to a first threshold during a time interval greater than or equal to a first interval; and determining an end of gesture at a second instant when, at the current instant, the estimated gesture indicator is less than or equal to a second threshold during a time interval greater than or equal to a second time interval. | 08-21-2014 |
20140232643 | LASER PROJECTION DEVICE AND METHOD - A laser projection device is described having a projection unit, which is designed to project an image and to generate an interrupt signal as a function of a distance of the projection unit from at least one object located in front of the projection unit in the projection direction, and having an application control unit, which is designed to control the projection unit as a function of the interrupt signal. A method is also described. | 08-21-2014 |
20140232644 | Gesture Control - Gesture control uses electromagnetic power signatures. A signal is received and a power of the signal is determined. The power is associated to a command, and the command is executed in response to a gesture. | 08-21-2014 |
20140232645 | METHOD AND APPARATUS FOR PROVIDING INPUT THROUGH AN APPARATUS CONFIGURED TO PROVIDE FOR DISPLAY OF AN IMAGE - Provided herein is a technique by which a user may interact with an apparatus configured to provide for display of an image, such as with augmented reality glasses. An example embodiment may provide a method including receiving an indication of a first motion event initiated on a first side of a device from a motion sensor, determining a first motion event pattern based on one or more directional components of the first motion event, distinguishing the first motion event from a motion event initiated on a second side of the device, correlating a first operation with the first motion event pattern, and causing the first operation to be performed. The first operation may include causing the opacity of an image presented on a substantially transparent display to be increased. | 08-21-2014 |
20140232646 | DIELECTRIC ELASTOMER MEMBRANE FEEDBACK APPARATUS, SYSTEM AND METHOD - A feedback enabled system, module, and method are disclosed. The feedback enabled system comprises a first feedback module. The first feedback module comprises a membrane (thin film); a frame; a motion coupling, wherein when a voltage is applied to the membrane (thin film), the motion coupling exerts a force on the frame to provide feedback; and a user interface, wherein the first feedback module is configured to provide feedback through the user interface. The method comprises applying a first voltage with a first waveform to a first feedback module, the first feedback module comprising a dielectric elastomer membrane (thin film), a frame, and a motion coupling, wherein, when the first voltage is applied to the dielectric elastomer membrane (thin film), the motion coupling exerts a force on the frame. | 08-21-2014 |
20140232647 | AUTO-STEREOSCOPIC DISPLAY CONTROL - An apparatus, a method and a non-transitory computer readable medium are provided. The apparatus includes: at least one processor; and at least one memory storing computer program instructions configured, working with the at least one processor, to cause the apparatus to perform at least the following: detecting bending of a flexible auto-stereoscopic display comprising a parallax barrier arrangement; and compensating for movement of the parallax barrier arrangement, caused by the bending of the flexible auto-stereoscopic display, by adjusting one or more characteristics of the flexible auto-stereoscopic display in dependence upon the bending of the flexible auto-stereoscopic display. | 08-21-2014 |
20140232648 | DISPLAY APPARATUS AND CONTENTS DISPLAY METHOD - Disclosed are a display apparatus and a contents display method. The display apparatus includes: a mobile device tracing information processing unit for receiving position information or orientation information of a mobile device and generating tracing information of the mobile device based on the received position information or orientation information of the mobile device; a gesture processing unit for receiving input information of the mobile device and generating gesture information to change an output format of contents by using the received input information; a rendering unit for generating a predetermined contents image based on the tracing information of the mobile device and the gesture information; and a display unit for displaying the generated contents image. | 08-21-2014 |
20140240212 | TRACKING DEVICE TILT CALIBRATION USING A VISION SYSTEM - A system and method for displaying digital graphics on a computer's display are disclosed. The method includes the steps of connecting a vision system to the computer, wherein the vision system is adapted to monitor a visual space. The method further includes the steps of detecting, by the vision system, a tracking object in the visual space, the tracking object having an at-rest tilt angle, and outputting, by the vision system to the computer, spatial coordinate data representative of the location of the tracking object within the visual space. The method further includes the steps of executing a graphics application program, mapping a horizontal and vertical portion of the spatial coordinate data to a display connected to the computer, and calibrating the tracking object to establish the at-rest tilt angle as a default value in the graphics application program. | 08-28-2014 |
20140240213 | MULTI-RESOLUTION GESTURE RECOGNITION - Systems and methods for detecting and analyzing bodies visible to an input source are disclosed herein. Analysis of detected bodies can be used in gesture recognition. In particular embodiments, detection and recognition of hand and portions thereof, including fingers and portions thereof (e.g., fingertips) can be performed to recognize gestures related to the hand. Recognition and analysis of bodies visible to an input source can occur using, for example, blob detection techniques, machine learning techniques, and various calculative techniques including statistical analysis, application-specific equations, and estimations related to bodies anticipated to be visible to the input source. | 08-28-2014 |
20140240214 | Glove Interface Apparatus for Computer-Based Devices - A glove interface apparatus for computer-based devices includes a glove having a plurality of contacts and a thumb contact that together render a completed electrical circuit when the thumb contact touches any of the plurality of contacts. Each of the resulting circuits is configured to present a unique voltage which is coupled to a processor which determines a character signal representative of the unique voltage and transmits that signal to a compatible computer-based device. | 08-28-2014 |
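The voltage-to-character scheme in the entry above can be illustrated with a short sketch (not taken from the patent; the voltage levels, tolerance, and characters are hypothetical): each completed circuit presents a unique voltage, and the processor maps the measured voltage to the nearest calibrated level.

```python
# Illustrative sketch of glove-contact decoding. The calibration table and
# tolerance below are hypothetical, not values from the patent.
VOLTAGE_TO_CHAR = {0.5: 'a', 1.0: 'b', 1.5: 'c', 2.0: 'd'}

def decode_character(measured_voltage, tolerance=0.1):
    """Return the character whose calibrated voltage level is closest to
    the measurement, or None if no level lies within the tolerance."""
    level = min(VOLTAGE_TO_CHAR, key=lambda v: abs(v - measured_voltage))
    if abs(level - measured_voltage) <= tolerance:
        return VOLTAGE_TO_CHAR[level]
    return None
```

The tolerance rejects readings taken mid-contact or from an open circuit, so only a stable, calibrated voltage produces a character signal.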
20140240215 | SYSTEM AND METHOD FOR CONTROLLING A USER INTERFACE UTILITY USING A VISION SYSTEM - A method for controlling a user interface utility in a graphics application program executing on a computer is disclosed. The method includes a step of connecting a vision system to the computer, wherein the vision system is adapted to monitor a visual space. The method further includes a step of detecting, by the vision system, a tracking object in the visual space. The method further includes a step of executing, by the computer, a graphics application program, and outputting, by the vision system to the computer, spatial coordinate data representative of the location of the tracking object within the visual space. The method further includes a step of controlling, with the spatial coordinate data output by the vision system, the rendering of a user interface utility within the graphics application program to a display connected to the computer. | 08-28-2014 |
20140240216 | Devices And Methods For Displaying Data In Response To Detected Events - A method and a device are disclosed, whereby the device can be a first electronic device. The first electronic device is adapted to be coupled to a second electronic device. The method comprises, at the first electronic device, certain steps which can be performed by a processor of the first electronic device. In particular, there is a step of generating first data for display at the first electronic device pertaining to an application executable on the first electronic device. Also, there is the step of receiving a signal at the first electronic device from a second electronic device in response to an event detected at the second electronic device. Finally, in response to the received signal, there is a step of modifying the first data, generating second data for display at the second electronic device pertaining to the application, and sending the second data to the second electronic device. | 08-28-2014 |
20140240217 | SYSTEM COMPRISING AN OCULOMETER, METHOD IMPLEMENTED ON SUCH A SYSTEM AND CORRESPONDING COMPUTER PROGRAM PRODUCT - The invention relates to a system comprising an oculometer for measuring movements of at least one eye of an operator, wherein it comprises furthermore a display for displaying, for the eye, a frame of at least one point performing a blinking, so that the operator can perform at least one smooth tracking movement of the eye while looking at the display while operating. The invention also relates to a method implemented on such a system, and a corresponding computer program product. | 08-28-2014 |
20140240218 | HEAD TRACKING METHOD AND DEVICE - The invention extends to a tracking device for tracking a position of a moving object such as a human head or eyes, the device comprising a camera, a radiation source radiating electro-magnetic radiation, and a processor for calculating variables indicative of the position of an object relative to the camera, wherein the camera is adapted to capture images using illumination provided by the radiation source, wherein the radiation source comprises a source of infrared radiation and the camera comprises a monocular image input. Further aspects of the invention relate to an associated method for tracking a moving object; to quickly sorting a set of competing models of the user's head; to the use of threshold conversion to distinguish characteristics of captured images; and to controlling the output of a three dimensional display in dependence on the tracked position of a user's head. | 08-28-2014 |
20140240219 | DISPLAY APPARATUS - A display apparatus and/or the method using the same which provide user interaction are provided. The display apparatus includes: a display body including an image display surface; and a photographing unit including a camera module to photograph a user to sense a user's gesture, and placed in a first position in which the photographing unit is accommodated in the display body when the photographing unit does not photograph, and placed in a second position in which the photographing unit protrudes out of the display body when the photographing unit photographs. | 08-28-2014 |
20140240220 | METHOD FOR CONTROLLING OPERATION AND ELECTRONIC DEVICE THEREOF - A method and apparatus for controlling an operation of an electronic device are provided. The method for controlling the operation of the electronic device includes, when an event for controlling an operation of the electronic device using a feature occurs, analyzing images being input from a camera in real-time, determining a reference position based on a position of the feature within at least one image among the images being input, determining a reference region for determining the movement or non-movement of the feature, based on the determined reference position, and changing the reference position according to a position of the feature moved within the reference region. | 08-28-2014 |
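The reference-region logic described in the entry above can be sketched as follows (an illustrative approximation, not the patented algorithm; a circular region and 2D feature positions are assumptions): while the feature stays inside the reference region, the reference position follows it without triggering a movement event; leaving the region is reported as movement.

```python
# Illustrative sketch: track a feature against a reference position with a
# circular reference region (radius in pixels; both values are assumptions).
def update_reference(feature_pos, ref_pos, region_radius):
    """Return (new_reference_position, movement_detected). Inside the
    region the reference re-centers on the feature with no movement
    event; outside, the reference is kept and movement is reported."""
    dx = feature_pos[0] - ref_pos[0]
    dy = feature_pos[1] - ref_pos[1]
    inside = (dx * dx + dy * dy) ** 0.5 <= region_radius
    if inside:
        return feature_pos, False
    return ref_pos, True
```

This kind of dead-band keeps small jitter in the camera-tracked feature from being misread as an intentional movement.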
20140240221 | Electronic Apparatus And Method - An electronic apparatus and a method are disclosed. The electronic apparatus can be worn on a wrist of a user and can establish a data connection to a remote apparatus. The electronic apparatus includes a first image collecting unit for collecting an image; a storing unit for storing an image collected by the first image collecting unit; a gesture detecting unit for detecting a gesture of the user; a data transmitting unit for transmitting data with the remote apparatus; and a control unit for activating the first image collecting unit when it is determined that a first gesture of the user is detected, and activating the data transmitting unit to transmit the image stored in the storing unit to the remote apparatus when it is determined that a second gesture of the user is detected. | 08-28-2014 |
20140240222 | IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND NON-TRANSITORY COMPUTER READABLE MEDIUM - According to one embodiment, a photographing image including a user region is acquired. The user region is detected from the photographing image. The user region in the photographing image is changed so that the user's orientation in the user region cannot be determined, and the photographing image in which the user region is changed is acquired as a change image. A position of the user region in the photographing image is calculated. An image of a back-side cloth is composited with the user region in the change image at the calculated position. | 08-28-2014 |
20140240223 | METHOD AND APPARATUS FOR ANALYZING CAPACITIVE EMG AND IMU SENSOR SIGNALS FOR GESTURE CONTROL - There is disclosed a muscle interface device for use with controllable connected devices. In an embodiment, the muscle interface device comprises a sensor worn on the forearm of a user, and the sensor is adapted to recognize a plurality of gestures made by a user to interact with a controllable connected device. The muscle interface device utilizes a plurality of sensors, including one or more of capacitive EMG sensors and an IMU sensor, to detect gestures made by a user. Other types of sensors including MMG sensors may also be used. The detected user gestures from the sensors are processed into a control signal for allowing the user to interact with content displayed on the controllable connected device. | 08-28-2014 |
20140240224 | Display Method And Electronic Device - A display method and an electronic device are described. The display method is applied in an electronic device that includes a display unit and at least one sensing unit. The method includes obtaining first sensing data by a first sensing unit; determining first display information according to the first sensing data; judging whether a predetermined condition is met or not to obtain a judgment result; and selecting a first mode or a second mode to be used to determine a display position of the first display information on the display unit according to the judgment result, wherein the first mode is different from the second mode; and displaying the first display information at the display position. | 08-28-2014 |
20140240225 | METHOD FOR TOUCHLESS CONTROL OF A DEVICE - The invention relates to a system and method for computer vision based control of a device which includes, in one embodiment, obtaining a series of images of a field of view, the field of view including a user's hand; identifying a shape of the user's hand in an image from the series of images; identifying a location within the image of the user's hand in a pre-defined shape; and controlling the device based on an overlap of the location of the hand in the pre-defined shape and a location of an object in the image. | 08-28-2014 |
20140240226 | User Interface Apparatus - A user interface apparatus includes a support structure, a display, a memory, and a processor. The support structure is configured to be supported on the head of a user. The display is supported by the support structure. The memory is supported by the support structure and includes program instructions. The processor is supported by the support structure and is operably connected to the display and to the memory. The processor is configured to execute the program instructions to (i) establish a communication link with a configurable device, (ii) receive interface data from the configurable device, (iii) generate optimized display data using the received interface data, (iv) render the optimized display data on the display, (v) receive a user selection of the rendered optimized display data, and (vi) transmit a control signal to the configurable device based upon the received user selection. | 08-28-2014 |
20140240227 | SYSTEM AND METHOD FOR CALIBRATING A TRACKING OBJECT IN A VISION SYSTEM - A method for calibrating a tracking object for use in a graphics application program executing on a computer is disclosed. The method includes a step of connecting a vision system to the computer, wherein the vision system is adapted to monitor a visual space. The method further includes a step of detecting, by the vision system, a tracking object in the visual space. The tracking object has an arcuate motion when guided by a user. The method further includes a step of executing, by the computer, a graphics application program, and outputting, by the vision system to the computer, spatial coordinate data representative of the location of the tracking object within the visual space. The method further includes a step of calibrating the tracking object to compensate for the arcuate motion. | 08-28-2014 |
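The at-rest tilt calibration in the entry above amounts to establishing a default angle and reporting subsequent tilt relative to it. A minimal sketch, assuming tilt is reported as a single angle in degrees (the patent does not specify this representation):

```python
# Illustrative sketch of at-rest tilt calibration for a tracked object.
# A single tilt angle in degrees is an assumption for this example.
def calibrate_at_rest_tilt(samples):
    """Average tilt-angle samples taken while the tracking object is at
    rest to establish the default (at-rest) tilt."""
    return sum(samples) / len(samples)

def corrected_tilt(raw_tilt, at_rest_tilt):
    """Report tilt relative to the calibrated at-rest default, so a
    resting object reads as zero tilt in the graphics application."""
    return raw_tilt - at_rest_tilt
```

Averaging several at-rest samples rather than using a single reading smooths out sensor noise before the default value is stored.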
20140240228 | USER INTERFACE DISPLAY DEVICE - An optical panel having an image-forming function is disposed in parallel with a virtual horizontal plane so that the optical axis thereof is orthogonal to the virtual horizontal plane. A flat panel display is disposed in offset relation below the optical panel such that a display surface of the flat panel display is inclined at a predetermined angle with respect to the virtual horizontal plane and faces upward. A light source for projecting light toward a hand, and a camera for imaging the reflection of the light from the hand, are provided below or above a spatial image formed above the optical panel. This provides a user interface display device with no structure obstructing manipulation around the spatial image projected in space, so that the operator can interact with the spatial image naturally using a hand. | 08-28-2014 |
20140247206 | ADAPTIVE SENSOR SAMPLING FOR POWER EFFICIENT CONTEXT AWARE INFERENCES - Disclosed is a system, apparatus, computer readable storage medium, and method to perform a context inference for a mobile device. In one embodiment, a data processing system includes a processor and a storage device configurable to store instructions to perform a context inference for the data processing system. Data may be received from at least a first sensor, and a first classification of the data from the sensor may be performed. Confidence for the first classification can be determined, and a second sensor can be activated based on a determination that the confidence fails to meet a confidence threshold. A data sample from the activated second sensor may be classified jointly with the data from the first sensor. | 09-04-2014 |
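The confidence-gated fallback to a second sensor described above can be sketched as below (illustrative only; the classifier interface, threshold, and sensor callback are assumptions, not the patent's implementation). The second sensor stays powered down unless the first classification is uncertain:

```python
# Illustrative sketch of power-efficient adaptive sensor sampling.
# `classify` takes a list of sensor samples and returns (label, confidence);
# `read_secondary` powers up and reads the second sensor. Both interfaces
# are hypothetical.
def classify_with_fallback(primary_sample, classify, read_secondary,
                           threshold=0.8):
    """Classify using the primary sensor alone; if confidence falls below
    the threshold, activate the second sensor and classify jointly."""
    label, confidence = classify([primary_sample])
    if confidence >= threshold:
        return label                      # second sensor never activated
    secondary_sample = read_secondary()   # power up only when needed
    label, _ = classify([primary_sample, secondary_sample])
    return label
```

Keeping the second sensor off in the confident case is where the power saving comes from: the joint classification path runs only for ambiguous inputs.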
20140247207 | Causing Specific Location of an Object Provided to a Device - Techniques for causing a specific location of an object provided to a shared device. These techniques may include connecting the shared device with an individual device. The individual device may transmit the object to the shared device, where it is displayed at an initial object position on a display of the shared device. The initial object position may be updated in response to movement of the individual device, and the object may be displayed at the updated object position on the display. The object position may be locked in response to a signal, and the object may be displayed at the locked object position. | 09-04-2014 |
20140247208 | INVOKING AND WAKING A COMPUTING DEVICE FROM STAND-BY MODE BASED ON GAZE DETECTION - Waking a computing device from a stand-by mode may include determining a wake zone relative to a display device and, when the computing device is in stand-by mode, detecting a gaze point relative to the display device. In response to determining that the gaze point is within the wake zone, a wake command is generated and passed to a program module, such as the operating system, to cause the program module to wake the computing device from the stand-by mode. When the computing device is not in stand-by mode, another gaze point may be detected and, in response to determining that the other gaze point is within the vicinity of a selectable stand-by icon, the stand-by command is generated and passed to the program module to cause the program module to place the computing device into the stand-by mode. | 09-04-2014 |
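The wake-zone test described above reduces to a point-in-region check plus a mode guard. A minimal sketch, assuming a rectangular wake zone in display coordinates (the patent does not restrict the zone's shape):

```python
# Illustrative sketch of gaze-driven wake-from-standby. The rectangular
# wake zone and coordinate convention are assumptions for this example.
def in_wake_zone(gaze_point, wake_zone):
    """Check whether a gaze point (x, y) falls inside the rectangular
    wake zone (left, top, right, bottom) in display coordinates."""
    x, y = gaze_point
    left, top, right, bottom = wake_zone
    return left <= x <= right and top <= y <= bottom

def handle_gaze(gaze_point, wake_zone, in_standby):
    """Return the command to pass to the program module, if any: 'wake'
    only when the device is in standby and gaze lands in the wake zone."""
    if in_standby and in_wake_zone(gaze_point, wake_zone):
        return 'wake'
    return None
```

The standby guard matters: the same gaze coordinates produce no command when the device is already awake, matching the abstract's separate handling of the standby icon.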
20140247209 | METHOD, SYSTEM, AND APPARATUS FOR IMAGE PROJECTION - A projection system includes: a projecting unit that projects an image; a recognition unit that recognizes an instruction action performed by a target person toward an image projected by the projecting unit and a target object based on detection information obtained by a detection apparatus; a determination unit that determines a processing condition to be applied to the image based on a recognition result by the recognition unit; a processing unit that processes the image according to the processing condition determined by the determination unit; and a control unit that controls image projection performed by the projection unit based on the image processed by the processing unit. | 09-04-2014 |
20140247210 | ZONAL GAZE DRIVEN INTERACTION - A computer system can be controlled with non-contact inputs through zonal control. In an embodiment, a non-contact input that is an eye-tracking device is used to track the gaze of a user. A computer's display, and beyond, can be separated into a number of discrete zones according to a configuration. Each zone is associated with a computer function. The zones and/or their functions can, but need not, be indicated to the user. The user can perform the various functions by moving gaze towards the zone associated with that function and providing an activation signal of intent. The activation signal of intent can be a contact-required or non-contact action, such as a button press or dwelling gaze, respectively. | 09-04-2014 |
20140247211 | IMAGE DISPLAY APPARATUS - An image display apparatus capable of ensuring a wide view range is provided. The image display apparatus includes: an image display panel; a backlight located on a back surface side of the image display panel; prisms located between the image display panel and the backlight and configured to deflect incident light; a liquid crystal layer located between the image display panel and the backlight and configured to change a deflection direction of emitted light by changing a refractive index thereof according to a voltage applied thereto; a position detection section configured to detect a position of a user; and a control section configured to control the voltage applied to the liquid crystal layer, on the basis of information of the position of the user detected by the position detection section. | 09-04-2014 |
20140247212 | Gesture Recognition Techniques - In one or more implementations, a static geometry model is generated, from one or more images of a physical environment captured using a camera, using one or more static objects to model corresponding one or more objects in the physical environment. Interaction of a dynamic object with at least one of the static objects is identified by analyzing at least one image and a gesture is recognized from the identified interaction of the dynamic object with the at least one of the static objects to initiate an operation of the computing device. | 09-04-2014 |
20140253427 | GESTURE BASED COMMANDS - Techniques described herein enable a computing device to detect a user hand gesture in near field using less expensive hardware by capturing enough information in coarse resolution. Furthermore, techniques described herein detect an object, such as a hand by characterizing the object without fully reconstructing the object. In one embodiment, embodiments of the invention performed by the computing device detect the number of unfurled fingers passing through the detectable region of the field of view of the detection surface of a computing device and the direction of the movement of the user's hand. The computing device may determine a user command in response to detecting the user gesture and provide feedback to the user. | 09-11-2014 |
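Mapping a coarse gesture (unfurled finger count plus movement direction) to a command, as the entry above describes, can be sketched with a simple lookup table. The gestures and commands below are hypothetical, not taken from the patent:

```python
# Illustrative sketch: resolve a coarse near-field gesture, characterized
# only by finger count and swipe direction, to a device command.
# The table entries are hypothetical examples.
GESTURE_COMMANDS = {
    (2, 'left'): 'previous_page',
    (2, 'right'): 'next_page',
    (5, 'up'): 'volume_up',
    (5, 'down'): 'volume_down',
}

def command_for_gesture(finger_count, direction):
    """Resolve a (finger count, direction) pair to a command, or None if
    the gesture is not recognized (so the device can give feedback)."""
    return GESTURE_COMMANDS.get((finger_count, direction))
```

Because only the finger count and direction are needed, the hand never has to be fully reconstructed, which is what lets the abstract's coarse-resolution hardware suffice.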
20140253428 | GESTURE RECOGNITION SYSTEM OPERABILITY VERIFICATION - A gesture recognition system and method verifies the operability of the gesture system by providing a test target that is configured to generate a test stimulus that is recognizable by the gesture recognition system. The test stimulus from the test target is received and processed in the gesture recognition system to generate a test response. The processor verifies that the test response corresponds to the test stimulus. | 09-11-2014 |
20140253429 | VISUAL LANGUAGE FOR HUMAN COMPUTER INTERFACES - Embodiments of the invention recognize human visual gestures, as captured by image and video sensors, to develop a visual language for a variety of human computer interfaces. One embodiment provides a method for recognizing a hand gesture positioned by a user hand. The method includes steps of capturing a digital color image of a user hand against a background, applying a general parametric model to the digital color image of the user hand to generate a specific parametric template of the user hand, receiving a second digital image of the user hand positioned to represent a hand gesture, detecting a hand contour of the hand gesture based at least in part on the specific parametric template of the user hand, and recognizing the hand gesture based at least in part on the detected hand contour. Other embodiments include recognizing hand gestures, facial gestures or body gestures captured in a video. | 09-11-2014 |
20140253430 | PROVIDING EVENTS RESPONSIVE TO SPATIAL GESTURES - Systems and methods for processing spatial gestures are provided. In some aspects, depth data is received from one or more depth cameras. Positions of a plurality of body parts of a person in a field of view of the one or more depth cameras are determined based on the received depth data. A spatial gesture made by the person is determined based on the positions of the plurality of body parts. The spatial gesture is translated into an event. The event is provided, via a two-way socket, to a web client for executing a function in response to the event. | 09-11-2014 |
20140253431 | PROVIDING A GESTURE-BASED INTERFACE - Systems and methods for providing a gesture-based interface are provided. In some aspects, depth data indicative of a person interacting with one or more display devices is received. The one or more display devices display a plurality of columns. Each of the plurality of columns includes a plurality of icons. A column corresponding to a position of the person with respect to the one or more display devices is determined using the depth data. The column is from among the plurality of columns displayed at the one or more display devices. A signal for expanding a predetermined icon in the column corresponding to the position of the person with respect to the one or more display devices is provided. | 09-11-2014 |
20140253432 | SENSOR-MONITORED, 3D, INTERACTIVE HOLOGRAPHIC FREESPACE CONTROL UNIT (HFCU) AND ASSOCIATED COMPONENTS - A sensor/camera-monitored, virtual image or holographic-type display device is provided and called a SENSOR-MONITORED, 3D, INTERACTIVE HOLOGRAPHIC FREESPACE CONTROL UNIT (HFCU). This invention allows user interactions with the virtual images or holograms without contacting any physical surface; it is achieved via the use of a visually bounded freespace both with and without holographic-type assistance. The 3D Holographic-type Freespace Control Unit (HFCU) and the Gesture-Controlled 3D Interface Freespace (GCIF) are implemented to produce external and internal commands. The built hardware of the present invention includes concave and convex mirror slices at the size, curvature(s), repetition, and locations needed to create the desired holograms/virtual images; optical real-object generation pieces (projectors, digital screens, other mediums as desired, etc.) placed to create the associated virtual images/holograms; sensor(s) for monitoring holographic-type spaces and reporting data; and the computer and software pieces/code used to analyze the collected data and execute further commands as directed. | 09-11-2014 |
20140253433 | IMAGE PROJECTION APPARATUS, SYSTEM, AND IMAGE PROJECTION METHOD - An image projection apparatus includes a projection unit configured to project an image onto a projection target; a recognition unit configured to recognize an instruction action for the image being projected by the projection unit; a storage control unit configured to store correspondence information in which multiple instruction actions are associated with image output controls, respectively, for each type of image in a storage device; a determination unit configured to determine, on the basis of the type of the image being projected by the projection unit, correspondence information for image output control; and a projection control unit configured to perform, on the basis of the correspondence information determined by the determination unit and the instruction action recognized by the recognition unit, image projection control corresponding to the image output control that is associated with the instruction action. | 09-11-2014 |
20140253434 | METHOD AND SYSTEM FOR A NEW-ERA ELECTRONIC BOOK - Different methods and systems for a new-era electronic book are disclosed. In one embodiment, the content of the book can be presented by an electronic book device. The device includes a display to present the content to a user; a sensor to sense the user; and a processor. The processor can become aware of an attribute of the user, and change a portion of the content of the electronic book to be presented accordingly. In one embodiment, the attribute is not a behavior of the user. For example, the attribute can be a location of the user. The book can include a story and a non-story theme. The story theme can convey a concept, and the non-story theme can include scenes, scripts and/or props. The portion changed can be the story or the non-story theme, or both. | 09-11-2014 |
20140253435 | IN-AIR ULTRASONIC RANGEFINDING AND ANGLE ESTIMATION - An apparatus for determining location of a moveable object in relation to an input device includes an array of one or more piezoelectric micromachined ultrasonic transducer (pMUT) elements and a processor. The array is formed from a common substrate. The one or more pMUT elements include one or more transmitters and one or more receivers. The processor is configured to determine a location of a moveable object in relation to an input device using sound waves that are emitted from the one or more transmitters, reflected from the moveable object, and received by the one or more receivers. | 09-11-2014 |
20140253436 | Viewpoint Change on a Display Device Based on Movement of the Device - Embodiments of the disclosed technology comprise a handheld display device with built-in accelerometer and, in some embodiments, compass. The display of a human figure is changed based on a change in viewpoint/orientation of the device. That is, upon detecting a change in viewpoint (e.g., viewing angle, tilt, roll, or pitch of the device), the image of the person changes. This may be used with a still picture of a person, such as for the sale of clothing, or in conjunction with moving images, such as for a sports or exercise instructional video. | 09-11-2014 |
20140253437 | Automatic Text Scrolling On A Display Device - A see-through head-mounted display (HMD) device, e.g., in the form of glasses, provides a view of an augmented reality image including text, such as in an electronic book or magazine, word processing document, email, karaoke, teleprompter or other public speaking assistance application. The presentation of text and/or graphics can be adjusted based on sensor inputs indicating a gaze direction, focal distance and/or biological metric of the user. A current state of the text can be bookmarked when the user looks away from the image and subsequently resumed from the bookmarked state. A forward facing camera can adjust the text if a real world object passes in front of it, or adjust the appearance of the text based on a color or pattern of a real world background object. In a public speaking or karaoke application, information can be displayed regarding a level of interest of the audience and names of audience members. | 09-11-2014 |
20140253438 | INPUT COMMAND BASED ON HAND GESTURE - Examples disclose a device with a sensor to detect a hand gesture at a location of the chassis that does not include an input component, and to execute an input command on the device based on the hand gesture if the gesture is detected at that location. | 09-11-2014 |
20140267000 | Systems and Methods for Automatically Entering Symbols into a String of Symbols Based on an Image of an Object - In one embodiment, a method includes, in connection with a user entering a string of symbols into a graphical user interface (GUI) of a mobile computing device, capturing an image of an object with a sensor on the mobile computing device, automatically identifying the object from the image, automatically determining one or more symbols associated with the object as identified from the image, and automatically entering the symbols into the string of symbols. | 09-18-2014 |
20140267001 | TECHNIQUES FOR AUTOMATED EVALUATION OF 3D VISUAL CONTENT - Various embodiments are generally directed to automatically observing reactions of viewers of 2D and 3D versions of the same visual content, and comparing aspects of those observations for indications of fatigue in the viewing of the 3D version. An apparatus includes a processor element; and logic to compare a first ROI (region of interest) data associated with a first viewer of a first visual presentation of a 2D (two-dimensional) version of visual content to a second ROI data associated with a second viewer of a second visual presentation of a 3D (three-dimensional) version of the visual content; and identify an instance of a statistically significant difference in a region of interest between the first ROI data and the second ROI data for at least one frame of the visual content indicative of an instance of viewing fatigue of the second viewer. Other embodiments are described and claimed. | 09-18-2014 |
20140267002 | PROXIMITY-BASED CONTROL OF MEDIA DEVICES FOR MEDIA PRESENTATIONS - Embodiments relate generally to electrical/electronic hardware, computer software, wired and wireless network communications, portable, wearable, and stationary media devices. RF transceivers and/or audio system in each media device may be used to wirelessly communicate between media devices and allow configuration and other data to be wirelessly transmitted from one media device to another media device. The proximity detection system may be configured to detect a presence of a user or multiple users and upon detecting presence, access content on a user device, and record the content while also playing back the content on the media device. One or more user devices in proximity of the media device post detection may wirelessly communicate with the media device and the media device may orchestrate handling of content from those devices or from a wirelessly accessible location such as the Cloud or Internet. | 09-18-2014 |
20140267003 | WIRELESS CONTROLLER TO NAVIGATE AND ACTIVATE SCREENS ON A MEDICAL DEVICE - A system provides non-contact communication between a controller and a medical device. The control signal may be a wirelessly transmitted control signal, such as a wireless radiofrequency signal and/or an optical or acoustic signal, for example. An interface device is provided that may be a remote controller or switch that may be used by a user, such as a health care practitioner (HCP), in connection with navigating and activating screens of a dialysis machine during a dialysis treatment without requiring the HCP to physically contact the dialysis machine. With the described system, the HCP does not need to re-glove each time a change is made to an on-going dialysis treatment when interfacing with a graphical display of the dialysis machine. | 09-18-2014 |
20140267004 | User Adjustable Gesture Space - A method for adjusting an active area of a sensor's field of view by recognizing a touch-less adjust gesture. The method includes receiving data from a sensor having a field of view. The method also includes performing at least one gesture recognition operation upon receiving data from the sensor. The method additionally includes recognizing an adjust gesture by a user. The adjust gesture is a touch-less gesture performed in the field of view by the user to adjust the active area of the field of view. The method further includes adjusting the active area in response to recognizing the adjust gesture by the user. | 09-18-2014 |
20140267005 | EYE PIECE FOR AUGMENTED AND VIRTUAL REALITY - A wearable computing device comprises one or more eye pieces, each of which further comprises a flexible frame surrounding a display screen and tactile elements arranged on the perimeter of the display screen. The tactile elements provide tactile feedback to the user that is synchronous with the display on the display screen. A detection system is also included in the flexible frame to monitor the movements of a wearer's eyes and eye sockets and to execute various tasks in response to the detected movements. A visual cortex thought detector, also coupled to the wearable computing device, obtains information regarding the wearer's thoughts and manipulates a display on the display screen based on the obtained information. | 09-18-2014 |
20140267006 | AUTOMATIC DEVICE DISPLAY ORIENTATION DETECTION - The present disclosure provides techniques for automatically changing device display orientation. A computing device includes a display and an inertial measurement unit (IMU) to detect changes in computing device orientation. The computing device also includes a driver to change an orientation of the display when a change in orientation of the computing device is detected by the IMU. A camera can be used to track user eye position in response to detection of a change in orientation of the computing device by the IMU. | 09-18-2014 |
20140267007 | Interaction Detection Using Structured Light Images - An apparatus and method are provided to determine the occurrence and location of an interaction with an interface, particularly with an image of a user interface that may be projected or otherwise produced on a surface. The apparatus uses one or more single-element sensors (such as a photodiode) to sense and capture light readings of a scene, the readings corresponding to a plurality of structured light images injected within the presentation of the interface. The readings are compared to a baseline set of readings to determine the occurrence and location of an interaction event by an obstacle (e.g., a finger or a stylus), such as a touch event or a movement of the obstacle. | 09-18-2014 |
20140267008 | GESTURE-BASED LOAD CONTROL - A load control system may include load control devices for controlling an amount of power provided to an electrical load. The load control devices may be capable of controlling the amount of power provided to the electrical load based on control instructions received from a gesture-based control device. The gesture-based control device may identify gestures performed by a user for controlling a load control device and provide control instructions to the load control device based on the identified gestures. The gestures may be identified based on images received from a motion capture device. A gesture may be associated with a scene that includes a configuration of one or more load control devices in a load control system. The user may perform one or more gestures to program the gesture-based control device. | 09-18-2014 |
20140267009 | AUTHENTICATING A USER USING HAND GESTURE - The present document describes a method for authenticating a user into a system using gestures. The user may draw the gesture on a touch-sensitive device (e.g., a touchpad), or make the gesture in the air in front of a camera. In the touchpad embodiment, the trajectory defined by the gesture is received ready-made from the touchpad. In the camera embodiment, the trajectory is built by analyzing the images of an image stream to find a hand (or another subject, i.e., a meta-subject). The trajectory may then be built by monitoring the change of position of the hand in the succession of images. The trajectory is analyzed to determine the key-code defined by the gesture, and to determine whether or not it is the authenticated user who is performing the gesture (as opposed to an intruder) based on the speed and the distance between the trajectory and a straight line. | 09-18-2014 |
20140267010 | System and Method for Indicating a Presence of Supplemental Information in Augmented Reality - A method and system are provided for indicating a presence of supplemental information in augmented reality to a user. The method includes capturing a field of view of a camera, obtaining supplemental information for at least one object in the captured field of view, displaying the captured field of view on a display, and tracking a point of regard of the user. The point of regard is indicative of an area on the display at which the gaze of the user is focused. The method also includes, for each object associated with supplemental information, displaying, overlaid on the captured field of view, a respective indicator that the supplemental information is associated with the object if one or more criteria are satisfied. The one or more criteria are based on at least a proximity between an image of the object in the captured field of view and the point of regard. | 09-18-2014 |
20140267011 | MOBILE DEVICE EVENT CONTROL WITH DIGITAL IMAGES - In one exemplary embodiment, a method includes receiving a digital image from a mobile device. A first element of the digital image and a second element of the digital image are identified. The first element of the digital image includes user-selected content. The second element includes a depiction of a user action. An event trigger identified by the first element and the second element is determined. The event trigger includes a mobile device operating system command. A mobile device command is generated according to the event trigger. The mobile device command is communicated to a mobile operating system. | 09-18-2014 |
20140267012 | VISUAL GESTURES - Visual gestures in a device allow a user to select and activate features in a display of the device. A visual reference on a physical object is identified. A visualization of a virtual object engaged with an image of the physical object is generated in a display of a device. The virtual object corresponds to the visual reference. A rendering of the visualization of the virtual object is based on a position of the display relative to the visual reference. A focus area in the display and a feature of the object are determined. A state of the feature is changed when the feature is in the focus area of the display. | 09-18-2014 |
20140267013 | USER INTERFACE DEVICE PROVIDED WITH SURFACE HAPTIC SENSATIONS - A user interface device includes a housing, a user input element supported by the housing, and a haptic output device supported by the housing. The haptic output device is configured to generate a haptic effect at a surface of the user interface device. The surface is part of the housing and/or the user input element. A processor is disposed within the housing. The processor is configured to receive an input command from the user input element, communicate the input command to a host computer, receive an output command from the host computer, and output a haptic signal based on the output command to the haptic output device to generate the haptic effect at the surface. | 09-18-2014 |
20140267014 | Delivery System - A message delivery system includes at least one display portion configured to deliver a predetermined message from a vehicle to other motorists and pedestrians. The message includes positive apologetic messages as well as reasonable requests that scroll in bright neon colored lights in real time and may be displayed so as to diffuse volatile situations and indicate anticipated maneuvers by the vehicle. The message delivery system includes an illuminated display portion, such as, a portable display and a mounted display, that prominently display from a vehicle. The portable display is configured to be gripped and extended from the vehicle for other motorists to view. A message may be entered in the portable display or a control portion, whereby the same message displays in the mounted display also. The control portion also serves to enter a message, which then relays to either, or both displays. | 09-18-2014 |
20140267015 | NON-VOLATILE DISPLAY ACCESSORY CONTROLLED AND POWERED BY A MOBILE DEVICE - A mobile device accessory includes an integrated non-volatile display (NVD) that uses substantially no power in order to maintain an image displayed thereon. A mobile device connected to the accessory controls the NVD by providing both image data for display on the NVD and power for powering of the NVD. In response to receiving a user-input command to display an image on the NVD, the mobile device transiently provides power to the accessory. While power is provided, the mobile device transmits the data for the image to the accessory. Once the image is displayed on the NVD, the mobile device substantially withdraws power provided to the accessory. Upon receiving a user-input command to update an image on the NVD, the mobile device again transiently provides power to the accessory for a limited period of time sufficient to provide and display the updated image on the NVD. | 09-18-2014 |
20140267016 | DIGITAL INTERFACE MEDIA - A device means which provides a substantially planar writing surface that incorporates the same dimensions, dimensional relationships, and aspect ratios as those used by computer monitors, video display screens and other electronic media. The current invention has components that provide for a direct relationship to the header, margins, composition areas, menus, status bars, and other elements used in many software programs. The translucent properties of the current invention provide the capability for the selective viewing of images, graphics, and data on the front surface, the back surface, or a combination of both surfaces; and, to selectively display elements on the front surface, back surface, or combined surfaces through the manipulation of the light source, or sources, and wavelength(s) of light source(s) utilized. | 09-18-2014 |
20140267017 | MULTI-USER DISPLAY SYSTEMS AND METHODS - Image display systems and methods are provided. A plurality of pixel groups, each including a plurality of independent pixels, forms a display. A plurality of lenses … | 09-18-2014 |
20140267018 | SYSTEM AND METHOD OF GESTURE INTERPRETATION BY BODY PART TRACKING - A system and method of using an imager to capture a series of images of a body part, and of using a processor to interpret a gesture or movement of the body part detected in the series of images as an instruction to execute a function on a device. A detected movement may be interpreted as insignificant or unintended and may be discounted as unintended noise or otherwise ignored. | 09-18-2014 |
20140267019 | CONTINUOUS DIRECTIONAL INPUT METHOD WITH RELATED SYSTEM AND APPARATUS - A method for continuous directional input may include using a processor and memory to track parameters of a continuous spatial trace of a parametric process, and detect positions along the continuous spatial trace that correspond to selectable directional input events and subdivide the continuous spatial trace into trace segments. The processor and memory may calculate directional characteristics in positions of the selectable directional input events at ends of each trace segment, determine input indexes corresponding to the directional characteristics of each trace segment, and convert a sequence of indexes into assigned input values. | 09-18-2014 |
20140267020 | WRIST TERMINAL DEVICE, COMMUNICATIONS TERMINAL DEVICE, TERMINAL DEVICE, DISPLAY CONTROL METHOD OF TERMINAL DEVICE, AND STORAGE MEDIUM STORING DISPLAY CONTROL PROGRAM - A wrist terminal device is configured to be worn around a wrist of a user. The device includes a motion state detecting unit, a receiving unit, a display unit and a motion-responsive display control unit. The motion state detecting unit detects a motion state of the user. The receiving unit receives mail arrival notification and an e-mail from a communications terminal device. The display unit displays the e-mail received by the receiving unit. The motion-responsive display control unit controls the display unit to display the received e-mail in a display style according to the motion state of the user detected by the motion state detecting unit in response to the receiving unit receiving the mail arrival notification and the e-mail. | 09-18-2014 |
20140267021 | DISPLAY CONTROL METHOD AND APPARATUS - A screen display control method is provided for controlling screen display of an electronic device more efficiently. A screen display control method includes determining a resolution of a camera, recognizing a face of a user using the camera operating at the determined resolution, and controlling screen display of the electronic device based on the face. | 09-18-2014 |
20140267022 | INPUT CONTROL METHOD AND ELECTRONIC DEVICE SUPPORTING THE SAME - An input control method and an electronic device supporting the same are provided. The method includes activating a plurality of input signal collection units supporting a multi-modal input, collecting at least one input signal from the input signal collection units, and outputting feedback information corresponding to the at least one input signal. | 09-18-2014 |
20140267023 | METHOD FOR DISPLAYING DYNAMIC IMAGE AND ELECTRONIC DEVICE THEREFOR - An apparatus and method for displaying images dynamically in an electronic device based on current state information of the electronic device are described. One method for dynamically displaying images in an electronic device includes determining an image conversion weight for each of the images; determining a moving velocity of the electronic device; and displaying each of the images based on the image conversion weight and the moving velocity. | 09-18-2014 |
20140267024 | COMPUTING INTERFACE SYSTEM - Computing interface systems and methods are disclosed. Some implementations include a first accelerometer attached to a first fastening article that is capable of holding the first accelerometer in place on a portion of a thumb of a user. Some implementations may also include a second accelerometer attached to a second fastening article that is capable of holding the second accelerometer in place on a portion of a wrist of a user. Some implementations may additionally or alternatively include magnetometers and/or gyroscopes attached to the first and second fastening articles. Some implementations may also include a processing device configured to receive measurements from the accelerometers, magnetometers, and/or gyroscopes and identify, based on the measurements, symbols associated with motions of a user's hand and/or the orientation of the hand. Some implementations may allow a user to control a cursor in a three dimensional virtual space and interact with objects in that space. | 09-18-2014 |
20140267025 | METHOD AND APPARATUS FOR OPERATING SENSORS OF USER DEVICE - A method of operating a plurality of sensors of a user device includes detecting input using a user input means, measuring a depth value between the user input means and a screen of the user device, activating a gesture recognition function by selectively driving one or more of the plurality of sensors based on the measured depth value, and recognizing a user gesture based on pieces of information collected by the selectively driven sensors. | 09-18-2014 |
20140267026 | HANDHELD DOCUMENT READING DEVICE WITH AUXILIARY DISPLAY - A method to display an electronic document includes receiving an electronic document, displaying a current page of the electronic document on a screen of a handheld device operated by a user, and displaying one or more adjacent pages of the electronic document on an external display. The method may also include changing the current page on the screen of the handheld device and the adjacent pages on the external display in response to user input. In some embodiments, the method includes pinning or unpinning one or more selected page(s) on the external display in response to user input. A corresponding apparatus, system, and computer readable medium are also disclosed herein. | 09-18-2014 |
20140267027 | Vision Protection Method and System Thereof - The vision protection method and system thereof are provided to ensure that a viewer rests his/her eyes after viewing an electronic device for a certain period, wherein the eyesight protection method includes the steps of initially counting down a predetermined working time of a working mode, switching the working mode to a resting mode to temporarily halt the current working mode of the electronic device, and resuming the operation of the working mode that the viewer was previously working on. In that manner, the viewer is compelled to rest his/her eyes after every certain period. | 09-18-2014 |
20140267028 | IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, PROGRAM, AND IMAGE PROCESSING SYSTEM - There is provided an image processing apparatus including: an input image acquisition unit for obtaining an input image generated by taking an image of a real space; an image recognition unit for recognizing, when a first user-input representing a start of manipulation is detected, a manipulator used for manipulating a virtual object, wherein the manipulator appears in the input image; a calculation unit for calculating, according to a result of the recognition of the manipulator provided by the image recognition unit, a position on a screen of a display device at which the virtual object is to be displayed; a display control unit for displaying the virtual object at the position of the screen of the display device calculated by the calculation unit; and a communication unit for transmitting, when the first user-input is detected, a first notification signal for notifying the start of manipulation to another apparatus displaying the same virtual object. | 09-18-2014 |
20140285416 | Short Range Wireless Powered Ring for User Interaction and Sensing - In general, the short range wireless powered ring described herein pertains to a finger-worn ring. In one embodiment the ring has a small interaction area towards the palm of the hand that employs sensors to sense user input and can interpret this input and other data. For example, the ring can interpret user input as scroll and select input actions. The ring communicates user interactions and other data wirelessly using a low-power wireless solution. The ring contains a coil and other circuitry for energy harvesting from short range wireless enabled devices such as, for example, NFC enabled phones, while users interact with their devices. A built in rechargeable battery is used to store the scavenged energy. The ring may also contain physiological and inertial sensors. The ring can provide a readily available input device of small form factor that has an easily accessible energy source for ease of use. | 09-25-2014 |
20140285417 | Method and Apparatus for Causing a Deformation Representation - Disclosed is a method comprising determining a deformation attribute of an apparatus based, at least in part, on at least one operational parameter, receiving an indication of an acceleration input that corresponds to the deformation attribute, and causing a deformation representation of the apparatus, the deformation representation being indicative of the deformation attribute in relation to the acceleration input. | 09-25-2014 |
20140285418 | METHOD AND APPARATUS FOR ENLARGING A DISPLAY AREA - An apparatus may include circuitry configured to determine, as a first determination result based on an output of a sensor, that an instruction object is within a predetermined distance of a surface of a display. The circuitry may acquire an image of an area surrounding the apparatus. The circuitry may detect a presence of a facial feature in the captured image. The circuitry may calculate a line-of-sight angle based on the detected facial feature. The circuitry may control the display to enlarge an area of a displayed interface based on the first determination result and the line-of-sight calculation. | 09-25-2014 |
20140285419 | DISPLAY APPARATUS EQUIPPED WITH MOTION DETECTION SENSOR - Disclosed is a display apparatus which is driven by a motion detection sensor, and more particularly, a display apparatus in which power, volume, channel, or the like are controlled by a motion detection sensor. The display apparatus of the present invention includes a display part which displays an image; a support stand which supports the display part; a power source which supplies power to the display part; a motion detection sensor which is installed at an edge surface of the display part so as to detect an operation signal of an object in a contactless manner; and a main control part which controls the display part based on a detection signal from the motion detection sensor, wherein the motion detection sensor is installed so as to be oriented toward an outer surface of the display part, thereby detecting a motion signal at the outer surface of the display part. According to the present invention, since a frame part such as a bezel is not needed, the display screen can be expanded by the width of the bezel part. Further, since the outer size of the display part can be reduced by about 5 to 10%, corresponding to the width of the bezel part, the apparatus is advantageous in terms of space usage. | 09-25-2014 |
20140285420 | IMAGING DEVICE, DISPLAYING DEVICE, MOBILE TERMINAL DEVICE, AND CAMERA MODULE - An imaging device includes: a radiation unit configured to radiate light with a peak of a specific wavelength; a light receiver configured to have first sensitivity to a first wavelength longer than the specific wavelength, the first sensitivity being lower than second sensitivity to a second wavelength shorter than the specific wavelength; and a filter configured to block the second wavelength. | 09-25-2014 |
20140285421 | DATA COMMUNICATION METHOD VIA TOUCH SURFACE - A data communication method enabling wireless sharing of files or data between electronic devices that can store and/or generate information by means of a touch surface (capacitive or multi-touch), display units (LCD, LED, etc.), a photo sensor, and a frame having negative electrical conductivity. | 09-25-2014 |
20140285422 | APPARATUS AND METHOD OF CONTROLLING SCREENS IN A DEVICE - An apparatus and a method of controlling screens in a device are provided. The apparatus includes a display configured to include a first screen and a second screen overlapped over the first screen, and a controller configured to control display of partial information of the first screen hidden by the second screen to be viewable by changing an attribute of the second screen, upon detecting a gesture on the second screen while information is being displayed separately on the first and second screens. | 09-25-2014 |
20140285423 | INFORMATION DISPLAY DEVICE, INFORMATION DISPLAY METHOD, AND STORAGE MEDIUM - While an information display device is initially at a standstill, a CPU stores acceleration responsive to gravitational force, acquired by an acceleration sensor, into a RAM. The CPU then acquires, at fixed timing, acceleration measured by the acceleration sensor and resulting from motion of the information display device. The CPU accumulates the acceleration acquired at the fixed timing within each prescribed period. Then, the CPU makes the accumulated acceleration agree with the acceleration responsive to gravitational force stored in the RAM, thereby correcting a posture parameter. Based on the corrected posture parameter, the CPU determines whether a temporal image is to be displayed on a display unit. | 09-25-2014 |
20140285424 | USER INTERFACE SYSTEM - One embodiment of the user interface system comprises: a volume of fluid; a tactile layer; a retaining wall substantially impermeable to the fluid; a permeable layer; a displacement device; and a touch sensor. The tactile layer, with a back surface, defines a second region, operable between: a retracted state, wherein the second region is substantially flush with a first region; and an expanded state, wherein the second region is substantially proud of the first region. The permeable layer, joined to the back surface of the first region, includes a plurality of fluid ports that communicate a portion of the fluid through the permeable layer to the back surface of the second region. The displacement device directs the fluid through the fluid ports to the back surface to transition the second region from the retracted state to the expanded state. The touch sensor detects a user touch on the tactile layer. | 09-25-2014 |
20140285425 | SHAPING DEVICE - According to an embodiment, a shaping device includes an acquiring unit, an extracting unit, first and second calculators, a determining unit, a shaping unit, and a display unit. The acquiring unit is configured to acquire strokes handwritten by a user. The extracting unit is configured to extract multiple combinations of strokes. The first calculator is configured to calculate a first likelihood representing a probability that each combination relates to a target graphic. The second calculator is configured to calculate a second likelihood representing a probability that each combination relates to an incomplete shape. The determining unit is configured to determine whether there is a first combination having a first likelihood not less than a first threshold and a second likelihood not more than a second threshold. The shaping unit is configured to shape the strokes into the target graphic. The display unit is configured to display a result of shaping. | 09-25-2014 |
20140285426 | SIGNAL PROCESSING DEVICE AND SIGNAL PROCESSING METHOD - A signal processing device includes: a memory; and a processor coupled to the memory and configured to: detect a repetition of a feature value from a time series of the feature value corresponding to an input signal, and change a recognition condition for recognizing a class for the feature value when the repetition of the feature value is detected. | 09-25-2014 |
20140285427 | SIGNAL PROCESSING DEVICE AND SIGNAL PROCESSING METHOD - A signal processing device includes: a memory; and a processor coupled to the memory and configured to: detect a second feature value relating to a first feature value recognized to satisfy a recognition condition, from a second time series prior to a first time series of the first feature value in a time series of a feature value corresponding to an input signal, and change the recognition condition so that the second feature value is recognized as a class for recognizing the first feature value. | 09-25-2014 |
20140285428 | RESOURCE-RESPONSIVE MOTION CAPTURE - The technology disclosed relates to operating a motion-capture system responsive to available computational resources. In particular, it relates to assessing a level of image acquisition and image-analysis resources available using benchmarking of system components. In response, one or more image acquisition parameters and/or image-analysis parameters are adjusted. Acquisition and/or analysis of image data are then made compliant with the adjusted image acquisition parameters and/or image-analysis parameters. In some implementations, image acquisition parameters include frame resolution and frame capture rate and image-analysis parameters include analysis algorithm and analysis density. | 09-25-2014 |
20140285429 | Light Management for Image and Data Control - A light control and display technology applicable to light redirection and projection with the capacity, in a number of embodiments modified for particular applications, to produce managed light, including advanced images. Applications include miniature to very large scale video displays, optical data processing, 3-dimensional imaging, and lens-less vision enhancement for poor night-driving vision, cataracts and macular degeneration. | 09-25-2014 |
20140285430 | INFORMATION PROCESSING METHOD AND ELECTRONIC DEVICE - The present disclosure provides an information processing method and an electronic device. The information processing method is applied in an electronic device comprising or being connected to a display unit. The method comprises: acquiring a first target object containing a first display image and a second display image different from the first display image; displaying the first display image based on a first display parameter and displaying the second display image based on a second display parameter different from the first display parameter, such that a first user using the electronic device perceives a first distance value larger than zero, the first distance value indicating a distance between a plane where a display position of the first target object is located and a plane where the display unit is located; detecting user location information corresponding to the first user; and determining a second distance value between the first user and the first target object based on the user location information. In this way, the distance between the user and the first target object as perceived by the user can be determined more accurately. | 09-25-2014 |
20140285431 | METHOD AND APPARATUS FOR PROCESSING AN IMAGE BASED ON DETECTED INFORMATION - A method and apparatus for processing an image based on detected information are provided. The method includes obtaining property information related to the image; obtaining ambient environment information of a display device for displaying the image; and processing the image based on the property information and the ambient environment information. | 09-25-2014 |
20140285432 | Operating Room Environment - A system that serves to facilitate communication between members of an operating team in an operating room. Among other functions, the system supports operation of different devices and expression of different commands by gazing at a proper area on the monitor. | 09-25-2014 |
20140285433 | METHOD FOR CONTROLLING PORTABLE DEVICE EQUIPPED WITH FLEXIBLE DISPLAY, AND PORTABLE DEVICE USING THE METHOD - The present disclosure relates to a method for controlling a portable device equipped with a flexible display, and a portable device using the method, and more particularly, to a method for controlling a portable device based on whether an edge portion of a flexible display is gripped, and a portable device using the method. The method includes detecting bending of the flexible display. The method also includes determining whether an edge portion of the flexible display is gripped. The method further includes performing a control operation of the portable device corresponding to the bending when the edge portion is gripped. | 09-25-2014 |
20140285434 | Methods and Systems for Gesture Classification in 3D Pointing Devices - Systems and methods according to the present invention provide the ability for a system to realize when a handheld device is performing a gesture and to execute the associated command. | 09-25-2014 |
20140285435 | MOVEMENT RECOGNITION AS INPUT MECHANISM - The detection of relative motion or orientation between a user and a computing device can be used to control aspects of the device. For example, the computing device can include an imaging element and software for locating positions, shapes, separations, and/or other aspects of a user's facial features relative to the device, such that an orientation of the device relative to the user can be determined. A user then can provide input to the device by performing actions such as tilting the device, moving the user's head, making a facial expression, or otherwise altering an orientation of at least one aspect of the user with respect to the device. Such an approach can be used in addition to, or as an alternative to, conventional input devices such as keypads and touch screens. | 09-25-2014 |
20140285436 | Vision Protection Method and System Thereof - A vision protection method is provided to ensure that a viewer rests his/her eyes after viewing an electronic device for a certain period, wherein the vision protection method includes the steps of detecting at least one of the viewer's eye activities and a working parameter of the electronic device while the electronic device is in a working mode and the viewer is working on the current work displayed by the electronic device; switching the working mode of the electronic device to a resting mode when an abnormal eye activity of the viewer is detected; and switching the electronic device from the resting mode back to the working mode to resume display of the current work. Thus, the viewer is compelled to rest his/her eyes after every such period. | 09-25-2014 |
20140292635 | EXPECTED USER RESPONSE - An apparatus comprising at least one processor; and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to perform at least the following: detect an indication that a user is available for interaction, responsive to detecting the indication, provide a haptic output pattern associated with an expected user response, detect a user input, wherein the user input is responsive to the haptic output pattern, compare the expected user response and the user input, and based on the comparison, perform an action. | 10-02-2014 |
20140292636 | Head-Worn Infrared-Based Mobile User-Interface - Methods and apparatuses for device user interfaces are disclosed. In one example, a head-worn apparatus includes a processor, a wireless communications transceiver, and an infrared camera configured to detect an infrared light associated with a movement of a user hand and provide an infrared camera output. The head-worn apparatus includes a sensor unit configured to detect motion of a user head and provide a sensor unit output. The head-worn apparatus further includes a memory storing an application configured to receive the infrared camera output and the sensor unit output to identify a user action from the movement of the user hand. | 10-02-2014 |
20140292637 | METHOD FOR ADJUSTING HEAD-MOUNTED DISPLAY ADAPTIVELY AND HEAD-MOUNTED DISPLAY - A method for adjusting a head-mounted display adaptively and a head-mounted display are provided. The method includes the following steps. Eye state parameters of a user wearing the head-mounted display are sensed by using a first sensing unit, and whether the user's eyes are experiencing discomfort is determined according to the eye state parameters. If so, environmental parameters of the user's location are sensed by using a second sensing unit. The eye state parameters and the environmental parameters are analyzed synthetically, such that the projection display setting of the head-mounted display can be adjusted adaptively. | 10-02-2014 |
20140292638 | COMPUTING SYSTEM AND METHOD FOR AUTOMATICALLY DETECTING FATIGUE STATUS OF USER - Disclosed herein are a computing system and method for automatically detecting the fatigue status of a user. The computing system includes a display unit, a storage unit, an image-capturing unit and a processing unit. The storage unit is configured to store an operating system and an application. The image-capturing unit is configured to capture a facial image of the user. The processing unit is configured to execute the application in the operating system. The application is configured to analyze the change of the color of the user's eye in the facial image so as to determine whether the user is in a fatigue state. When it is determined that the user is in the fatigue state, the application prompts the display unit to present a warning window. | 10-02-2014 |
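The abstract above does not specify how the change in eye color maps to a fatigue decision; a toy sketch, assuming fatigue is flagged when eye redness rises past a fixed threshold relative to a per-user baseline (the names and the threshold value are illustrative, not from the application):

```python
def is_fatigued(baseline_redness, current_redness, threshold=0.15):
    """Flag fatigue when the eye's redness (e.g. the mean red-channel
    fraction in the eye region of the captured facial image) has
    increased past a threshold relative to the user's rested baseline.
    Purely illustrative heuristic, not the patented method."""
    return (current_redness - baseline_redness) > threshold

# rested baseline 0.20; a later frame measures 0.40 -> present warning
warn = is_fatigued(0.20, 0.40)
```

In practice the application would present the warning window via the display unit when this check returns true.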
20140292639 | MULTI-DISTANCE, MULTI-MODAL NATURAL USER INTERACTION WITH COMPUTING DEVICES - Systems and methods may provide for receiving a short range signal from a sensor that is collocated with a short range display and using the short range signal to detect a user interaction. Additionally, a display response may be controlled with respect to a long range display based on the user interaction. In one example, the user interaction includes one or more of an eye gaze, a hand gesture, a face gesture, a head position or a voice command, that indicates one or more of a switch between the short range display and the long range display, a drag and drop operation, a highlight operation, a click operation or a typing operation. | 10-02-2014 |
20140292640 | COMPUTER READABLE MEDIUM HAVING PROGRAM RECORDED THEREIN, INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND INFORMATION PROCESSING SYSTEM - An example apparatus, which includes a predetermined input device for outputting operational data indicating operational inputs entered by a player and a display unit that displays predetermined characters in a virtual space, moves a first character in the virtual space based on the operational data, moves a second character in the virtual space based on automatic operating data which is a set of operational data for continuously moving a character, determines whether the first character satisfies a predetermined condition or not, and moves the second character from a state in which the second character satisfies the predetermined condition based on the automatic operating data in response to the first character satisfying the predetermined condition. | 10-02-2014 |
20140292641 | DISPLAY DEVICE AND CONTROL METHOD THEREOF - A display device is disclosed. A display device according to an embodiment of the present specification includes a display unit configured to display visual information, a camera unit configured to capture an image in front of the display device, a sensor unit configured to sense user input applied to the display device, and a control unit configured to control the display device, wherein the control unit detects at least one user from the captured image, maintains display of the visual information and processes received user input when the detected user includes a predetermined master user, the control unit detects at least one user from the captured image, maintains display of the visual information and does not process the received user input when the detected user does not include the predetermined master user, and the control unit deactivates the display unit when no user is detected in the captured image. | 10-02-2014 |
20140292642 | METHOD AND DEVICE FOR DETERMINING AND REPRODUCING VIRTUAL, LOCATION-BASED INFORMATION FOR A REGION OF SPACE - The present invention relates to a method for determining and reproducing virtual, location-based information for a region of space, comprising the steps of: | 10-02-2014 |
20140292643 | DISPLAY APPARATUS AND REMOTE CONTROL APPARATUS FOR CONTROLLING THE DISPLAY APPARATUS - A remote control apparatus which controls a display apparatus is provided. The remote control apparatus includes a body having a roly-poly shape, a sensor which senses a motion of the body, and a controller which transmits a signal which corresponds to a sensed result of the sensor and controls operations of the display apparatus. The controller determines a user interaction according to a sensed result of the sensor, and reads from a storage a control command which corresponds to the user interaction and transmits the read control command to the display apparatus. The body has its center of mass in its lower portion and the lower portion is heavier than its upper portion. | 10-02-2014 |
20140292644 | IMAGE PROCESSING DEVICE AND METHOD - An image processing device includes a processor and a memory which stores a plurality of instructions which, when executed by the processor, cause the processor to execute: detecting a first candidate group and a second candidate group contained in the image, the first candidate group being one or more candidates for a first part of a user, the second candidate group being one or more candidates for a second part of the user; and selecting at least one of the first part in the first candidate group and the second part in the second candidate group, on the basis of the positional relationship of the first part and the second part in the human body. | 10-02-2014 |
20140292645 | DISPLAY CONTROL DEVICE, DISPLAY CONTROL METHOD, AND RECORDING MEDIUM - There is provided a display control device including a display controller configured to place a virtual object within an augmented reality space corresponding to a real space in accordance with a recognition result of a real object shown in an image captured by an imaging part, and an operation acquisition part configured to acquire a user operation. When the user operation is a first operation, the display controller causes the virtual object to move within the augmented reality space. | 10-02-2014 |
20140292646 | INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD - According to the present disclosure, there is provided an information processing apparatus, including a first housing, a second housing, having a display section which displays information, in which the display section is reversed between a first position and a second position, a connection section which rotatably connects the second housing with respect to the first housing, a posture detection section which detects any of the four postures of a first posture, a second posture, a third posture, and a fourth posture, and a display control section which controls a display state of the display section in accordance with a detection result of the posture detection section. | 10-02-2014 |
20140292647 | INTERACTIVE PROJECTOR - An interactive projector includes: a base placed on a projection surface; a leg configured to rise, tilted, from the base toward the projection surface; a projecting unit, mounted on the projection-surface side of the leg, configured to project a first image; a reflector, provided at a free-end portion of the leg, configured to reflect the image projected from the projecting unit towards the projection surface; and an image pickup unit, mounted on the reflector, configured to photograph the projection surface and output a second image to an electronic device that outputs, to the projecting unit, the first image generated based on the second image. | 10-02-2014 |
20140292648 | INFORMATION OPERATION DISPLAY SYSTEM, DISPLAY PROGRAM, AND DISPLAY METHOD - An information operation display system includes a camera, a projector, and an information processing apparatus. The information processing apparatus includes an acquisition unit configured to acquire an image taken by the camera, a measurement unit configured to measure a 3-dimensional coordinate position of an operation object included in the image acquired by the acquisition unit, and a display unit configured to control the projector such that an image indicating a point on which a selection operation is performed by the operation object is displayed on an operation target object according to the 3-dimensional coordinate position of the operation object measured by the measurement unit. | 10-02-2014 |
20140292649 | METHOD AND DEVICE FOR SWITCHING TASKS - Provided are a task switching method capable of rapidly and easily accessing a task of interest by using a button that may receive a touch-based input and a device for executing the task switching method. The device includes a button configured to receive an input; a display configured to display task switching screens; and a processor configured to set a task switching mode in response to the button receiving a first input, and control the display to display the task switching screens in response to the button receiving a second input. | 10-02-2014 |
20140292650 | OPTICAL DETECTION OF BENDING MOTIONS OF A FLEXIBLE DISPLAY - A detection device of a flexible display, the detection device including image sensors configured to capture images, a processor configured to process the images captured by the image sensors, and a memory having instructions stored therein that, when executed by the processor, result in calculation of a bend angle of the flexible display by comparing the images captured at differing times over time. | 10-02-2014 |
20140292651 | Electronic Devices in Local Interactions between Users - In one implementation, a method includes detecting, using a processor, user input through a camera lens. The method further includes determining, using the processor, that an identity of a user, selected from identities of at least two users, is associated with the user input. The method also includes tracking, using the processor, a local interaction between the at least two users based on at least the identity, the user input, and stored rules that govern the local interaction. The tracking can include determining whether the user has complied with the stored rules that govern the local interaction. Furthermore, the local interaction can include a multiplayer game. | 10-02-2014 |
20140292652 | VEHICLE OPERATING DEVICE - A vehicle operating device with favorable operability is provided, the operating device having improved recognition accuracy of gestural operation conducted while holding a steering wheel. | 10-02-2014 |
20140300531 | INDICATOR INPUT DEVICE WITH IMAGE RECOGNITION FUNCTION - An indicator input device with image recognition function is used to control the indicator movement displayed on a host. The device includes a body, a displacement sensing unit, an image recognition unit, and a control unit. The displacement sensing unit detects the body movement. The image recognition unit is installed on the body. The control unit is coupled to the displacement sensing unit and the image recognition unit, and generates an indicator control signal to control the movement of the indicator according to the detection result of the displacement sensing unit, and generates an operation instruction corresponding to a body action according to the image recognition result of the image recognition unit. | 10-09-2014 |
20140300532 | APPARATUS, METHOD AND COMPUTER PROGRAM FOR CONTROLLING A NEAR-EYE DISPLAY - An apparatus, method and computer program where the apparatus comprises: at least one memory configured to store a computer program comprising computer program instructions; and at least one processor configured to execute the computer program instructions to cause the apparatus at least to perform: obtaining, from at least one detector, a detection of at least one bio-signal from at least one user where the user is using a user output device; determining from the at least one obtained bio-signal that movement of the user's head is about to occur; and in response to determining that movement of the user's head is about to occur, enabling the processor to control the output provided by the user output device to coordinate with the movement of the user's head. | 10-09-2014 |
20140300533 | PORTABLE DEVICE PROVIDING A REFLECTION IMAGE AND METHOD OF CONTROLLING THE SAME - According to one embodiment of the present specification, when a notification is generated, a method of controlling a portable device includes the steps of sensing a state of the portable device using a sensor unit; if the portable device is in a first state, displaying content of the notification in a first display area or a second display area of a display unit; and if the portable device is in a second state, displaying the content of the notification in the second display area of the display unit with the left and right of the content reversed. | 10-09-2014 |
20140300534 | INPUT DEVICE OF ELECTRONIC DEVICE AND SETTING METHOD THEREOF - An input device of an electronic device and a setting method thereof are provided. In the setting method, a setting path is displayed on a display unit of the electronic device, and the input device is set according to the setting path depicted by the input device. | 10-09-2014 |
20140300535 | METHOD AND ELECTRONIC DEVICE FOR IMPROVING PERFORMANCE OF NON-CONTACT TYPE RECOGNITION FUNCTION - A method for improving the performance of recognition by increasing the recognition rate includes executing a specific application, acquiring a user image from a camera while the specific application is executed, collecting a candidate group of non-contact type recognition patterns for an execution screen of the specific application by recognizing at least one of a user's face, eye and hand from the user image, determining an effective value from the collected candidate group of non-contact type recognition patterns, and updating a predefined default value by applying the determined effective value to the predefined default value. An electronic device for improving the performance of recognition and other embodiments are also disclosed. | 10-09-2014 |
20140300536 | STEREOSCOPIC IMAGE DISPLAY DEVICE AND EYE-TRACKING METHOD THEREOF - Discussed are a stereoscopic image display device to implement high-speed eye-tracking techniques and an eye-tracking method. The stereoscopic image display in one embodiment includes an image panel to alternately display left-eye and right-eye images, a switchable panel disposed at the front or rear side of the image panel to separate the left-eye and the right-eye images from each other to correspond to the left and right eyes of a viewer, a camera mounted to the image panel to capture an image of the viewer, and a computer system to detect position information of the viewer from the image input and calculate midpoint information between the detected position information by interpolation using the detected position information to update position information of the viewer at a faster drive frequency than a drive frequency of the camera, and to control driving of the switchable panel in response to the updated position information. | 10-09-2014 |
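The midpoint-interpolation idea in the abstract above, updating viewer position faster than the camera's frame rate, can be sketched as follows (the 2-D position tuples and function name are assumptions for illustration, not from the application):

```python
def upsample_positions(p_prev, p_curr):
    """Given viewer eye positions detected in two successive camera
    frames, insert the linearly interpolated midpoint between them,
    so the switchable panel can be driven at twice the camera's
    frame rate. Illustrative sketch only."""
    mid = tuple((a + b) / 2.0 for a, b in zip(p_prev, p_curr))
    return [p_prev, mid, p_curr]

# viewer's eyes drift from x=100 to x=104 between camera frames
positions = upsample_positions((100.0, 50.0), (104.0, 52.0))
```

The interpolated midpoint halves the worst-case staleness of the position estimate without requiring a faster (and more expensive) camera.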
20140300537 | Device Relay Control System and Method - A device relay control synchronizes a primary control display device with one or more other display devices connected over a network. The system operates independently, without the need of specialized hardware, and requires only a network connection. The control facilitates synchronization, for example, between a teacher's manual and a student book in a digitized teaching environment, allowing a teacher to facilitate a page turn event on a student book without diverting attention from the teacher's manual. | 10-09-2014 |
20140300538 | METHOD FOR GAZE TRACKING - A method for gaze tracking is described that achieves high performance while requiring both limited processor engagement and reduced power, making it particularly, but not exclusively, suited to mobile use. The method includes the steps of: obtaining a digital video stream of a face through a camera, wherein eyes or pupils are identified in corresponding boxes in the spatial domain, the size thereof being a function of the face position and orientation relative to said camera, the content of the boxes being the input for the further calculations; transferring the content of the boxes to the frequency domain; applying to the boxes transferred to the frequency domain one or more sparse zones, covering together a fraction of the boxed area transferred to the frequency domain, and a filtering kernel, at least partially overlapping the sparse zones; performing a multiplication between the frequency data within each sparse zone and the kernel, combining the results in a single value for each sparse zone; and repeating the above steps, obtaining for each frame said single value, the fluctuation of the single value being representative of gaze direction changes over time. | 10-09-2014 |
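The frequency-domain pipeline described in this abstract (eye box → FFT → sparse zone × kernel → single value per zone) might be sketched like this with NumPy; the zone mask, kernel, and array sizes are placeholders, not values from the method:

```python
import numpy as np

def sparse_zone_value(eye_box, zone_mask, kernel):
    """Transfer the boxed eye region to the frequency domain, keep
    only the frequencies inside one sparse zone, multiply them by
    the filtering kernel, and combine the products into a single
    scalar whose fluctuation over frames tracks gaze changes."""
    freq = np.fft.fft2(eye_box)
    products = freq * zone_mask * kernel
    return float(np.abs(products.sum()))

rng = np.random.default_rng(0)
eye_box = rng.random((32, 32))      # grayscale eye/pupil box
zone_mask = np.zeros((32, 32))
zone_mask[:4, :4] = 1.0             # one small sparse zone (placeholder)
kernel = np.full((32, 32), 0.5)     # trivial filtering kernel (placeholder)
value = sparse_zone_value(eye_box, zone_mask, kernel)
```

Because only a small fraction of the frequency plane is touched per zone, the per-frame cost stays low, which is consistent with the abstract's emphasis on limited processor engagement.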
20140300539 | GESTURE RECOGNITION USING DEPTH IMAGES - Methods, apparatuses, and articles associated with gesture recognition using depth images are disclosed herein. In various embodiments, an apparatus may include a face detection engine configured to determine whether a face is present in one or more gray images of respective image frames generated by a depth camera, and a hand tracking engine configured to track a hand in one or more depth images generated by the depth camera. The apparatus may further include a feature extraction and gesture inference engine configured to extract features based on results of the tracking by the hand tracking engine, and infer a hand gesture based at least in part on the extracted features. Other embodiments may also be disclosed and claimed. | 10-09-2014 |
20140300540 | Gesture-Based Device - The use of gestures on a user device to interact with and obtain functionality from remote communication hub devices is disclosed. A user device includes an accelerometer or other motion detection device and a wireless communications capability. The device is small enough to be placed conveniently about the person and communicates with a communication hub device to perform context-dependent actions. A user makes a gestural command to the device. The device interprets the gesture and undertakes activity. The gestural sensing device may be a motion switch, a multi-axis accelerometer, a video camera, a variable capacitance device, a magnetic field sensor, an electrical field sensor etc. When combined with business logic running on a remote computer system, the device can be used to implement a range of applications and services. | 10-09-2014 |
20140300541 | PORTABLE DEVICE - A portable device, e.g. a laptop, includes a first part (110), e.g. a base element, and a second part (…) | 10-09-2014 |
20140306874 | NEAR-PLANE SEGMENTATION USING PULSED LIGHT SOURCE - Methods for recognizing gestures within a near-field environment are described. In some embodiments, a mobile device, such as a head-mounted display device (HMD), may capture a first image of an environment while illuminating the environment using an IR light source with a first range (e.g., due to the exponential decay of light intensity) and capture a second image of the environment without illumination. The mobile device may generate a difference image based on the first image and the second image in order to eliminate background noise due to other sources of IR light within the environment (e.g., due to sunlight or artificial light sources). In some cases, object and gesture recognition techniques may be applied to the difference image in order to detect the performance of hand and/or finger gestures by an end user of the mobile device within a near-field environment of the mobile device. | 10-16-2014 |
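The lit-minus-unlit subtraction described in the abstract above can be sketched in a few lines; the pixel values and the zero clamp are illustrative assumptions, not details from the application:

```python
def near_field_difference(lit_frame, unlit_frame):
    """Subtract the frame captured without IR illumination from the
    frame captured with the short-range IR source on, clamping at
    zero. Ambient IR (sunlight, artificial lights) appears in both
    frames and cancels; only nearby objects lit by the device's own
    short-range source remain bright in the difference image."""
    return [[max(a - b, 0) for a, b in zip(row_l, row_u)]
            for row_l, row_u in zip(lit_frame, unlit_frame)]

lit   = [[80, 200], [90, 30]]   # hand near the HMD is bright when lit
unlit = [[75,  40], [85, 25]]   # ambient IR only
diff = near_field_difference(lit, unlit)   # near-field hand pixel stands out
```

Gesture recognition would then run on `diff`, where only near-field content survives the subtraction.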
20140306875 | INTERACTIVE INPUT SYSTEM AND METHOD - A method for three-dimensional (3D) sensing. The method includes obtaining a first two-dimensional (2D) skeleton of an object, obtaining a second 2D skeleton of the object different from the first 2D skeleton, and calculating a 3D skeleton of the object based on the first and second 2D skeletons. | 10-16-2014 |
20140306876 | MOBILE DEVICE AND METHOD OF CHANGING A SHAPE OF A MOBILE DEVICE - A mobile device includes a flexible display panel, a supporting member at a back side of the flexible display panel, the supporting member supporting the flexible display panel, and an electro active polymer partially inserted in the supporting member, a shape of the electro active polymer being changed based on an input value inputted by a user. | 10-16-2014 |
20140306877 | Gesture Based Interface System and Method - A user interface apparatus for controlling any kind of a device. Images obtained by an image sensor in a region adjacent to the device are input to a gesture recognition system which analyzes images obtained by the image sensor to identify one or more gestures. A message decision maker generates a message based upon an identified gesture and a recognition mode of the gesture recognition system. The recognition mode is changed under one or more various conditions. | 10-16-2014 |
20140306878 | NEAR DISPLAY AND IMAGING - A method and device for eye gaze tracking has coincident display and imaging channel with shared field of view; a pupil forming/collimating subsystem that is part of the imaging channel; and a microdisplay that is part of the imaging channel. The method and device enable simultaneous display and eye tracking by using a pond of mirrors in a micromirror array. | 10-16-2014 |
20140306879 | 3D DISPLAY DEVICE - A two-parallax autostereoscopic display device using eye tracking has the problem that brightness varies with the viewing angle. A 3D display device is provided with a detection part for recognizing the position of the eyes from an image taken by a camera; a separation mechanism which enables a 3D image to be regenerated at the optimum position for the eyes based on information on the eye position detected by the detection part; a display device for displaying a plurality of different parallax images at the same time; a backlight attached to the display device; and a backlight control part for controlling the backlight, wherein the backlight control part determines the brightness of the backlight in accordance with the eye position determined by the detection part. | 10-16-2014 |
20140306880 | METHOD AND CONTROL DEVICE TO OPERATE A MEDICAL DEVICE IN A STERILE ENVIRONMENT - In a method and an apparatus for operating a controlled device in a sterile environment, a receiver detects contact-free user inputs respectively made by different users. A first operating mode of the receiver is activated after detection of an arbitrary contact-free user input of any of said users. The receiver is switched from the first operating mode to a second operating mode after detecting a predetermined contact-free user input. Upon switching from the first operating mode into the second operating mode, the predetermined contact-free user input can be made only by the user who last made a contact-free user input in the first operating mode. An additional operating mode can be activated under predetermined conditions. | 10-16-2014 |
20140306881 | WEARABLE DEVICE, PROGRAM AND DISPLAY CONTROLLING METHOD OF WEARABLE DEVICE - A wearable device configured to be worn by a wearer, the wearable device includes a display to provide an image on a display area that occupies a part of a field of view of the wearer, an eyesight sensor to acquire eyesight sensing information by an eyesight sensing in a field of view of the eyesight sensor, a peripheral sensing acquisition section to acquire peripheral sensing information from a peripheral sensing having a detection angle larger than an angle of the field of view of the eyesight sensor, and a display controller to control displaying object information of an object in the image, the object being selected from at least one candidate object recognized by at least one of the eyesight sensing and the peripheral sensing. | 10-16-2014 |
20140306882 | SYSTEMS AND METHODS OF EYE TRACKING DATA ANALYSIS - Methods and systems to facilitate eye tracking data analysis are provided. Point of regard information from a first client device of a first user is received, where the point of regard information is determined by the first client device by detecting one or more eye features associated with an eye of the first user. The point of regard information is stored. A request to access the point of regard information is received, and the point of regard information is sent in response to the request, where the point of regard information is used in a subsequent operation. | 10-16-2014 |
20140306883 | IMAGE MANIPULATION BASED ON TRACKED EYE MOVEMENT - The disclosure relates to controlling and manipulating an image of an object on a display device based on tracked eye movements of an observer. When the object is displayed according to an initial view, the observer's eye movement is tracked and processed in order to determine the focus of the observer's attention or gaze on the image. Thereafter, the displayed image is modified to provide a better view of the part of the object in which the observer is most interested. This is accomplished by modifying at least one of the spatial positioning of the object within the viewing area, the angle of view of the object, and the viewing direction of the object. | 10-16-2014 |
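One of the modifications listed in the abstract above, shifting the object's spatial position toward the point the observer is gazing at, could be sketched as a simple linear step (the step fraction, names, and 2-D coordinates are assumptions for illustration):

```python
def step_toward_gaze(object_pos, gaze_point, alpha=0.5):
    """Move the displayed object's position a fraction alpha of the
    way toward the observer's tracked point of gaze, so the part of
    the object holding the observer's attention drifts toward the
    center of the viewing area. Illustrative sketch only."""
    return tuple(p + alpha * (g - p) for p, g in zip(object_pos, gaze_point))

# object at the origin, observer gazing at (10, 20) in view coordinates
new_pos = step_toward_gaze((0.0, 0.0), (10.0, 20.0))
```

Applying this step each frame yields a smooth exponential approach toward the gaze point rather than an abrupt jump.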
20140306884 | INFORMATION PROCESSING APPARATUS, SYSTEM, AND METHOD THEREOF - An information processing apparatus includes a bio-information obtaining unit configured to obtain bio-information of a subject; a kinetic-information obtaining unit configured to obtain kinetic information of the subject; and a control unit configured to determine an expression or movement of an avatar on the basis of the bio-information obtained by the bio-information obtaining unit and the kinetic information obtained by the kinetic-information obtaining unit and to perform a control operation so that the avatar with the determined expression or movement is displayed. | 10-16-2014 |
20140306885 | APPARATUS AND METHOD FOR CONTROLLING PORTABLE TERMINAL - Provided is an apparatus and method for controlling a portable terminal. The apparatus includes a contact sensing unit which senses an area of an external surface of the portable terminal contacted by a user as the user holds the portable terminal, a recognizing unit which recognizes a function mode of the portable terminal based on information about the contacted area sensed by the contact sensing unit, and a control unit which changes the portable terminal to a function mode recognized by the recognizing unit. Since a function mode of the portable terminal is controlled according to the way a user holds the portable terminal, convenience of changing a function mode of the portable terminal is provided through a single manipulation. | 10-16-2014 |
20140306886 | IMAGE PROCESSING DEVICE, METHOD FOR CONTROLLING IMAGE PROCESSING DEVICE, PROGRAM, AND INFORMATION RECORDING MEDIUM - An image processing device includes an operation time information obtaining unit, a movement control unit, a movement target position determination unit and a movement manner determination unit. The operation time information obtaining unit obtains information on a period of time needed for a designation operation for designating a partial area in a screen. The movement control unit moves a virtual camera and/or an operation target object so as to approach a focus area in a virtual space displayed in the partial area. The movement target position determination unit determines a movement target position, based on a position in the virtual space, of the partial area and the size of the partial area. The movement manner determination unit determines a movement manner in the case of moving the virtual camera and/or the operation target object toward the movement target position, based on the period of time needed for the designation operation. | 10-16-2014 |
20140306887 | CONTROL DEVICE AND REMOTE CONTROL DEVICE - A control device includes a storage device for prestoring rendering command data, a central processing device, and a rendering processing device. The rendering command data is data for generating a rendering command executed in order to generate display data indicating a display image to be displayed on a display device. The central processing device refers to a screen management table when a predetermined rendering condition is satisfied, and specifies and outputs rendering command specifying data corresponding to the satisfied rendering condition. The rendering processing device acquires the rendering command data specified by the outputted rendering command specifying data from the storage device, generates display data based on the acquired rendering command data, and outputs the display data to the display device. | 10-16-2014 |
20140313117 | CAMOUFLAGED CONNECTED HOME CONTROLLER - A home control system comprises a control apparatus for controlling devices in a home. A user interface is mountable to a wall in the home and is operatively connected to the control apparatus. The user interface comprises a display screen for displaying images relating to operation of the controlled devices. A memory stores an image of the wall proximate the user interface device. The control apparatus and the user interface selectively display the image of the wall on the display screen to camouflage the user interface. | 10-23-2014 |
20140313118 | COMMUNICATION APPARATUS - An apparatus includes a housing, a band which is provided on a side of the housing and has a longer length than a circumference of the housing, and a supporter configured to support the band on the side of the housing to enable the band to slide along the side of the housing. The apparatus may include a detector configured to detect a form of the band; a display configured to display a user interface for operating the communication apparatus; and a controller configured to control a display position of the user interface based on the form of the band. The detector may detect pressure applied by the band on the housing, and the controller may estimate the form of the band on the basis of the pressure and control the display position of the user interface on the basis of the estimation result. | 10-23-2014 |
20140313119 | PORTABLE DEVICE INCLUDING INDEX DISPLAY REGION AND METHOD FOR CONTROLLING THE SAME - A portable device including an index display region and a method for controlling the same are disclosed. A method for controlling a portable device includes displaying a first layer from among a plurality of layers on a first display region located at a front surface of the portable device; displaying an index of the plurality of layers on a second display region located at a lateral surface of the portable device; determining a user-viewed display region from among the first display region and the second display region; detecting a first control input to the first display region; and controlling the first layer displayed on the first display region in response to the first control input when a user gazes at the first display region, or controlling the index displayed on the second display region in response to the first control input when the user gazes at the second display region. | 10-23-2014 |
20140313120 | EYE TRACKING BASED SELECTIVELY BACKLIGHTING A DISPLAY - Systems, apparatus, articles, and methods are described including operations for eye tracking based selective backlighting of a display. | 10-23-2014 |
20140313121 | EYEGLASSES ATTACHED WITH PROJECTOR AND METHOD OF CONTROLLING THE SAME - Eyeglasses including a lens unit including a pair of lenses and a frame, a supporting unit supporting the lens unit, a camera mounted on the frame, a projector mounted on the frame and configured to project content on a screen, and a processor configured to control the camera and the projector. A method of controlling eyeglasses, including generating a first projector image to be projected on a screen by a projector mounted on the eyeglasses, obtaining a camera image from a camera mounted on the eyeglasses, and generating a first user input signal by analyzing the camera image. | 10-23-2014 |
20140313122 | SYSTEMS AND METHODS FOR ENABLING GESTURE CONTROL BASED ON DETECTION OF OCCLUSION PATTERNS - Described is an approach to enabling gesture interactions for the viewport widget in a graphical user interface (GUI) library. The gesture interactions may include continuous operations such as panning, zooming and rotating of the viewport's content with fingers (or styluses). The approach is based on using a camera to detect occlusion patterns in a sensor grid rendered over the viewport. The sensor grid consists of sensor blobs, which are small blobs of pixels with a distinct color. A sensor blob is aware of its location in both the viewport's coordinate system and the camera's coordinate system, and triggers an occlusion event at the location when it is occluded by a finger (or stylus). Robust techniques are devised to eliminate unintentional gestures, provide visual guidance and feedback for interactions, and minimize the visual interference of the sensor grid with the viewport's content. | 10-23-2014 |
20140313123 | PROJECTOR FOR PROJECTING PATTERN FOR MOTION RECOGNITION AND MOTION RECOGNITION APPARATUS AND METHOD USING THE SAME - Provided are a projector that projects a pattern for user motion recognition and a motion recognition apparatus and method using the projector. The projector includes a light generation unit, a light guide unit configured to guide light generated from the light generation unit in a predetermined direction, a collimating lens configured to collimate the light transmitted from the light guide unit, and a diffractive optical element (DOE) configured to generate the pattern using the light passing through the collimating lens. | 10-23-2014 |
20140313124 | METHOD AND APPARATUS FOR TRACKING USER'S GAZE POINT USING MOBILE TERMINAL - A method for tracking a user's gaze point using a mobile terminal is provided. The method includes detecting the user's eye area from a camera that is mounted in the mobile terminal and selecting reference points of the eye area, detecting a pupil corresponding to an interest area that is obtained from the reference points, generating a first virtual grid of the eye area using a pickup image of a gaze point of the pupil corresponding to each corner of the display and generating a second virtual grid having the same division areas as those of the first virtual grid in the display, and mapping a position of the pupil within the second virtual grid area corresponding to a position of the pupil within the first virtual grid. | 10-23-2014 |
20140313125 | MORE USEFUL MAN MACHINE INTERFACES AND APPLICATIONS - A method for enhancing a well-being of a small child or baby utilizes at least one TV camera positioned to observe one or more points on the child or an object associated with the child. Signals from the TV camera are outputted to a computer, which analyzes the output signals to determine a position or movement of the child or child associated object. The determined position or movement is then compared to preprogrammed criteria in the computer to determine a correlation or importance, and thereby to provide data to the child. | 10-23-2014 |
20140313126 | Cushioned User Interface Or Control Device - A user interface or control includes a cushion-type support member and a user input member that is interconnected with and carried by the support member. The support member defines an upwardly facing recess, and the input member may be a user interface or control device that is contained within the upwardly facing recess. The support member may be formed to surround the recess about the user interface or control device. The support member may include an air vent that vents air exhausted from the user interface or control device. The user interface or control device may be a laptop computer having a body including a keyboard contained within the recess, and a screen carried by the body. The user interface or control device may alternatively be an electronic input member having an upwardly facing screen, a convertible input member movably mounted to a mounting member, or a game controller. | 10-23-2014 |
20140313127 | Method for Calling Application Object and Mobile Terminal - A method for calling an application object and a mobile terminal are provided that are used to call an application object quickly and improve user experience. An embodiment of the present invention includes: acquiring a motion state and an inclination angle of a mobile terminal; and when the motion state and the inclination angle of the mobile terminal meet a preset condition, calling an application object, where the application object includes at least one of application setting, operating parameter setting, and an application shortcut. | 10-23-2014 |
20140313128 | WRIST-WORN ELECTRONIC DEVICE AND METHODS THEREFOR - Embodiments of electronic wristwatches are disclosed. According to one embodiment, an electronic wristband can provide additional electrical circuitry or devices that can be made available for use as or with an electronic device. In one embodiment, the electronic device can be a mobile electronic device that can be removably coupled to the electronic wristband which provides the additional circuitry or devices. Advantageously, the electronic device can utilize the additional electrical circuitry or devices provided within the electronic wristband to augment the capabilities of the electronic device. In another embodiment, the electronic device can be integrally formed with the electronic wristband which provides the additional circuitry or devices. | 10-23-2014 |
20140313129 | INTELLIGENT USER MODE SELECTION IN AN EYE-TRACKING SYSTEM - A personal computer system comprises a visual display, an imaging device adapted to provide eye-tracking data by imaging at least one eye of a viewer of the visual display, and identifying means for recognizing the viewer with reference to one of a plurality of predefined personal profiles. The personal computer system further comprises an eye-tracking processor for processing the eye-tracking data. According to the invention, the eye-tracking processor is selectively operable in one of a plurality of personalized active sub-modes associated with said personal profiles. The sub-modes may differ with regard to eye-tracking related or power-management related settings. Further, the identifying means may sense an identified viewer's actual viewing condition (e.g., use of viewing aids or wearing of garments), wherein the imaging device is further operable in a sub profile mode associated with the determined actual viewing condition. | 10-23-2014 |
20140320387 | Device, System and Method for Generating Display Data - A method for use in a device is described. The method comprising the steps of detecting a gesture at the device; and outputting data, to a remote device, for controlling a display object on a portion of a display generated by the remote device, wherein the display object is controlled in response to the detected gesture performed at the device, and wherein the data pertains to an application executable on the device. A device is also described. | 10-30-2014 |
20140320388 | STREAMING K-MEANS COMPUTATIONS - A set of population data that includes a plurality of individual population data entities is obtained. Each of the individual population data entities in the obtained set is streamed to an array of a plurality of evaluation functions. The evaluation functions are configured to evaluate each entity to determine an acceptability of the entity for a current state of a candidate centroid value associated with the evaluation function. Acceptance of input data entities is terminated after a first accepting one of the evaluation functions accepts an entity, based on the determined acceptability and on a predetermined priority ordering of acceptance. The first accepting one of the evaluation functions, in the priority ordering, incorporates population data associated with the accepted entity into an aggregator that is local to the first accepting evaluation function. | 10-30-2014 |
20140320389 | MIXED REALITY INTERACTIONS - Embodiments that relate to interacting with a physical object in a mixed reality environment via a head-mounted display are disclosed. In one embodiment a mixed reality interaction program identifies an object based on an image captured by the display. An interaction context for the object is determined based on an aspect of the mixed reality environment. A profile for the physical object is queried to determine interaction modes for the object. An interaction mode is programmatically selected based on the interaction context. A user input directed at the object is received via the display and interpreted to correspond to a virtual action based on the selected interaction mode. The virtual action is executed with respect to a virtual object associated with the physical object to modify an appearance of the virtual object. The modified virtual object is then displayed via the display. | 10-30-2014 |
20140320390 | OBJECT AND MOVEMENT DETECTION - Motions, positions or configurations of, for example, a human hand can be recognised by transmitting a plurality of transmit signals in respective time frames; receiving a plurality of receive signals; determining a plurality of channel impulse responses using the transmit and receive signals; defining a matrix of impulse responses, with impulse responses for adjacent time frames adjacent each other; and analysing the matrix for patterns. | 10-30-2014 |
20140320391 | METHODS FOR IMPROVEMENTS IN MOBILE ELECTRONIC DEVICES - A series of methods are presented to improve the operation and user experience of mobile handheld devices such as mobile phones. The methods include methods allowing useful operation on low battery levels, touch input from non-conventional models, stored procedures for executing series of actions, application management, navigational communication through vibratory motions, among others. | 10-30-2014 |
20140320392 | Virtual Fixtures for Improved Performance in Human/Autonomous Manipulation Tasks - Apparatus and method for defining and utilizing virtual fixtures in haptic rendering sessions interacting with various environments, including underwater environments, are provided. A computing device can receive first depth data about an environment. The computing device can generate a first plurality of points from the first depth data. The computing device can determine a haptic interface point (HIP) and can define a virtual fixture for the environment. The computing device can determine a first force vector between the HIP and the first plurality of points using the computing device, where the first force vector is based on the virtual fixture. The computing device can send a first indication of haptic feedback based on the first force vector. | 10-30-2014 |
20140320393 | HAPTIC FEEDBACK FOR INTERACTIONS WITH FOLDABLE-BENDABLE DISPLAYS - A flexible device includes a bendable-foldable display that has bendable flaps connected by a hinge. The display has sensors for detecting a folding characteristic between the at least two flaps and for detecting a bending characteristic in at least one flap. The display has a haptic system with haptic output devices, where the haptic system receives input from the sensors indicating deformation of the bendable-foldable display device. A flexible device also includes bendable, foldable, or rollable displays that have sensors and actuators to augment user interaction with the device. Based on one or more measurements provided by the input, the haptic system interprets the input to determine deformation characteristics of the bendable-foldable display device. The haptic system generates haptic feedback based on the deformation characteristics. | 10-30-2014 |
20140320394 | Gestural motion and speech interface control method for 3d audio-video-data navigation on handheld devices - A cognizant and adaptive method of informing a multi-modal navigation interface of a user's intent. This provides the user with the experience of exploring an immersive representation of the processed multimedia (audio-video-data) sources available that automatically adapts to his/her fruition preference. These results are obtained by first reconciling and aligning the user's and the device's frames of reference in three-dimensional space and then dynamically and adaptively smoothly switching and/or combining the gesture, motion and speech modalities. The direct consequence is a user experience that naturally adapts to the user's choice of interaction and movement. | 10-30-2014 |
20140320395 | ELECTRONIC DEVICE AND METHOD FOR ADJUSTING SCREEN ORIENTATION OF ELECTRONIC DEVICE - In a method for adjusting screen orientation of an electronic device, an orientation of the electronic device is detected using a gravity sensor of the electronic device. If the electronic device is rotated by a predetermined angle, a facial image of a user in front of the electronic device is captured by a camera of the electronic device, eyes of the user in the facial image are detected. If the eyes of the user in the facial image are in a vertical direction, the screen orientation of the electronic device is adjusted to a landscape screen orientation. If the eyes of the user in the facial image are in a horizontal direction, the screen orientation of the electronic device is adjusted to a portrait screen orientation. | 10-30-2014 |
20140320396 | PASSIVE STIFFNESS AND ACTIVE DEFORMATION HAPTIC OUTPUT DEVICES FOR FLEXIBLE DISPLAYS - A system includes a flexible display configured to display an image and a sensor connected to the flexible display. The sensor is configured to sense an amount of flexure of the flexible display. A haptic output device is connected to the flexible display and is configured to change a resistance to movement of a first portion of the flexible display relative to a second portion of the flexible display upon receipt of a haptic control signal. The system includes a processor in signal communication with the flexible display, the sensor and the haptic output device. The processor is configured to receive an output signal from the sensor based on the amount of flexure and generate the haptic control signal based on the output signal from the sensor. | 10-30-2014 |
20140320397 | System and Method For Calibrating Eye Gaze Data - A system and method are provided for calibrating an eye gaze tracking system. The method comprises obtaining gaze data; obtaining at least one key point corresponding to a portion of media content being displayed; linking the gaze data to the at least one key point; and generating one or more calibration parameters by comparing gaze data with associated ones of the at least one key point. | 10-30-2014 |
20140320398 | METHOD, ELECTRONIC DEVICE AND SYSTEM FOR REMOTE TEXT INPUT - A method, electronic device, and system for remote text input in the electronic device are provided. A display signal may be outputted in the electronic device, displaying a text field for inputting text (e.g., by the user of the electronic device). A request for text input may be sent from the electronic device to another device (e.g., a communication device) which may be addressed by an identifier of the other device (e.g., a SIM of the communication device). The electronic device may then receive the requested text input from the other device (e.g., once the user of the other device enters the requested text). | 10-30-2014 |
20140320399 | WEARABLE ELECTRONIC DEVICE AND METHOD OF CONTROLLING THE SAME - A wearable electronic device with a function of adjusting light transmittance, and a method of controlling the wearable electronic device are disclosed. The wearable electronic device according to an exemplary embodiment includes a transparent or light-transmitting lens, a liquid crystal installed at the lens, a camera taking a picture of a front view of a user wearing the wearable electronic device, a display part displaying an additional information to the lens, which is added to a front view recognized by the user, and a control part determining an operation mode of the wearable electronic device whether the operation mode is a transparent mode or an opaque mode, controlling liquid crystal to adjust light transmittance according to the determined operation mode, and controlling the display part to display the additional information when the operation mode is changed to be the transparent mode or the opaque mode. | 10-30-2014 |
20140320400 | SYSTEM AND METHOD FOR SHAPE DEFORMATION AND FORCE DISPLAY OF DEVICES - Various systems, devices, and methods for shape deformation of a haptic deformation display device are provided. For example, the haptic deformation display device may receive an input signal when the shape of the haptic deformation display device is in a first shape configuration. In response to the input signal, the haptic deformation display device may activate an actuator of the haptic deformation display device. The actuator may move a deformation component of the haptic deformation display device. The deformation component may at least partially defining a shape of the haptic deformation display device, thereby causing the shape of the haptic deformation display device to deform into a second shape configuration different from the first shape configuration. The second shape configuration may be substantially maintained. | 10-30-2014 |
20140320401 | HANDHELD ELECTRONIC DEVICE RESPONSIVE TO TILTING - PDAs can be used to provide their users with various functions. One such function is the ability to contact other users who are stored in the PDA as contacts, for example by telephone or text message. The decision as to whether or not a particular user should be contacted may be influenced by the location of that user. For example, if a contact is nearby, then it may be more likely that the user of the PDA would want to contact them. The present embodiments provide a PDA on which the location of one or more contacts can be displayed. Furthermore, the location of a particular contact can be shown on a map on the screen of the PDA simply by tilting the PDA through a predetermined angle about a horizontal axis. Thus, the map can be displayed in an intuitive and straightforward manner. | 10-30-2014 |
20140320402 | SELF CALIBRATION FOR HAPTIC DEVICES - Systems, methods and apparatuses using feedback from internal sensors to adjust haptic effect output are provided. In an embodiment, a method of generating a haptic output effect in an apparatus with a haptic effect output device is provided. A haptic effect output initiates with the haptic effect output device. The method includes receiving feedback data from an input sensor. The method compares feedback data from the input sensor to expected results of the haptic effect output. The method adjusts operating parameters of the haptic effect output device. The haptic effect output continues. | 10-30-2014 |
20140320403 | APPARATUS AND METHOD FOR RECOGNIZING MOTION BY USING AN EVENT-BASED VISION SENSOR - An apparatus and method for recognizing motion by using an event-based vision sensor is provided. An apparatus for recognizing motion using an event-based vision sensor includes: a vision sensor to sense a movement-occurring part and output events; a movement type determiner configured to determine a type of movement using a frequency of occurrence of the events outputted through the vision sensor; a first motion determiner configured to track a movement trajectory of the movement-occurring part and determine a motion pattern based on the movement trajectory in response to a result of the movement type determination indicating a small movement; a second motion determiner configured to determine a direction in which an object moves based on the events in response to a result of the movement type determination indicating a large movement; and a motion controller configured to output a control instruction to control a device. | 10-30-2014 |
20140320404 | IMAGE PROCESSING DEVICE, IMAGE PROCESSING METHOD, AND PROGRAM - Provided is an image processing device including a recognition unit configured to recognize an environment recognition matrix representing a position and an attitude of an environment appearing in an image with respect to a position and an attitude of a terminal that has captured the image, a calculation unit configured to calculate an inverse matrix of the environment recognition matrix, and a manipulation control unit configured to control manipulation of a virtual object arranged in the environment in a three-dimensional manipulation amount according to a difference between a first position or a first attitude based on the inverse matrix of the environment recognition matrix recognized at a first time point and a second position or a second attitude based on the inverse matrix of the environment recognition matrix recognized at a succeeding second time point. | 10-30-2014 |
20140327608 | TRANSFORMING VISUALIZED DATA THROUGH VISUAL ANALYTICS BASED ON INTERACTIVITY - A data visualization application transforms visualized data through visual analytics. A detected gesture and associated visualization are processed through a visual analytics engine of the application. The visual analytics engine determines attributes for a new visualization based on the contextual information of the gesture and the visualization. The analytics engine dynamically builds an action based on the attributes. Execution of the action generates the new visualization which is rendered for presentation. | 11-06-2014 |
20140327609 | METHOD FOR GAZE-CONTROLLED TEXT SIZE CONTROL, AND METHODS FOR GAZE-BASED MEASURING OF A TEXT READING SPEED AND OF A NUMBER OF VISUAL SACCADES PER TEXT LINE - For gaze-controlled text size control of a display, the invention proposes to probe, sample and record a user's horizontal gaze signal; to subject the gaze signal to a subband filterbank or wavelet transform; to detect line delimiters in the gaze signal; to derive a reading speed; to determine, as a number of saccades per text line, the number of locations where the gaze signal has sudden high-slope portions surrounded on both sides by portions of markedly smaller slope; to detect, based on the reading speed and the number of saccades, a too-small font size status or a too-big font size status; and to initiate a corresponding font size change. Parts of this method can be used for gaze-based measuring of text reading speed and for gaze-based measuring of the number of saccades. | 11-06-2014 |
20140327610 | CONTENT GENERATION FOR INTERACTIVE VIDEO PROJECTION SYSTEMS - Various embodiments herein include systems, methods, and software for interactive video projection system content generation. Such content is content consumed by a system that projects a scene view on a surface, such as a wall, screen, or floor, and is interactive with user motion. User motion is captured as input via a camera or other imaging device and processed on a computing device to determine where in a projected scene a user is moving. The scene is then modified based on the detected motion. A user generates content for consumption in such embodiments by providing image and variable input to populate a graphical rendering template when rendered for presentation to a user. | 11-06-2014 |
20140327611 | INFORMATION PROCESSING APPARATUS AND METHOD, AND PROGRAM - The present technology relates to information processing apparatus and method, and a program, which are capable of presenting an operation input unit in a position in which a user easily performs an input operation. An imaging unit captures an image of a vicinity of a display unit. A hand recognition unit recognizes a hand by extracting a hand image obtained by capturing a hand, based on feature information of the hand, from the image captured by the imaging unit. The hand position detection unit specifies a real position on a display unit from information on a position in an image in which the hand image is present. A display control unit for an operation input unit displays an operation input unit in a position on the display unit which is specified based on the position of the hand detected by the hand position detection unit. | 11-06-2014 |
20140327612 | MEASUREMENT METHOD AND DEVICE FOR PERFORMING THE MEASUREMENT METHOD - The present invention relates to a measurement method in which, by predetermined illumination by means of a display device, in particular a holographic or autostereoscopic display device, with an intensity distribution of the illumination light in a plane of a light source image, a first location of an object, in particular an observer of the display device, is marked, and wherein the relative position of the first location in relation to a second location of the object is determined in a coordinate system of a camera. | 11-06-2014 |
20140327613 | IMPROVED THREE-DIMENSIONAL STEREOSCOPIC RENDERING OF VIRTUAL OBJECTS FOR A MOVING OBSERVER - A system for three-dimensional stereoscopic rendering of virtual objects in a scenario by a display screen (S) with respect to which an observer (O) can move. The system overcomes the problems of incorrect perception of three-dimensionality which are present in prior art stereoscopic rendering systems. The system includes a device. | 11-06-2014 |
20140333521 | DISPLAY PROPERTY DETERMINATION - A display apparatus may include an optical element having thereon an information area; an eye tracker configured to detect an activity of at least one pupil within the eyes; and a processor configured to change a display property of the information area based at least in part on the activity of the at least one pupil. | 11-13-2014 |
20140333522 | CONTROL OF A CONTROL PARAMETER BY GESTURE RECOGNITION - A method for controlling at least one control parameter of a control element comprises determining first information indicating a distance between a part of a human body and a reference. Second information is then determined, based on the first information, indicating a transformation relation between a gesture movement of the part of the human body and a change of the control parameter effected by the gesture movement. The transformation relation is modified based on a change of the distance between the human body and the reference. | 11-13-2014 |
20140333523 | GESTURE JUDGMENT METHOD USED IN AN ELECTRONIC DEVICE - A gesture judgment method used in an electronic device having frame capturing function is provided. A plurality of MHI (motion history image) angles/directions are obtained from a plurality of corresponding MHIs. Whether a current gesture control is valid is judged according to the MHI angles/directions. If the current gesture control is valid, then weight assignment is performed on the MHI angles/directions to obtain a judgment result of the current gesture control. | 11-13-2014 |
20140333524 | MOTION-BASED IDENTITY AUTHENTICATION OF AN INDIVIDUAL WITH A COMMUNICATIONS DEVICE - Systems, methods and computer storage mediums securely authenticate an identity of an individual based on a pattern that is traced by the individual. Embodiments of the present disclosure relate to prompting an individual with a pattern to trace when attempting to authenticate the identity of the individual during an identity authentication session. Motion-based behavior data that is generated by motions executed by the individual as the individual traces the pattern is captured via a motion-capturing sensor. The motion-based behavior data is unique to the individual and has a low likelihood of being duplicated by an unauthorized individual attempting to fraudulently pose as the individual. The captured motion-based behavior data is compared to previously captured motion-based behavior data from previous traces of the pattern completed by the individual. The identity of the individual is authenticated when the motion-based behavior data is within a threshold of the previously captured motion-based behavior data. | 11-13-2014 |
20140333525 | METHOD AND APPARATUS FOR USING AN APPLICATION TO CONTROL OPERATION WITH A DEADMAN SWITCH - The present invention provides a system and method for wirelessly controlling an operating device by way of a wireless connection between the operating device and a computing device. The computing device provides an interface via an application for controlling the operating device. The interface includes a deadman switch which halts the operation of the operating device when released. The application and interface are downloadable and installable as a package on various computing devices for use in providing a deadman switch feature for the operating device. | 11-13-2014 |
20140333526 | METHOD FOR STABILIZATION AND A SYSTEM THERETO - A method for stabilization for use in a display system is provided. The method is performed in a first electronic device comprising a first display and a gaze tracking unit configured to track a gaze of a user of the first device. The first device is arranged to send an image signal to a second device. The second device comprises a second display. The method comprises sending the image signal from the first device to the second device, controlling the image signal by the first device, depending on the tracked gaze of the user, displaying the controlled image signal on the second display such that, when the user focuses on an object of a view of the first display, the object of the view on the second display is cropped and stabilized. | 11-13-2014 |
20140333527 | METHOD AND APPARATUS FOR DISPLAYING INPUT INTERFACE IN USER DEVICE - Disclosed is a method and apparatus for displaying an input interface of a user device. The method of displaying the input interface includes: receiving an input interface display request; determining whether a preferred input mode according to a user's use record for an application exists; and, when one exists, displaying an input interface according to the preferred input mode. | 11-13-2014 |
20140333528 | INFORMATION PROCESSING DEVICE AND DISPLAY CONTROL METHOD - An information processing device includes: an operation unit to receive a user's operation; a first communication unit to perform communication with an external device; a first content acquisition unit to acquire first content generated in response to the user's operation; a second content acquisition unit to acquire second content received from the external device; a content storage unit to store the first and second content in association with their respective acquisition times; a display; and a display control unit. The display control unit causes the display to: display a time axis as well as first and second display regions; display the first content at a position on the time axis indicating the time associated with the first content in the first display region; and display the second content at a position on the time axis indicating the time associated with the second content in the second display region. | 11-13-2014 |
20140333529 | APPARATUS AND METHOD OF CONTROLLING DISPLAY APPARATUS - Provided are an apparatus and a method of controlling a display apparatus. The method includes presenting a stimulus to a user by using information regarding a user environment, obtaining, from the user, an electroencephalogram (EEG) signal in response to the stimulus, and controlling the display apparatus based on the EEG signal. By using the apparatus and the method of controlling the display apparatus, an appropriate type of a stimulus may be presented to the user based on the information regarding the user environment. Thus, the user may conveniently operate the display apparatus, and may accurately operate the display apparatus according to an intention of the user. | 11-13-2014 |
20140333530 | Mobile Device Gestures - Electronic devices, interfaces for electronic devices, and techniques for interacting with such interfaces and electronic devices are described. For instance, this disclosure describes an example electronic device that includes sensors, such as multiple front-facing cameras to detect orientation and/or location of the electronic device relative to an object and one or more inertial sensors. Users of the device may perform gestures on the device by moving the device in-air and/or by moving their head, face, or eyes relative to the device. In response to these gestures, the device may perform operations. | 11-13-2014 |
20140333531 | DISPLAY APPARATUS WITH A PLURALITY OF SCREENS AND METHOD OF CONTROLLING THE SAME - A display apparatus and a control method thereof are provided. The display apparatus includes a display configured to display a plurality of images received from a plurality of sources on each of a first screen, a second screen, and a third screen of a display screen, a user interface configured to detect a user interaction, and a controller configured to control the display to move locations of the first to third screens in accordance with a detected rotation interaction in response to the rotation interaction being detected through the user interface. | 11-13-2014 |
20140333532 | STEREOSCOPIC IMAGE DISPLAY APPARATUS AND COMPUTER-READABLE RECORDING MEDIUM STORING PROGRAM THEREON - A stereoscopic image display apparatus includes: a distance measuring unit that measures a distance to a viewer; a selecting unit that selects parallax images having a binocular parallax amount corresponding to the measured distance to the viewer from parallax images of a display object based on the measured distance to the viewer as a stereoscopic image pair; and a display control unit that causes a display unit to display the selected stereoscopic image pair. | 11-13-2014 |
20140340298 | TACTILE COMMUNICATION APPARATUS - A tactile communication apparatus that includes a signal receiver configured to decode data received via a wireless signal, a tactile communication device containing a plurality of pins on one side, each pin configured to respectively move in both an outward direction and inward direction to form a plurality of pin combinations based on a plurality of activation signals, and a communication processor configured to generate the plurality of pin activation signals based on the received data so as to convey the data to a user through the plurality of pin combinations of the tactile communication device. Data conveyed to the user includes data from external events, such as sporting or theatrical events. | 11-20-2014 |
20140340299 | PORTABLE DEVICE AND CONTROL METHOD THEREOF - A portable device including a main display; a flexible display; a sensor unit configured to sense at least one of expanding of the flexible display and an input signal; and a processor configured to control the main display, the flexible display and the sensor unit, when a first page of a plurality of pages having a sequence is being displayed in the main display. The processor also displays an indicator in the main display when the input signal is sensed, the indicator indicating a display direction of additional pages to be displayed when the flexible display is expanded, and indicating a forward direction or a reverse direction based on the first page as the display direction, and displays at least one additional page in the display direction indicated by the indicator when the expanding of the flexible display is sensed. | 11-20-2014 |
20140340300 | SYSTEM AND METHOD FOR USING HANDHELD DEVICE AS WIRELESS CONTROLLER - A method and system are provided for using a handheld device as a wireless controller. | 11-20-2014 |
20140340301 | HAND MOTION-BASED DEVICE CONTROL - The present invention seeks to provide a user interface for use with a pair of gloves that controls at least one function of a device. The gloves are constructed and arranged to restrict the use of individual fingers of a wearer. The user interface comprises a sensor operatively configured to detect and transmit wearer input mediated through specific hand motions performed by the wearer. A processor is in operative communication with the sensor such that the processor is configured to receive, interpret, and process wearer input. Specific hand motions are detected by the sensor, transmitted to the processor, interpreted to correspond to at least one specific function of the device, and processed to execute the at least one specific function. | 11-20-2014 |
20140340302 | INTEGRATED GESTURE SENSOR MODULE - An integrated gesture sensor module includes an optical sensor die, an application-specific integrated circuit (ASIC) die, and an optical emitter die disposed in a single package. The optical sensor die and ASIC die can be disposed in a first cavity of the package, and the optical emitter die can be disposed in a second cavity of the package. The second cavity can be conical or step-shaped so that the opening defining the cavity increases with distance from the upper surface of the optical emitter die. The upper surface of the optical emitter die may be higher than the upper surface of the optical sensor die. An optical barrier positioned between the first and second cavities can include a portion of a pre-molded, laminate, or ceramic package, molding compound, and/or metallized vias. | 11-20-2014 |
20140340303 | DEVICE AND METHOD FOR DETERMINING GESTURE - A device and method are provided for determining a gesture. When a single form is generated through a gesture, a type of gesture generating the single form is detected. A function corresponding to the detected type of gesture is performed. | 11-20-2014 |
20140340304 | EFFICIENT FETCHING OF A MAP DATA DURING ANIMATION - A first digital map is displayed in a viewport at an initial position. When a user gesture that communicates motion to the viewport is detected, a trajectory of the viewport from the initial position to a target position is determined based on kinematic quantities of the communicated motion. Map data for displaying a second digital map in the viewport at the target position is retrieved from a first memory, prior to the viewport reaching the target position. The retrieved map data is stored in a second memory having a higher speed of access than the first memory. The map data is retrieved from the second memory for display via the user interface when the viewport reaches the target position. | 11-20-2014 |
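The trajectory prediction and two-tier fetch in entry 20140340304 above can be sketched as below. The geometric-decay fling model and the cache interface are illustrative assumptions, not the patent's implementation.

```python
def predict_target(position, velocity, deceleration=0.9):
    """Predict where a flung viewport comes to rest.

    With per-frame decay d, total displacement is the geometric series
    v + v*d + v*d^2 + ... = v / (1 - d). The decay model is hypothetical.
    """
    x, y = position
    vx, vy = velocity
    return (x + vx / (1 - deceleration), y + vy / (1 - deceleration))

class TileCache:
    """Fast 'second memory' in front of slow tile storage ('first memory')."""
    def __init__(self, slow_fetch):
        self.slow_fetch = slow_fetch
        self.fast = {}
    def prefetch(self, tile):
        # Pull map data from slow memory before the viewport arrives
        self.fast[tile] = self.slow_fetch(tile)
    def get(self, tile):
        # Serve from the faster memory when possible
        return self.fast.get(tile) or self.slow_fetch(tile)
```

A caller would predict the target from gesture kinematics, prefetch the tiles covering it, then read them from the fast cache once the animation lands.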
20140340305 | COMPUTER INPUT DEVICE AND METHOD OF USING THE SAME - An auxiliary input device for use in inputting a user signal to a computer is provided. The computer auxiliary input device includes an infrared ray transmitter configured to transmit an infrared ray signal to a left eyeball and a right eyeball of a user, a reflected light sensor configured to measure a change in an amount of light reflected from the left eyeball and the right eyeball, and a controller configured to detect the change in the amount of light reflected from each of the eyeballs from the reflected light sensor, generate Morse code alphabets based on the detected change, and convert the Morse code alphabets into a character string to be transmitted to the computer. A computer communication unit is configured to provide the character string converted by the controller to the computer. | 11-20-2014 |
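The Morse conversion step in entry 20140340305 above is straightforward to sketch. The duration threshold, the letter-gap convention, and the (truncated) code table below are illustrative assumptions.

```python
# Truncated illustrative Morse table; a real device would carry the full alphabet
MORSE = {'.-': 'A', '-...': 'B', '-.-.': 'C', '...': 'S', '---': 'O'}

def blinks_to_text(blink_durations_ms, dot_max_ms=300):
    """Convert detected eye-blink durations into a character string.

    A short blink is a dot, a long blink a dash; a None entry marks a
    letter boundary. Unknown symbols decode to '?'.
    """
    text, symbol = [], ''
    for dur in list(blink_durations_ms) + [None]:
        if dur is None:                      # letter gap: flush the symbol
            if symbol:
                text.append(MORSE.get(symbol, '?'))
                symbol = ''
        else:
            symbol += '.' if dur <= dot_max_ms else '-'
    return ''.join(text)
```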
20140340306 | Bendable Electronic Device Status Information System and Method - A system includes, but is not limited to: one or more modules configured to obtain first information, and one or more physical status sending modules configured to direct sending one or more bendable electronic device physical status related information portions to the bendable electronic device based upon the obtaining of the first information. In addition to the foregoing, other related system/system aspects are described in the claims, drawings, and text forming a part of the present disclosure. | 11-20-2014 |
20140340307 | APPARATUS AND METHOD FOR RECOGNIZING SUBJECT MOTION USING A CAMERA - A method and an apparatus are provided for displaying pictures according to hand motion inputs. An application is executed for displaying a picture from among a sequence of pictures on a display. Groups of skin color blocks corresponding to a hand are detected from among image frames output from a camera. A motion is detected among the groups of skin color blocks. Direction information is obtained on the detected motion. The application is controlled to display a previous picture or a next picture in the sequence of pictures on the display according to the direction information. | 11-20-2014 |
20140347262 | OBJECT DISPLAY WITH VISUAL VERISIMILITUDE - Described herein are technologies relating to display of a representation of an object on a display screen with visual verisimilitude to a viewer. A location of eyes of the viewer relative to a reference point on the display screen is determined. Additionally, a direction of gaze of the eyes of the viewer is determined. Based upon the location and direction of gaze of the eyes of the viewer, the representation of the object can be displayed at a scale and orientation such that it appears with visual verisimilitude to the viewer. | 11-27-2014 |
20140347263 | Motion-Assisted Visual Language For Human Computer Interfaces - Embodiments of the invention recognize human visual gestures, as captured by image and video sensors, to develop a visual language for a variety of human computer interfaces. One embodiment of the invention provides a computer-implemented method for recognizing a visual gesture portrayed by a part of the human body such as a human hand, face or body. The method includes steps of receiving the visual gesture captured in a video having multiple video frames, determining a gesture recognition type from multiple gesture recognition types including shape-based gesture, position-based gesture, motion-assisted gesture, and mixed gesture combining two different gesture types. The method further includes steps of selecting a visual gesture recognition process based on the determined gesture type and applying the selected visual gesture recognition process to the multiple video frames capturing the visual gesture to recognize the visual gesture. | 11-27-2014 |
20140347264 | DEVICE AND METHOD FOR DISPLAYING AN ELECTRONIC DOCUMENT USING A DOUBLE-SIDED DISPLAY - A device and method are provided for displaying an electronic document using a double-sided display. The method includes displaying a portion of the electronic document on a first display of the double-sided display; sensing a motion of the double-sided display; and displaying another portion of the electronic document on a second display of the double-sided display, based on the sensed motion. | 11-27-2014 |
20140347265 | WEARABLE COMPUTING APPARATUS AND METHOD - A method is provided, performed by a wearable computing device comprising at least one bio-signal measuring sensor, the at least one bio-signal measuring sensor including at least one brainwave sensor, comprising: acquiring at least one bio-signal measurement from a user using the at least one bio-signal measuring sensor, the at least one bio-signal measurement comprising at least one brainwave state measurement; processing the at least one bio-signal measurement, including at least the at least one brainwave state measurement, in accordance with a profile associated with the user; determining a correspondence between the processed at least one bio-signal measurement and at least one predefined device control action; and in accordance with the correspondence determination, controlling operation of at least one component of the wearable computing device, such as modifying content displayed on a display of the wearable computing device. Various types of bio-signals, including brainwaves, may be measured and used to control the device in various ways. | 11-27-2014 |
20140347266 | LIGHTING EQUIPMENT AND IMAGE PROJECTOR - A lighting equipment having an illumination capability and an image projection capability includes a light source unit. | 11-27-2014 |
20140347267 | DISPLAY APPARATUS AND DISPLAY CONTROL METHOD - An apparatus includes a display controller configured to control a display state of a transparent display based on a sensor output that contains information regarding an object separated from the apparatus. | 11-27-2014 |
20140347268 | DATA SERVICES BASED ON GESTURE AND LOCATION INFORMATION OF DEVICE - With the addition of directional information and gesture based input in a location based services environment, a variety of service(s) can be provided on top of user identification or interaction with specific object(s) of interest. For instance, when a user gestures at or points at a particular item, or gestures at a particular location or place, this creates an opportunity, e.g., an advertising opportunity, for anyone having an interest in that particular item or place to communicate with the user regarding that item or related items at a point in time when the user's focus is on the particular item. User context for the interaction can also be taken into account to supplement the provision of one or more interactive direction based services. | 11-27-2014 |
20140347269 | TELE-PRESENCE SYSTEM WITH A USER INTERFACE THAT DISPLAYS DIFFERENT COMMUNICATION LINKS - A tele-presence system that includes a remote device coupled to a control station through a communication link. The remote device includes a remote monitor, a remote camera, a remote speaker and a remote microphone. Likewise, the control station includes a station monitor, a station camera, a station speaker and a station microphone. The control station displays a plurality of graphical icons that each represents a different type of communication link between the control station and the remote device. The graphical icons can be selected to allow a user of the control station to change the communication link between the remote device and its initial node. | 11-27-2014 |
20140347270 | SYSTEM AND METHOD FOR DISPLAY OF MULTIPLE DATA CHANNELS ON A SINGLE HAPTIC DISPLAY - A system that produces a haptic effect and generates a drive signal that includes at least two haptic effect signals each having a priority level. The haptic effect is a combination of the haptic effect signals and priority levels. The haptic effect may optionally be a combination of the two haptic effect signals if the priority levels are the same, otherwise only the haptic effect signal with the highest priority is used. The frequency of haptic notifications may also be used to generate a drive signal using foreground and background haptic effect channels depending on whether the frequency ratio exceeds a foreground haptic effect threshold. | 11-27-2014 |
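The priority-based combination rule in entry 20140354270's abstract above (sum signals of equal top priority, otherwise keep only the highest-priority signal) might be sketched like this; the sample-wise representation of a haptic effect signal is an illustrative assumption. Note the entry number in this lead-in refers to 20140347270 above.

```python
def mix_haptic(signals):
    """Combine prioritized haptic effect signals into one drive signal.

    `signals` is a list of (priority, samples) pairs, where `samples`
    is a list of drive amplitudes. Signals sharing the highest priority
    are summed sample-wise; lower-priority signals are dropped.
    """
    if not signals:
        return []
    top = max(priority for priority, _ in signals)
    winners = [samples for priority, samples in signals if priority == top]
    # Sum the co-equal top-priority channels sample by sample
    return [sum(vals) for vals in zip(*winners)]
```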
20140347271 | METHOD FOR EVALUATING SPLINE PARAMETERS FOR SMOOTH CURVE SAMPLING - A computer implemented method and apparatus for reproducing an input curve on a mobile device comprising detecting the input curve, sampling the input curve into a discrete set of vertices, performing a normalized dot product of an edge leading into each one of the discrete set of vertices and a tangent to an edge leading from each one of the discrete set of vertices, setting one or more spline stiffness parameter based on the normalized dot product corresponding to each vertex; and converting the sampled input curve into one or more spline patch with the set spline stiffness parameter corresponding to each vertex. | 11-27-2014 |
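The per-vertex stiffness computation in entry 20140347271 above is a normalized dot product between the incoming edge and the outgoing tangent; treating the outgoing edge as the tangent is the one simplifying assumption in this sketch.

```python
import math

def stiffness_at(prev_pt, pt, next_pt):
    """Spline stiffness parameter for one sampled curve vertex.

    Normalized dot product of the edge leading into the vertex and the
    edge leading from it: 1.0 on straight runs (stiff spline), near 0
    at right-angle corners, letting the spline patch bend there.
    """
    ax, ay = pt[0] - prev_pt[0], pt[1] - prev_pt[1]       # incoming edge
    bx, by = next_pt[0] - pt[0], next_pt[1] - pt[1]       # outgoing tangent
    na, nb = math.hypot(ax, ay), math.hypot(bx, by)
    if na == 0 or nb == 0:
        return 0.0  # degenerate (duplicate) sample point
    return (ax * bx + ay * by) / (na * nb)
```

Running this over every interior vertex of the sampled input curve yields the stiffness parameters used when converting the samples into spline patches.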
20140347272 | AUDIO, VIDEO, SIMULATION, AND USER INTERFACE PARADIGMS - Consumer electronic devices have been developed with enormous information processing capabilities, high quality audio and video outputs, large amounts of memory, and may also include wired and/or wireless networking capabilities. Additionally, relatively unsophisticated and inexpensive sensors, such as microphones, video cameras, GPS or other position sensors, when coupled with devices having these enhanced capabilities, can be used to detect subtle features about users and their environments. A variety of audio, video, simulation and user interface paradigms have been developed to utilize the enhanced capabilities of these devices. These paradigms can be used separately or together in any combination. One paradigm automatically creates user identities using speaker identification. Another paradigm includes a control button with 3-axis pressure sensitivity for use with game controllers and other input devices. | 11-27-2014 |
20140354527 | PERFORMING AN ACTION ASSOCIATED WITH A MOTION BASED INPUT - A method implemented by a computing device having a memory and a motion detection sensor is disclosed. The memory stores an input model associated with a gesture input. The method includes detecting a motion of the computing device; matching the detected motion with the input model; and determining a confidence level associated with the match. If the confidence level is above a pre-determined threshold, an action associated with the gesture input is performed automatically by the processor; and if the confidence level is below the pre-determined threshold, the action is performed by the processor responsive to receiving a positive confirmation of the gesture input. | 12-04-2014 |
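The confidence-gated dispatch in entry 20140354527 above reduces to a small branch; the threshold value and the callback-style `action`/`confirm` interface are illustrative assumptions.

```python
def handle_gesture(confidence, action, confirm, threshold=0.8):
    """Dispatch a matched motion gesture based on its match confidence.

    At or above the threshold the action runs automatically; below it,
    the action runs only after `confirm()` returns a positive answer.
    Returns the action's result, or None if the gesture was rejected.
    """
    if confidence >= threshold:
        return action()          # confident match: act automatically
    if confirm():                # uncertain match: ask the user first
        return action()
    return None
```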
20140354528 | UBIQUITOUS NATURAL USER SYSTEM - A system is provided that includes sensor(s) configured to provide sensed input characteristic of a user, the user's environment or the user's interaction with their environment, and including, from a 3D scanner, measurements of points on a surface of an object in the user's environment. A front-end system may receive and process the sensed input including the measurements to identify a known pattern that indicates a significance of the sensed input from which to identify operations of electronic resource(s). The front-end system may form and communicate an input to the electronic resource(s) to cause the resource(s) to perform the operations, including generation of a point cloud from the measurements, and transformation of the point cloud to a 3D model of the object. And the front-end system may receive an output including the 3D model from the resource(s), and communicate the output for display by a display device. | 12-04-2014 |
20140354529 | TRACKING A USER TO SUPPORT TASKS PERFORMED ON COMPLEX-SYSTEM COMPONENTS - A system is provided that includes sensor(s) configured to provide sensed input including measurements of motion and/or orientation of a user during performance of a task to work a complex-system component. The system includes a front-end system configured to process the sensed input including the measurements to identify a known pattern that indicates a significance of the sensed input from which to identify operations of an electronic resource. The front-end system is configured to form and communicate an input to cause the electronic resource to perform the operations and produce an output. The operations include determination of an action of the user during performance of the task, or calculation of a process variable related to performance of the task, from the measurements. And the front-end system is configured to receive the output from the electronic resource, and communicate the output to a display device, audio output device or haptic sensor. | 12-04-2014 |
20140354530 | Method and Apparatus for Program Utilization of Display Area - A method comprising determining that an apparatus is in a partial display physical configuration, receiving an indication of an input that identifies a program to operate in a greater display physical configuration, determining that the apparatus has become configured in the greater display physical configuration, and causing the program to utilize at least part of a display area and at least part of another display area is disclosed. | 12-04-2014 |
20140354531 | GRAPHICAL USER INTERFACE - In one example in accordance with the present disclosure, a computing system is provided. The system comprises a user detection module, a distance detection module, and a presentation module. The user detection module is to detect a user operating the computing system and determine information about the user. The distance detection module is to determine the distance to the user operating the computing system. The presentation module is to generate a graphical user interface based at least on the information about a user operating the computing system and the distance to the user operating the computing system, where the graphical user interface is either a default graphical user interface or a distance graphical user interface. | 12-04-2014 |
20140354532 | MANIPULATION OF VIRTUAL OBJECT IN AUGMENTED REALITY VIA INTENT - A system and method for manipulating a virtual object based on intent is described. A reference identifier from a physical object is captured. The reference identifier is communicated via a network to a remote server. The remote server includes virtual object data associated with the reference identifier. The virtual object data is received at the computing device. The virtual image is displayed in a virtual landscape using the virtual object data. In response to relative movement between the computing device and the physical object caused by a user, the virtual image is modified. Brain activity data of the user is received. A state of the virtual object in the virtual landscape is changed based on the brain activity data. | 12-04-2014 |
20140354533 | TAGGING USING EYE GAZE DETECTION - Various embodiments relating to tagging human subjects in images are provided. In one embodiment, an image including a human subject is presented on a display screen. A dwell location of a tagging user's gaze on the display screen is received. The human subject in the image is recognized as being located at the dwell location. An identification of the human subject is received, and the image is tagged with the identification. | 12-04-2014 |
20140354534 | MANIPULATION OF VIRTUAL OBJECT IN AUGMENTED REALITY VIA THOUGHT - A system and method for manipulating a virtual object based on thought is described. A reference identifier from a physical object is captured. Brain activity data of a user is received to obtain a brain activity data. The reference identifier and the brain activity data are communicated via a network to a remote server. The remote server selects a virtual object based on the reference identifier and the brain activity data. A communication from the server identifying the virtual object is received. The virtual object is displayed in a virtual landscape. | 12-04-2014 |
20140354535 | SYSTEM AND METHOD OF DISPLAY DEVICE - The invention proposes a system and method of a display device. When the distance between a user and the display device is too close or too far, or the angle between the user and the display device slants too much, the display device sends out a command to inform the user to adjust the user's position while watching the display device. A method for the display device comprises the following steps: a detecting device and an operating system calculating a relative position between the user and the display device; the operating system generating a warning command according to the relative position; a warning module receiving the warning command and afterward sending out the command; and the command being sent to a display module, a motion-generating device and an audio device. | 12-04-2014 |
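The position check described in entry 20140354535 above can be sketched as a small classifier over the computed relative position; the distance and angle limits below are illustrative assumptions, not values from the patent.

```python
def viewing_warning(distance_cm, angle_deg, d_min=40, d_max=120, max_angle=30):
    """Classify the user's relative position into a warning command.

    Returns 'too_close', 'too_far', 'too_slanted', or None when the
    viewing position needs no adjustment. All limits are hypothetical.
    """
    if distance_cm < d_min:
        return 'too_close'
    if distance_cm > d_max:
        return 'too_far'
    if abs(angle_deg) > max_angle:
        return 'too_slanted'
    return None
```

The returned label would then be routed to the display module, motion-generating device and audio device as the warning command.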
20140354536 | ELECTRONIC DEVICE AND CONTROL METHOD THEREOF - An electronic device including a projection module and a control method thereof are disclosed. The electronic device includes: a display module; a camera configured to capture a user gesture; the projection module configured to project a virtual control object on a plane within a predetermined distance from the electronic device; and a controller configured to display on the display module a screen associated with operations running on the electronic device, and upon receiving a predetermined user gesture through the camera, project a virtual control object on the plane through the projection module to control at least one of the operations running on the electronic device at the time when the user gesture is received. Accordingly, a virtual control object for controlling an operation running on the electronic device can be projected on a plane. | 12-04-2014 |
20140354537 | APPARATUS AND METHOD FOR PROCESSING USER INPUT USING MOTION OF OBJECT - A user input processing apparatus uses a motion of an object to determine whether to track the motion of the object, and tracks the motion of the object using an input image including information associated with the motion of the object. | 12-04-2014 |
20140354538 | METHOD AND APPARATUS FOR OPERATING NOTIFICATION FUNCTION IN USER DEVICE - A method and an apparatus for operating a reception notification function in a user device are provided. The method includes detecting a face of a user in images collected through a camera when a notification event is detected, and executing a notification function in either a silent mode or a vibration mode when the face of the user is detected in the collected images. | 12-04-2014 |
20140354539 | GAZE-CONTROLLED USER INTERFACE WITH MULTIMODAL INPUT - A personal computer system provides a gaze-controlled graphical user interface having a bidirectional and a unidirectional interaction mode. In the bidirectional interaction mode, a display shows one or more graphical controls in motion, each being associated with an input operation to an operating system. A gaze tracking system provides gaze point data of a viewer, and a matching module attempts to match a relative gaze movement against a relative movement of one of the graphical controls. The system includes a selector which is preferably controllable by a modality other than gaze. The system initiates a transition from the unidirectional interaction mode to the bidirectional interaction mode in response to an input received at the selector. The display then shows graphical controls in motion in a neighbourhood of the current gaze point, as determined based on current gaze data. | 12-04-2014 |
20140354540 | SYSTEMS AND METHODS FOR GESTURE RECOGNITION - According to various aspects, systems and methods are disclosed for the implementation of a touch-free, gesture based user interface for mobile computing systems. Aspects of the system describe components used for capturing digital video frames from a video stream using the native hardware of the mobile computing system. Further aspects of the system describe efficient object recognition and tracking components capable of recognizing the motion of a defined model object. Various aspects of the disclosed system provide systems and processes for associating the recognized motion of a model object with a gesture, as well as associating the gesture with a user interface operation. | 12-04-2014 |
20140354541 | WEBPAGE BROWSING METHOD AND APPARATUS BASED ON PHYSICAL MOTION - The present disclosure, pertaining to the field of computer technologies, discloses a webpage browsing method and apparatus based on physical motion. The method includes: acquiring spatial location information of a mobile terminal in a relatively stationary state; upon detecting a motion change of the mobile terminal, acquiring spatial location information and motion information of the mobile terminal experiencing the motion change; if motion information of the mobile terminal is greater than a predetermined threshold, comparing the spatial location information of the mobile terminal in the relatively stationary state with the spatial location information of the mobile terminal experiencing the motion change to acquire a spatial location offset after the motion change; and performing a corresponding browse operation on a webpage on the mobile terminal according to a mapping relationship between the spatial location offset and a webpage operation. | 12-04-2014 |
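The mapping from spatial location offset to webpage operation in entry 20140354541 above might be sketched as follows; the dominant-axis rule, the operation names, and the threshold are illustrative assumptions.

```python
def browse_action(offset, threshold=30.0):
    """Map a terminal's spatial offset (dx, dy) to a webpage operation.

    Motion below the threshold (the 'predetermined threshold' of the
    abstract) is ignored; otherwise the dominant axis selects between
    scrolling and paging. Units and labels are hypothetical.
    """
    dx, dy = offset
    if max(abs(dx), abs(dy)) < threshold:
        return 'none'            # motion too small to count as an operation
    if abs(dy) >= abs(dx):
        return 'scroll_down' if dy > 0 else 'scroll_up'
    return 'next_page' if dx > 0 else 'previous_page'
```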
20140361971 | VISUAL ENHANCEMENTS BASED ON EYE TRACKING - Embodiments are disclosed that relate to selectively enhancing content displayed on a display device. For example, one disclosed embodiment provides a method comprising tracking a gaze location at which a gaze of a user intersects the graphical user interface, and upon detecting a gaze pattern that meets a defined gaze pattern condition indicative of difficulty in perceiving an item at the gaze location, displaying the item at the gaze location in a visually enhanced form via a user interface object displayed separately from the item at the gaze location. | 12-11-2014 |
20140361972 | SYSTEM AND METHOD FOR VOLUMETRIC COMPUTING - A system and method are provided for analyzing the movement of an object. The system comprises a sensor coupled to a processor that is configured to generate a volume that corresponds to a predefined area and the location of the object. The sensor detects the movement of the object and determines whether the object's movement is within the volume. If the movement is within the volume, then an information channel is created between the volume and the user-defined algorithm. The movement is then compared with the requirements of the user-defined algorithm and accepted if the movement meets the requirements. Otherwise, the object's movement is discarded if the movement is not within the volume or does not meet the requirements of the user-defined algorithm. | 12-11-2014 |
20140361973 | SYSTEM AND METHOD FOR MULTIMODAL HUMAN-VEHICLE INTERACTION AND BELIEF TRACKING - A method and system for multimodal human-vehicle interaction including receiving input from an occupant in a vehicle via more than one mode and performing multimodal recognition of the input. The method also includes augmenting at least one recognition hypothesis based on at least one visual point of interest and determining a belief state of the occupant's intent based on the recognition hypothesis. The method further includes selecting an action to take based on the determined belief state. | 12-11-2014 |
20140361974 | KARAOKE AVATAR ANIMATION BASED ON FACIAL MOTION DATA - Apparatus, systems, media and/or methods may involve animating avatars. User facial motion data may be extracted that corresponds to one or more user facial gestures observed by an image capture device when a user emulates a source object. An avatar animation may be provided based on the user facial motion data. Also, script data may be provided to the user and/or the user facial motion data may be extracted when the user utilizes the script data. Moreover, audio may be captured and/or converted to a predetermined tone. Source facial motion data may be extracted and/or an avatar animation may be provided based on the source facial motion data. A degree of match may be determined between the user facial motion data of a plurality of users and the source facial motion data. The user may select an avatar as a user avatar and/or a source object avatar. | 12-11-2014 |
20140361975 | MANAGEMENT OF INPUT METHODS - Embodiments of a system and method are disclosed concerning the management of a plurality of input methods on a computer. An input method may define how input from a human interface device is interpreted by a receiving application. The method may include locating one or more input method factors. The method may also include determining a numerical score of the input method factor using a metric of the input method factor and a weight of the factor. The method may also include ranking an input method factor using the determined numerical score of the input method factor. The method may also include presenting a user selection interface that lists the plurality of input methods as a function of the ranking. | 12-11-2014 |
20140361976 | SWITCHING MODE OF OPERATION IN A HEAD MOUNTED DISPLAY - Methods, systems, and computer programs are presented for managing the display of images on a head mounted device (HMD). One method includes an operation for tracking the gaze of a user wearing the HMD, where the HMD is displaying a scene of a virtual world. In addition, the method includes an operation for detecting that the gaze of the user is fixed on a predetermined area for a predetermined amount of time. In response to the detecting, the method fades out a region of the display in the HMD, while maintaining the scene of the virtual world in an area of the display outside the region. Additionally, the method includes an operation for fading in a view of the real world in the region as if the HMD were transparent to the user while the user is looking through the region. The fading in of the view of the real world includes maintaining the scene of the virtual world outside the region. | 12-11-2014 |
20140361977 | IMAGE RENDERING RESPONSIVE TO USER ACTIONS IN HEAD MOUNTED DISPLAY - Methods, systems, and computer programs are presented for rendering images on a head mounted display (HMD). One method includes operations for tracking, with one or more first cameras inside the HMD, the gaze of a user and for tracking motion of the HMD. The motion of the HMD is tracked by analyzing images of the HMD taken with a second camera that is not in the HMD. Further, the method includes an operation for predicting the motion of the gaze of the user based on the gaze and the motion of the HMD. Rendering policies for a plurality of regions, defined on a view rendered by the HMD, are determined based on the predicted motion of the gaze. The images are rendered on the view based on the rendering policies. | 12-11-2014 |
20140361978 | PORTABLE COMPUTER MONITORING - Disclosed is a method of predicting the condition of a portable computer comprising a motion sensor on a computer system, the method comprising collecting motion data from said motion sensor; periodically sending said collected motion data from the portable computer to the computer system; evaluating said motion data on the computer system; and predicting said condition from the evaluated motion data. Computer program products, a portable computer, and a system for implementing aspects of the method are also disclosed. | 12-11-2014 |
20140361979 | GRIP SENSOR DEVICE AND GRIP SENSING METHOD - A grip sensor device and a grip sensing method are provided. The grip sensor device includes an antenna that is formed of metal within a mobile electronic device and communicates a signal of a first frequency and a grip sensor module that is formed within the mobile electronic device, is electrically connected to the antenna, and outputs a proximity detection signal according to proximity of an external object to the antenna. | 12-11-2014 |
20140361980 | INFORMATION PROCESSOR AND PROGRAM - A novel bendable and highly portable information processor is provided. In addition, a novel information processor capable of displaying information or the like on a seamless large screen is provided. A novel information processor in which one display region can be divided into two regions at a bend position is provided. A novel information processor in which different images or images for different purposes can be displayed in the respective regions is provided. The present inventors have conceived a program including a step of dividing the display region and displaying image data in the respective regions when a display unit of the information processor is bent. | 12-11-2014 |
20140361981 | INFORMATION PROCESSING APPARATUS AND METHOD THEREOF - An image obtained by capturing a gesture input region is acquired, and an object that makes a gesture is detected from the image. An intersection position at which the detected object crosses a determination region used to determine the position of the object with respect to the gesture input region is detected from the image. The base position of the gesture made by the object is computed based on the intersection position. The position of a target of manipulation by the gesture is determined as a manipulation target position. A gesture coordinate system different from the coordinate system of the image is determined based on the base position and the manipulation target position. | 12-11-2014 |
20140361982 | PROXY GESTURE RECOGNIZER - An electronic device displays one or more views. A first view includes a plurality of gesture recognizers. The plurality of gesture recognizers in the first view includes one or more proxy gesture recognizers and one or more non-proxy gesture recognizers. Each gesture recognizer indicates one of a plurality of predefined states. A first proxy gesture recognizer in the first view indicates a state that corresponds to a state of a respective non-proxy gesture recognizer that is not in the first view. The device delivers a respective sub-event to the respective non-proxy gesture recognizer that is not in the first view and at least a subset of the one or more non-proxy gesture recognizers in the first view. The device processes the respective sub-event in accordance with states of the first proxy gesture recognizer and at least the subset of the one or more non-proxy gesture recognizers in the first view. | 12-11-2014 |
20140361983 | REAL-TIME STROKE-ORDER AND STROKE-DIRECTION INDEPENDENT HANDWRITING RECOGNITION - Methods, systems, and computer-readable media related to a technique for providing handwriting input functionality on a user device. A handwriting recognition module is trained to have a repertoire comprising multiple non-overlapping scripts and capable of recognizing tens of thousands of characters using a single handwriting recognition model. The handwriting input module provides real-time, stroke-order and stroke-direction independent handwriting recognition for multi-character handwriting input. In particular, real-time, stroke-order and stroke-direction independent handwriting recognition is provided for multi-character, or sentence level Chinese handwriting recognition. User interfaces for providing the handwriting input functionality are also disclosed. | 12-11-2014 |
20140361984 | VISIBILITY IMPROVEMENT METHOD BASED ON EYE TRACKING, MACHINE-READABLE STORAGE MEDIUM AND ELECTRONIC DEVICE - A visibility improvement method using gaze tracking includes detecting a gaze of a user using a camera; determining a focus object at which the user gazes from among at least one object viewed through a display unit of an electronic device; and displaying, on the display unit, an image with high visibility that corresponds to the focus object and has higher visibility than the focus object. | 12-11-2014 |
20140361985 | INFORMATION PROCESSING APPARATUS CAPABLE OF RECOGNIZING USER OPERATION AND METHOD FOR CONTROLLING THE SAME - An information processing apparatus according to the present invention can improve the operability when a user performs a gesture operation relating to a target object. | 12-11-2014 |
20140361986 | ELECTRONIC DEVICE AND METHOD FOR CONTROLLING THE ELECTRONIC DEVICE BASED ON IMAGE DATA DETECTED THROUGH A PLURALITY OF CAMERAS - An electronic device including a first camera and a second camera and method for controlling the electronic device are provided. The method includes detecting a user through the first camera; detecting the user through the second camera; and performing a preset function in the electronic device, based on a sequence of the detecting of the user through the first camera and the second camera. | 12-11-2014 |
20140361987 | Eye controls - A head mountable display (HMD) system in which images are generated for display to the user comprises an eyes-shut detector configured to detect a user's eye or eyes being shut; and an image generator configured to generate images for display by the HMD, the image generator being configured to carry out actions in response, at least in part, to a detection of whether the HMD wearer has one or both eyes shut for at least a first predetermined period, the image generator executing different respective actions in dependence upon: (i) a detection that the user's eyes are shut for at least the first predetermined period but less than a second predetermined period, the second predetermined period being longer than the first predetermined period; and (ii) a detection that the user's eyes are shut for at least the second predetermined period. | 12-11-2014 |
20140361988 | Touch Free Interface for Augmented Reality Systems - A method and system for augmented reality. Images of a real world scene are obtained from one or more image sensors. An orientation and/or location of the image sensors is obtained from one or more state sensors. A real world object at which a predefined pointing object is performing a predefined gesture in the images of the real world scene is identified and data associated with the identified object is displayed on a viewing device. The invention also provides a computer program comprising computer program code means for performing all the steps of the method of the invention when said program is run on a computer. | 12-11-2014 |
20140361989 | Method and Device for Operating Functions in a Vehicle Using Gestures Performed in Three-Dimensional Space, and Related Computer Program Product - A method to control functions in a vehicle using gestures carried out in three-dimensional space. When a first gesture carried out in three-dimensional space is detected using an image-based detection procedure, it is determined whether the first gesture is a gesture allocated to an activation of an operation of a function. If it is determined that the detected first gesture is the gesture allocated to the activation of the function, then the function is activated. Next, a second gesture carried out in three-dimensional space is detected using the image-based detection procedure and it is determined whether the detected second gesture is a gesture allocated to the operation of the function. If it is determined that the second gesture has been detected, an operation of the function is performed. | 12-11-2014 |
20140361990 | DISPLAY WITH OBSERVER TRACKING - The invention relates to a display, in particular an autostereoscopic or holographic display, for representing preferably three-dimensional information, wherein the stereo views or the reconstructions of the holographically encoded objects can be tracked to the movements of the associated eyes of one or more observers in a finely stepped manner within a plurality of zones of the movement region. In this case, the zones are selected by the activation of switchable polarization gratings. | 12-11-2014 |
20140368421 | AUTOMATICALLY SWITCHING BETWEEN INPUT MODES FOR A USER INTERFACE - A system and machine-implemented method for automatically switching between input modes for a user interface. A user interface is displayed in a first input mode for a first input type. A determination is made that a predetermined event has occurred, wherein the predetermined event is of a type other than the first input type or a second input type. In response to the determination, the user interface is switched from the first input mode for the first input type to a second input mode for the second input type. | 12-18-2014 |
20140368422 | SYSTEMS AND METHODS FOR PERFORMING A DEVICE ACTION BASED ON A DETECTED GESTURE - Systems and methods for performing an action based on a detected gesture are provided. The systems and methods provided herein may detect a direction of an initial touchless gesture and process subsequent touchless gestures based on the direction of the initial touchless gesture. The systems and methods may translate a coordinate system related to a user device and a gesture library based on the detected direction such that subsequent touchless gestures may be processed based on the detected direction. The systems and methods may allow a user to make a touchless gesture over a device to interact with the device independent of the orientation of the device since the direction of the initial gesture can set the coordinate system or context for subsequent gesture detection. | 12-18-2014 |
20140368423 | METHOD AND SYSTEM FOR LOW POWER GESTURE RECOGNITION FOR WAKING UP MOBILE DEVICES - Embodiments of the present invention provide a novel solution which leverages peripheral resources used during the performance of system wake events to detect the presence of gesture input provided by a user during power saving operations (e.g., sleep modes). During the occurrence of a system wake event, embodiments of the present invention utilize proximity detection capabilities of the mobile device to determine if a user is within a detectable distance of the device to provide possible gesture input. When a positive detection comes in, embodiments of the present invention may use the light intensity (e.g., brightness level) measuring capabilities of the mobile device to further determine whether the user is attempting to engage the device to provide gesture input or if the device was unintentionally engaged. Once determinations are made that a user is waiting to engage the gesture recognition capabilities of the mobile device, embodiments of the present invention rapidly activate the gesture recognition engine (e.g., gesture sensor) and may coincidentally notify the user (e.g., using LED notification) that the device is ready to accept gesture input from the user. | 12-18-2014 |
20140368424 | PRESENTATION DEVICE AND METHOD FOR OPERATING THE DEVICE - A presentation device and a method of operating the presentation device may quickly move to a desired area on a map by adjusting a level of an electromyographic (EMG) signal even though a range of a motion of an arm is limited when searching the map using a wearable mobile device. The presentation device includes a first sensor unit configured to contact a measurement target and to obtain an EMG signal from the measurement target; and a display controller configured to control a field of view (FOV) window for displaying a map area based on the EMG signal. | 12-18-2014 |
20140368425 | ADJUSTING A TRANSPARENT DISPLAY WITH AN IMAGE CAPTURING DEVICE - A system and a method for adjusting a transparent display with an image capturing device are provided. The system includes a command module to instigate a monitoring of the multi-layer transparent display with the image capturing device; an image receiving module to receive a first image from the image capturing device and a second image from the image capturing device after a predetermined interval; an image processing module to process the received first image and the second image; and a display driving module to re-render the multi-layer transparent display based on the processing of the received first image and the second image. | 12-18-2014 |
20140368426 | IMAGE PROCESSING SYSTEM, IMAGE PROCESSING APPARATUS, STORAGE MEDIUM HAVING STORED THEREIN IMAGE PROCESSING PROGRAM, AND IMAGE PROCESSING METHOD - A marker image that, if included in a captured image captured by an image capturing apparatus, performs image processing on the captured image and is thereby allowed to cause a predetermined virtual object to appear is placed in a virtual space. Then, a viewpoint for displaying a part of the virtual space on a display apparatus is changed in accordance with a user input, and the virtual space viewed from the viewpoint is displayed on the display apparatus. | 12-18-2014 |
20140368427 | Controlling Apparatus For Physical Effects In Cyberspace And Method Thereof - Provided are a controlling apparatus for physical effects and a method thereof. After setting up a mapping relationship between at least one marker attached to a human body and an at least one effect output unit, a three-dimensional coordinate of the marker is recognized to be mapped onto cyberspace, and upon entry of the marker into a pre-set area within cyberspace, a physical effect output command is transmitted to the at least one effect output unit. | 12-18-2014 |
20140368428 | System and Method For Enhanced Gesture-Based Interaction - Described herein is a wireless remote control device ( | 12-18-2014 |
20140368429 | DEVICE FOR GESTURAL CONTROL OF A SYSTEM, AND ASSOCIATED METHOD - A device (DISP) for gestural control of a system (SYST) includes a grippable mobile control element (TC), a movement sensor assembly (EC) for measuring movements of the mobile control element (TC), and a processing circuit (DET) for detecting the rotation or translation of the mobile control element in relation to an axis that is substantially invariant over a time window. The device (DISP) also includes a circuit (REG) for adjusting the value of at least one parameter of the system, and a circuit (CMD) for controlling activation/deactivation of the adjustment upon detection by the detection circuit (DET) of a rotation and/or translation of the mobile control element in relation to an axis that is substantially invariant over a time window. | 12-18-2014 |
20140368430 | TOUCH STAMP FOR PORTABLE TERMINAL EMPLOYING TOUCHSCREEN, AND AUTHENTICATION SYSTEM AND METHOD USING THE SAME - The present invention relates to a method for performing an authentication using a touch stamp which can perform a touch input operation for inputting a pattern that may not be easily directly inputted by a human to a portable terminal which supports multiple touches. The touch stamp of the present invention comprises: a body having one flat end in which one or more through holes are formed; and one or more touch protrusions accommodated in the body and selectively protruding outwardly from said one end of the body through the through holes according to a touch pattern having at least one of a preset time pattern and a preset location pattern. The present invention has advantages of enabling a touch input operation for inputting a fine pattern on a touch screen of a portable terminal. | 12-18-2014 |
20140368431 | Object Information Derived From Object Images - Search terms are derived automatically from images captured by a camera equipped cell phone, PDA, or other image capturing device, submitted to a search engine to obtain information of interest, and at least a portion of the resulting information is transmitted back locally to, or nearby, the device that captured the image. | 12-18-2014 |
20140375538 | DISPLAY APPARATUS INCORPORATING CONSTRAINED LIGHT ABSORBING LAYERS - This disclosure provides systems, methods and apparatus for modulating light for a display. The system includes a light blocking layer including a reflective layer and a light absorbing layer. The light blocking layer is configured such that any conductive components therein underlie or cover less than a majority of the circuitry controlling the display elements incorporated into the display. | 12-25-2014 |
20140375539 | Method and Apparatus for a Virtual Keyboard Plane - Two cameras forming a 3-D camera system are used to project the key pattern of the plane from the displayed keyboard of a small display of a smartphone to that of a larger initialization plane of a virtual keyboard which is displaced in a parallel plane and increased in size from the displayed keyboard plane. The displayed keyboard is located on the display screen of the smartphone. The angular variation of the finger's position from the displayed keyboard plane based on the camera image indicates the keyboard character being depressed. Plenoptic cameras can also be used, and increasing their displacement distance, or baseline, is advantageous. Highlighting the keys on the displayed keyboard when the fingers are in the initialization plane or activation plane of the virtual keyboard, either by color, shading, or any other visual means, provides positive feedback to the user. | 12-25-2014 |
20140375540 | SYSTEM FOR OPTIMAL EYE FIT OF HEADSET DISPLAY DEVICE - A system and method are disclosed for sensing a position and/or angular orientation of a head mounted display device with respect to a wearer's eyes, and for providing feedback for adjusting the position and/or angular orientation of the head mounted display device so as to be optimally centered and oriented with respect to the wearer's eyes. | 12-25-2014 |
20140375541 | EYE TRACKING VIA DEPTH CAMERA - Embodiments are disclosed that relate to tracking a user's eye based on time-of-flight depth image data of the user's eye. For example, one disclosed embodiment provides an eye tracking system comprising a light source, a sensing subsystem configured to obtain a two-dimensional image of a user's eye and depth data of the user's eye using a depth sensor having an unconstrained baseline distance, and a logic subsystem configured to control the light source to emit light, control the sensing subsystem to acquire a two-dimensional image of the user's eye while illuminating the light source, control the sensing subsystem to acquire depth data of the user's eye, determine a gaze direction of the user's eye from the two-dimensional image, determine a location on a display at which the gaze direction intersects the display based on the gaze direction and the depth data, and output the location. | 12-25-2014 |
20140375542 | ADJUSTING A NEAR-EYE DISPLAY DEVICE - Embodiments are disclosed herein that relate to aligning a near-eye display of a near-eye display device with an eye of a user. For example, one disclosed embodiment provides, on a near-eye display device, a method comprising receiving an image of an eye from a camera via a reverse display optical path, detecting a location of the eye in the image, and determining a relative position of the eye with regard to a target viewing position of the near-eye display. The method further comprises determining an adjustment to make to the near-eye display device to align the location of the eye with the target viewing position. | 12-25-2014 |
20140375543 | SHARED COGNITION - A system includes at least one sensor, at least one display, and a computing device coupled to the at least one sensor and the at least one display. The computing device includes a processor, and a computer-readable storage media having computer-executable instructions embodied thereon. When executed by at least one processor, the computer-executable instructions cause the processor to receive information from at least a first occupant, identify an object based at least partially on the received information, and present, on the at least one display, a first image associated with the object to a second occupant. The first image is aligned substantially between an eye position of the second occupant and the object such that the at least one display appears to one of overlay the first image over the object and position the first image adjacent to the object with respect to the eye position of the second occupant. | 12-25-2014 |
20140375544 | USER INTERFACE NAVIGATION - Embodiments that relate to navigating a hierarchy of visual elements are disclosed. In one embodiment a method includes presenting one or more visual elements from a two-dimensional plane via a display device. A home location within a viewable region of the display is established. A proportional size relationship between each element and each of the other elements is established. Using gaze tracking data, a gaze location at which a user is gazing within the viewable region is determined. The gaze location is mapped to a target location, and movement of the target location toward the home location is initiated. As the target location moves closer to the home location, each of the visual elements is progressively enlarged while the proportional size relationship between each of the visual elements and each of the other visual elements is also maintained. | 12-25-2014 |
20140375545 | ADAPTIVE EVENT RECOGNITION - A system and related methods for adaptive event recognition are provided. In one example, a selected sensor of a head-mounted display device is operated at a first polling rate corresponding to a higher potential latency. Initial user-related information is received. Where the initial user-related information matches a pre-event, the selected sensor is operated at a second polling rate faster than the first polling rate and corresponding to a lower potential latency. Subsequent user-related information is received. Where the subsequent user-related information matches a selected target event, feedback associated with the selected target event is provided to the user via the head-mounted display device. | 12-25-2014 |
20140375546 | STORAGE MEDIUM HAVING STORED THEREIN INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING SYSTEM, AND METHOD OF CALCULATING DESIGNATED POSITION - An example of an information processing apparatus calculates a first position using imaging information based on an image captured by an imaging device. The information processing apparatus calculates a second position using a detection result of a sensor. The information processing apparatus repeatedly determines whether or not a predetermined condition is satisfied, the predetermined condition including a condition regarding a relationship between the first position and the second position. A designated position on a predetermined plane is calculated using the first position and/or the second position. In a satisfaction period after the predetermined condition has been satisfied, the information processing apparatus calculates the designated position by making an influence of the second position relatively great as compared to an unsatisfaction period before the predetermined condition is satisfied. | 12-25-2014 |
20140375547 | TOUCH FREE USER INTERFACE - The presently disclosed subject matter includes a method of recognizing an aimed point on a plane. Images captured by one or more image sensors are processed to obtain data indicative of the location of at least one pointing element in the viewing space and data indicative of at least one predefined user's body part in the viewing space; using the obtained data, an aimed point on the plane is identified. In case it is determined that a predefined condition is met, a predefined command and/or message is executed. | 12-25-2014 |
20140375548 | METHOD OF PROCESSING INFORMATION AND ELECTRONIC DEVICE - This application discloses a method of processing information and two electronic devices. The method comprises: obtaining first gesture information of a first body when the first body is plugged into a second body, wherein the first gesture information is used to indicate a first direction of the first body; obtaining second gesture information of the second body, wherein the second gesture information is used to indicate a second direction of the second body; and determining, based on the first gesture information and the second gesture information, a plugging mode in which the first body is plugged into the second body, to obtain a first determination result, wherein the plugging mode is a normal plugging mode for indicating that a first front face is opposite to a second front face after the first body is plugged into the second body. | 12-25-2014 |
20140375549 | Method and Apparatuses for Determining a User Attention Level Using Facial Orientation Detection - Various methods are provided for facilitating the determination of a user attention level using facial orientation detection. One example method may include receiving an image of a user after an expiration of an inactivity timer on the user terminal. The current attention level of the user is determined based on the received image. The computing device lock condition is then set based on the determined current attention level of the user. Similar and related example apparatuses and example computer program products are also provided. | 12-25-2014 |
20140375550 | GENERATING AN OPERATING PROCEDURE MANUAL - A device generates an operating procedure manual for software including a captured image of a screen displayed by the software. An image acquiring hardware unit acquires a plurality of captured images of a plurality of screens displayed by software in response to a plurality of operations with respect to the software. A dividing hardware unit divides the plurality of captured images into a plurality of captured image groups, to each of which at least one captured image acquired in response to at least one operation constituting a meaningful chunk belongs. A generating hardware unit generates an operating procedure manual including, for each captured image group, a captured image belonging to that captured image group. | 12-25-2014 |
20140375551 | PORTABLE DISPLAY DEVICE AND OPERATION DETECTING METHOD - A portable display device includes: an acceleration sensor; an outputting section outputting a detection signal when a measurement value changes across a reference value; a calculating section calculating a duration time in which the measurement value exceeds the reference value; a judging section judging whether the movement state is due to running based on an occurrence frequency of the movement state and the duration time; a setting section setting a running state when the movement state is due to the running; a calculating section calculating a time interval between the movement states; a continuous tap judging section judging whether the movement states are due to continuous tap operations according to a judgment condition based on the duration time and time interval; and a setting range changing section setting a range of the judgment condition in the running state to be broader than that in a non-running state. | 12-25-2014 |
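The continuous-tap judgment in 20140375551 reduces to two threshold checks, with the accepted range broadened while the user is running. The millisecond values below are invented for illustration; the patent specifies only that the running-state range is broader:

```python
def is_continuous_tap(duration_ms, interval_ms, running):
    """Judge whether two acceleration spikes form a deliberate double tap.

    duration_ms: how long the measurement value exceeded the reference value.
    interval_ms: time between the two movement states.
    running:     True when the device has been set to the running state.
    Thresholds are illustrative, not taken from the patent.
    """
    max_duration = 150 if running else 100   # broader band while running
    max_interval = 600 if running else 400
    return duration_ms <= max_duration and interval_ms <= max_interval
```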
20140375552 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND STORAGE MEDIUM - There is provided an information processing apparatus including: a determining unit that determines that an apparatus worn on a user is in a reference state when an angle of a display unit of the apparatus with respect to a predetermined direction is within a range of a threshold value; a recognizing unit that recognizes a motion of the apparatus from the reference state when the determining unit determines that the apparatus is in the reference state; an input unit that receives an input operation corresponding to the motion of the apparatus from the reference state recognized by the recognizing unit; a time measuring unit that measures a time; a detecting unit that detects a motion state of the apparatus; and a setting unit that sets the threshold value according to the time measured and the motion state of the apparatus detected by the detecting unit. | 12-25-2014 |
20140375553 | Method and device for determining gestures in the beam region of a projector - A method for determining gestures in the beam region of a projector includes: projecting an image onto a surface with the aid of the projector, using light; measuring a first set of light intensities of light backscattered from the direction of the surface, under the influence of a gesture made in the beam region of the projector; assigning the measured light intensities to, in each instance, a position of the image projected by the projector; at a first time point, generating a first light intensity function over a second set of positions, which are assigned measured light intensities; at a second time point, generating at least one second light intensity function over the second set of positions; and determining the gesture made, based on the result of a comparison between the first and second light intensity functions. | 12-25-2014 |
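The comparison between the two light-intensity functions in 20140375553 can be sketched as a per-position difference test. Representing each function as a dict from image position to measured intensity, and the change threshold, are assumptions for illustration:

```python
def changed_positions(intensity_t1, intensity_t2, threshold=10.0):
    """Compare two light-intensity functions sampled over the same set of
    image positions and return the positions whose backscattered intensity
    changed by more than `threshold` between the two time points.

    intensity_t1 / intensity_t2: dicts mapping position -> intensity.
    """
    return [p for p in intensity_t1
            if abs(intensity_t2[p] - intensity_t1[p]) > threshold]
```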
20140375554 | METHOD AND SYSTEM FOR PRESENTING AT LEAST ONE IMAGE OF AT LEAST ONE APPLICATION ON A DISPLAY DEVICE - The present invention comprises a method for presenting at least one image of at least one application on a display device, comprising the following steps: a) ( | 12-25-2014 |
20140375555 | WORK SUPPORT SYSTEM, WORK SUPPORT APPARATUS, WORK SUPPORT METHOD, AND COMPUTER READABLE STORAGE MEDIUM - An imaging unit ( | 12-25-2014 |
20140375556 | POSITION DETECTING SYSTEM AND POSITION DETECTING PROGRAM - A plurality of positional information transmitters, each of which transmits positional information uniquely assigned to one of a plurality of unit spaces, are disposed on a ceiling. A mobile terminal receives the positional information transmitted from each of the positional information transmitters and changes a terminal-side image in accordance with the received positional information. Furthermore, the mobile terminal transmits to a server current positional information and operation information indicating an instruction inputted by a user. The server changes, in accordance with the current positional information and the operation information received from the mobile terminal, a floor image displayed on a plurality of floor displays disposed on a floor. Thus, it becomes possible to provide a highly interesting and novel position detecting system. | 12-25-2014 |
20140375557 | HUMAN TRACKING SYSTEM - An image such as a depth image of a scene may be received, observed, or captured by a device. A grid of voxels may then be generated based on the depth image such that the depth image may be downsampled. A background included in the grid of voxels may also be removed to isolate one or more voxels associated with a foreground object such as a human target. A location or position of one or more extremities of the isolated human target may be determined and a model may be adjusted based on the location or position of the one or more extremities. | 12-25-2014 |
20140375558 | SYSTEMS AND METHODS FOR AUTOMATICALLY ADJUSTING AUDIO BASED ON GAZE POINT - Embodiments provide methods and systems for adjusting audio output based on eye tracking input. In some embodiments, a memory stores data defining a boundary based on a coordinate system. The boundary corresponds to a display element of displayed content. An input receives data indicating coordinates of a gaze point location of a user viewing the displayed content. A processor compares the received coordinates of the gaze point location to the boundary corresponding to the display element to determine whether the gaze point location is inside the boundary corresponding to the display element. In response to determining that the gaze point location is inside the boundary corresponding to the display element, the processor adjusts an audio setting of the displayed content. | 12-25-2014 |
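The boundary test driving the audio adjustment in 20140375558 is a point-in-rectangle check. The boost/duck levels and the tuple layout of the boundary below are illustrative; the patent says only that an audio setting is adjusted when the gaze point is inside the boundary:

```python
def gaze_adjusted_volume(gaze, boundary, inside_level=1.0, outside_level=0.2):
    """Return the audio level for a display element given the gaze point.

    boundary: (x0, y0, x1, y1) bounding box of the element, in the same
    coordinate system as the gaze point. Levels are illustrative.
    """
    x0, y0, x1, y1 = boundary
    gx, gy = gaze
    inside = x0 <= gx <= x1 and y0 <= gy <= y1
    return inside_level if inside else outside_level
```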
20140375559 | DISPLAY DEVICE AND DISPLAY METHOD THAT DETERMINES INTENTION OR STATUS OF A USER - The present invention provides a display apparatus and a display method that control display operations in a manner precisely reflecting the user's status, i.e., the user's intentions, visual state, and physical conditions. Worn as an eyeglass-like or head-mounted wearable unit, for example, the display apparatus of the present invention enables the user to visibly recognize various images on the display unit positioned in front of the user's eyes, thereby providing picked-up images, reproduced images, and received images. To control various display operations, such as switching between the display state and the see-through state, changing the display operation mode, and selecting sources, the display apparatus of the present invention acquires information about either the behavior or physical status of the user and determines either the intention or status of the user in accordance with the acquired information, thereby controlling the display operation appropriately on the basis of the determination result. | 12-25-2014 |
20150009116 | FACILITATING GESTURE-BASED ASSOCIATION OF MULTIPLE DEVICES - Gesture-based association of multiple devices may be facilitated, which may effectuate presentation of content related to associated devices. Gesture detection may reduce or completely eliminate any ambiguity of whether the association is wanted or not. Some implementations may be employed in a theme park in which park guests wear an RFID wristband. During a park visit, when merchandise goods with embedded RFIDs are purchased (e.g., toys and/or other merchandise), the merchandise may be associated with the wristband and, therefore, the park guest wearing the wristband. Such association may give the theme park environment and/or the toy itself a mechanism to adapt their behavior depending on which toy the park guest is carrying. This may contribute to richer storytelling and personalized experiences, which may be realized by using location information, knowledge about the park guest's preferences or previous park visits, favorite attractions, age, language-specific information, and/or other information. | 01-08-2015 |
20150009117 | DYNAMIC EYE TRACKING DATA REPRESENTATION - Disclosed are a system, method, and article of manufacture of a dynamic eye-tracking data representation. A first eye-tracking data of a first environmental attribute of a mobile device can be obtained. A second eye-tracking data of a second environmental attribute of the mobile device can be obtained. A tag cloud comprising a first component that describes a first relationship between the first eye-tracking data and the first environmental attribute and a second component that describes a second relationship between the second eye-tracking data and the second environmental attribute can be generated. | 01-08-2015 |
20150009118 | INTELLIGENT PAGE TURNER AND SCROLLER - Provided is a method for changing an image on a display. The method, in one embodiment, includes providing a first image on a display. The method, in this embodiment, further includes tracking a movement of a user's facial feature as it relates to the first image on the display, and generating a command to provide a second different image on the display based upon the tracking. | 01-08-2015 |
20150009119 | Built-in design of camera system for imaging and gesture processing applications - Systems and methods are disclosed for enabling a user to interact through gestures, in a natural way, with image(s) displayed on the surface of an integrated monitor whose display contents are governed by an appliance, perhaps a PC, smart phone, or tablet. Some embodiments include the display as well as the appliance in a single package, such as all-in-one computers. User interaction includes gestures that may occur within a three-dimensional hover zone spaced apart from the display surface. | 01-08-2015 |
20150009120 | GESTURE-SENSITIVE DISPLAY - Disclosed are a system and method for detecting a gesture performed by a user of a device. The device includes a screen having a backlight as with a liquid-crystal type display or which provides its own illumination as with a light-emitting diode type display. The device is programmed to emit a detectable optical signal from one or more distinct zones of the display. The device further includes an optical receiver for detecting any reflections of the emitted detectable optical signal. When a user's hand is located in proximity to the device display, the reflections of the detectable optical signal from that appendage are detected by the optical receiver and are used by the device to determine the presence and direction of travel of the user hand, signifying a user gesture. The distinct zones of the backlight may consist of a single zone, and the optical receiver may comprise multiple receivers. | 01-08-2015 |
20150009121 | DEVICE AND METHOD OF AUTOMATIC DISPLAY BRIGHTNESS CONTROL - A device and a method of automatic display brightness control are provided. Human fatigue index and environmental factors for dry eyes are combined as an indicator to dynamically adjust the display brightness. Heart rate variability (HRV) is used as the human fatigue index, and the environmental CO2 concentration is used as the environmental factor for dry eyes. | 01-08-2015 |
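The combined fatigue/CO2 indicator in 20150009121 might be sketched as a weighted sum that dims the display as the indicator rises. The weights, the CO2 normalization range, and the linear dimming curve are all invented for illustration; the abstract states only that the two factors are combined:

```python
def target_brightness(fatigue_index, co2_ppm, base=0.8):
    """Dim the display as a combined fatigue/dry-eye indicator rises.

    fatigue_index: normalized HRV-derived fatigue, 0 (fresh) to 1 (tired).
    co2_ppm:       ambient CO2 concentration in parts per million.
    Weights, the 400-2000 ppm normalization, and the curve are assumptions.
    """
    co2_factor = min(max((co2_ppm - 400) / 1600, 0.0), 1.0)
    indicator = 0.5 * fatigue_index + 0.5 * co2_factor
    return round(base * (1.0 - 0.5 * indicator), 3)
```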
20150009122 | ELECTRONIC DEVICE HAVING DISPLAY DEVICE FOR SYNC BRIGHTNESS CONTROL AND OPERATING METHOD THEREOF - An electronic device having a display device for sync brightness control includes a brightness key, an embedded controller, a storage unit, a processing unit, a peripheral controlling circuit, and a BIOS. The storage unit stores an operating system and a brightness control application. The processing unit executes the brightness control application in the operating system. The peripheral controlling circuit receives a request from the brightness key. The BIOS acquires the request from the peripheral controlling circuit and sends an event to notify a brightness user interface of the operating system, so that the brightness user interface notifies a driver in the operating system and sends a brightness adjusting command to the brightness control application according to the event. The brightness control application notifies the embedded controller to adjust the brightness of the display device according to the brightness adjusting command. | 01-08-2015 |
20150009123 | DISPLAY APPARATUS AND CONTROL METHOD FOR ADJUSTING THE EYES OF A PHOTOGRAPHED USER - A display apparatus is provided. The display apparatus includes a photographing unit configured to photograph the shape of a face; a detector configured to detect a direction and angle of the face shape; a transformer configured to mix the photographed face shape and a 3D face model and to transform the mixed face shape by using the detected direction and angle of the face shape; and an output interface configured to output the transformed face shape. | 01-08-2015 |
20150009124 | GESTURE BASED USER INTERFACE - A system and method for recognition of hand gestures in computing devices. The system recognizes a hand of a user by identifying a predefined first gesture and further collects visual information related to the hand identified on the basis of the first predefined gesture. The visual information is used to extract a second gesture (and all other gestures after the second) from the video/image captured by the camera, and finally to interpret the second gesture as a user input to the computing device. The system enables gesture recognition in various light conditions and can be operated by various user hands, including ones wearing gloves. | 01-08-2015 |
20150009125 | MOBILE TERMINAL AND CONTROL METHOD FOR THE MOBILE TERMINAL - A mobile terminal and corresponding method, the mobile terminal including a flexible display configured to be warped between configurations having different radii of curvature, a sensor configured to sense the warpage, and a controller configured to display first screen information in a first configuration, and display a first image object corresponding to the first screen information when the flexible display is warped from the first configuration to a second configuration. When a different application is driven, a second image object corresponding to the different application is displayed along with the first image object, in the second configuration, and when the flexible display is warped again from the second configuration to the first configuration, screen information corresponding to part of the first image object and the second image object is displayed. | 01-08-2015 |
20150009126 | ADJUSTING A TRANSPARENT DISPLAY WITH AN IMAGE CAPTURING DEVICE - A system and method for adjusting a transparent display with an image capturing device is provided. The system includes an image receiving module to receive an image from the image capturing device, the image capturing device being situated on an opposing side in which the transparent display presents content; an interfacing module to interface with the transparent display; an analysis module to analyze the received image with the interfacing module to perform an analysis; and an output module to perform a display modification or an error indication based on the analysis. | 01-08-2015 |
20150009127 | DEFORMABLE USER INTERFACE DEVICES, APPARATUS, AND METHODS - User interface devices using magnetic sensing to provide output signals associated with motion and/or deformation of an actuator element of the interface devices are described. The output signals may be provided to an electronic computing system to provide commands, controls, and/or other data or information. In one embodiment, a user interface device may include a plurality of permanent magnets and a plurality of multi-axis magnetic sensors to generate motion and/or deformation signals to be provided to a processing element to generate the output signals. | 01-08-2015 |
20150009128 | DATA PROCESSING DEVICE - To provide a highly browsable data processing device or a highly portable data processing device, a data processing device including the following is devised: an input/output unit provided with a display portion which can be folded and unfolded and a sensor portion that can sense the folded and unfolded states of the display portion and can supply data on fold, and an arithmetic unit that stores a program for executing different processing depending on the data on fold. | 01-08-2015 |
20150009129 | METHOD FOR OPERATING PANORAMA IMAGE AND ELECTRONIC DEVICE THEREOF - A method for operating an electronic device is provided, which includes detecting bending of a flexible display, determining bending information corresponding to the bending, obtaining a plurality of images according to the detected bending information, and generating a panorama image by combining the obtained plurality of images. Thus, an intuitive user interface using the flexible display can be provided. | 01-08-2015 |
20150009130 | Three Dimensional User Interface Effects On A Display - The techniques disclosed herein may use various sensors to infer a frame of reference for a hand-held device. In fact, with various inertial clues from accelerometer, gyrometer, and other instruments that report their states in real time, it is possible to track a Frenet frame of the device in real time to provide an instantaneous (or continuous) 3D frame-of-reference. In addition to—or in place of—calculating this instantaneous (or continuous) frame of reference, the position of a user's head may either be inferred or calculated directly by using one or more of a device's optical sensors, e.g., an optical camera, infrared camera, laser, etc. With knowledge of the 3D frame-of-reference for the display and/or knowledge of the position of the user's head, more realistic virtual 3D depictions of the graphical objects on the device's display may be created—and interacted with—by the user. | 01-08-2015 |
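The head-tracked 3D effect in 20150009130 ultimately shifts what is drawn in response to the user's head position. A toy stand-in for that rendering step, shifting a UI layer opposite the head displacement scaled by virtual depth; the units and scaling are assumptions, not the patent's method:

```python
def parallax_offset(head_displacement, layer_depth):
    """Shift a UI layer opposite the user's head displacement, scaled by
    the layer's virtual depth, to fake depth on a flat display.

    head_displacement: (x, y) offset of the head from the screen center,
    in normalized units. Units and scaling are illustrative assumptions.
    """
    hx, hy = head_displacement
    return (-hx * layer_depth, -hy * layer_depth)
```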
20150009131 | System for Determining Three-Dimensional Position of Transmission Device Relative to Detecting Device - An objective of the present invention is to provide a system for determining a three-dimensional location of an emitting apparatus with respect to a detecting apparatus. The system comprises: an emitting apparatus comprising a light-emitting source for sending a control signal and an optical unit for facilitating transmitting the control signal; a detecting apparatus comprising a camera, for obtaining imaging information of the control signal in the camera via the optical unit; a computing apparatus for determining three-dimensional location information of the emitting apparatus with respect to the detecting apparatus according to the imaging information. Compared with the prior art, the present invention, through providing in the emitting apparatus an optical unit for facilitating transmitting the control signal, realizes determining the three-dimensional location of the emitting apparatus with respect to the detecting apparatus, which not only reduces the configuration costs and lowers the energy consumption level, but also makes three-dimensional location information-based control feasible, thereby further enhancing control efficiency and improving user manipulation experience. | 01-08-2015 |
20150009132 | HEAD-MOUNTED DISPLAY, PROGRAM FOR CONTROLLING HEAD-MOUNTED DISPLAY, AND METHOD OF CONTROLLING HEAD-MOUNTED DISPLAY - A head-mounted display comprising a display and a detector. The detector is configured to detect a direction of at least one of the head-mounted display and a line of vision. The display is configured to display an output image of an application, wherein the application is selected based on the direction being outside a range of a front direction. | 01-08-2015 |
20150009133 | INTEGRATED PROCESSOR FOR 3D MAPPING - A device for processing data includes a first input port for receiving color image data from a first image sensor and a second input port for receiving depth-related image data from a second image sensor. Processing circuitry generates a depth map using the depth-related image data. At least one output port conveys the depth map and the color image data to a host computer. | 01-08-2015 |
20150009134 | INFORMATION DISPLAY APPARATUS, IMAGE TAKING APPARATUS, AND METHOD AND COMPUTER PROGRAM FOR CONTROLLING DISPLAYING INFORMATION - An information display apparatus includes an input device adapted to input user operation information, a touch sensor adapted to detect the state of the input device in terms of whether the input device is touched by a user, a display adapted to display information, and a control unit adapted to receive user operation information from the input device and sensor detection information from the touch sensor, and control displaying of the information on the display in accordance with the received user operation information and the sensor detection information. If the touch information indicating that the input device is touched by the user is received from the touch sensor, the control unit displays first information associated with the input device on the display. If the input device is maintained in the touched state longer than a predetermined period, the control unit switches the displayed information into second information. | 01-08-2015 |
20150009135 | GESTURE RECOGNIZER SYSTEM ARCHITECTURE - Systems, methods and computer readable media are disclosed for a gesture recognizer system architecture. A recognizer engine is provided, which receives user motion data and provides that data to a plurality of filters. A filter corresponds to a gesture that may then be tuned by an application receiving information from the gesture recognizer so that the specific parameters of the gesture—such as an arm acceleration for a throwing gesture—may be set on a per-application level, or multiple times within a single application. Each filter may output to an application using it a confidence level that the corresponding gesture occurred, as well as further details about the user motion data. | 01-08-2015 |
20150015475 | MULTI-FUNCTION INPUT DEVICE - An electronic device includes a surface and a multi-function input device. The multi-function input device is operable in at least a first mode and a second mode. In the first mode, an input portion of the multi-function input device is flush with the surface or recessed in the surface and is operable to receive z axis press input data. In the second mode, the input portion is positioned proud of the surface (i.e., projects from the surface) and is operable to receive x axis input data and/or y axis input data. The input portion may also be operable to receive z axis input data in the second mode. In one example, the multi-function input device may have a button mode and a joystick mode. | 01-15-2015 |
20150015476 | HANDHELD COMPUTING PLATFORM WITH INTEGRATED PRESSURE SENSOR AND ASSOCIATED METHODS OF USE - Receipt of user input may be facilitated using a handheld computing platform with an integrated pressure sensor. Exemplary implementations may allow a user to simply squeeze the handheld computing platform in order to control one or more aspects of operation of the handheld computing platform. That is, instead of tapping a touch screen or pressing buttons, a user may merely need to apply compressive pressure to the handheld computing platform by applying opposing forces to opposing surfaces of the handheld computing platform. By way of non-limiting example, the one or more aspects of operation controlled by squeezing the handheld computing platform may include one or more of input of a value, selection from a menu, manipulation of a virtual object, entry of a password or a code, interaction with a fitness program, interaction with a rehabilitation and/or medical treatment program, interaction with a game, and/or other aspects of operation. | 01-15-2015 |
20150015477 | Multi-Sensor Hand Detection - In one embodiment, a method includes receiving real-time sensor data from a number of sensors of different sensor types on a computing device. The real-time sensor data corresponds to a transition in a physical state of the computing device caused by a user of the computing device. The method also includes correlating the real-time sensor data from the number of sensors of different sensor types on the computing device; determining based on the correlation an intended imminent use of the computing device by the user; and automatically initiating based on the determination a pre-determined function of the computing device. | 01-15-2015 |
20150015478 | IR EMISSIVE DISPLAY FACILITATING REMOTE EYE TRACKING - An infrared (IR) emissive display device includes a display panel, an IR sensor, and a controller. The display panel includes IR pixels configured to emit IR light and arranged in a first two-dimensional (2D) pattern. The IR sensor is configured to sense IR signals emitted from the IR pixels and reflected off a user of the display device. The controller is configured to control the IR pixels, the IR signals, and the IR sensor to detect a gaze direction of an eye of the user. A method of facilitating remote eye tracking of the user on the display device includes emitting the IR signals from the IR pixels toward the user, sensing the IR signals reflected off the user with the IR sensor, and detecting the gaze direction of the user's eye from the sensed IR signals. | 01-15-2015 |
20150015479 | MOBILE TERMINAL AND CONTROL METHOD THEREOF - A mobile terminal including a terminal body; a display unit; a wireless communication unit configured to be wirelessly connected to an in-vehicle video display apparatus; and a controller configured to detect a directional change of a driver's eyes from a first display to a second display included in the in-vehicle video display apparatus, and control the second display to display at least a portion of a first screen displayed on the first display, in response to the detected directional change of the driver's eyes from the first display to the second display. | 01-15-2015 |
20150015480 | GESTURE PRE-PROCESSING OF VIDEO STREAM USING A MARKERED REGION - Techniques are disclosed for processing a video stream to reduce platform power by employing a stepped and distributed pipeline process, wherein CPU-intensive processing is selectively performed. The techniques are particularly well-suited for hand-based navigational gesture processing. In one example case, for instance, the techniques are implemented in a computer system wherein initial threshold detection (image disturbance) and optionally user presence (hand image) processing components are proximate to or within the system's camera, and the camera is located in or proximate to the system's primary display. In some cases, image processing and communication of pixel information between the various processing stages are suppressed for pixels lying outside a markered region. In some embodiments, the markered region is aligned with a mouse pad, a designated desk area, or a user input device such as a keyboard. Pixels evaluated by the system can be limited to a subset of the markered region. | 01-15-2015 |
20150015481 | Gesture Recognition Systems - A system including a first radiation source providing a first beam and a second radiation source providing a second beam, and a radiation sensor, wherein the first beam does not overlap the second beam. In some embodiments, the radiation comprises infrared radiation. A gesture recognition system including at least one infrared sensor, a first infrared light emitting diode (LED) providing a first far-field radiation beam that extends from the first infrared LED and defines a first central ray, a second infrared light emitting diode (LED) providing a second far-field radiation beam that extends from the second infrared LED and defines a second central ray, wherein the first central ray and the second central ray define a single intersection point and an angle of intersection. | 01-15-2015 |
20150015482 | LOW INTERFERENCE SYSTEM AND METHOD FOR SYNCHRONIZATION, IDENTIFICATION AND TRACKING OF VISUAL AND INTERACTIVE SYSTEMS - A visual system includes one or more units, where each unit is a displaying unit or a video camera, or a combination of both, and each unit has a receiver. The unit has a transmitter which starts transmitting infrequent, short repetitive bursts if no bursts are received by the receiver; and otherwise transmits a delayed short burst synchronized to the infrequent, short repetitive bursts by the transmitter, in order to synchronize all units in the visual system to each other, while the infrequent, short repetitive bursts will not interfere nor will be interfered by the general use of remote controllers for other purposes. | 01-15-2015 |
20150015483 | METHOD OF CONTROLLING AT LEAST ONE FUNCTION OF DEVICE BY USING EYE ACTION AND DEVICE FOR PERFORMING THE METHOD - A method of controlling a device by an eye action is provided. The method includes selecting a controller to execute a control command generated by the eye action of a user, obtaining data about the eye action of the user, detecting the control command corresponding to the eye action of the user based on the obtained data, and executing the detected control command by using the selected controller. | 01-15-2015 |
20150015484 | DISPLAY DEVICE - A portable display device includes a display unit, a first capturing unit, and a second capturing unit. The display unit includes a rectangular display screen for displaying an image. The first capturing unit is configured to capture an image of an object. The first capturing unit is arranged in a region, corresponding to a first side of the display screen, which is a part of a peripheral region of the display unit other than the display screen. The second capturing unit is configured to capture an image of the object. The second capturing unit is arranged in a region, corresponding to a second side adjacent to the first side, which is a part of the peripheral region. | 01-15-2015 |
20150015485 | Calibrating Vision Systems - Methods, systems, and computer programs calibrate a vision system. An image of a human gesture is received that frames a display device. A boundary defined by the human gesture is computed, and a gesture area defined by the boundary is also computed. The gesture area is then mapped to pixels in the display device. | 01-15-2015 |
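The final step of 20150015485, mapping the gesture-framed area to display pixels, can be sketched as a normalized-to-pixel coordinate transform. The normalized camera-coordinate representation of the gesture box is an assumption for illustration:

```python
def gesture_box_to_pixels(gesture_box, display_w, display_h):
    """Map a framing-gesture bounding box, given in normalized camera
    coordinates (0..1), onto pixel coordinates of the framed display.

    A minimal sketch of the mapping step; representation is assumed.
    """
    x0, y0, x1, y1 = gesture_box
    return (round(x0 * display_w), round(y0 * display_h),
            round(x1 * display_w), round(y1 * display_h))
```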
20150015486 | HANDHELD DEVICE FOR SPECTATOR VIEWING OF AN INTERACTIVE APPLICATION - A handheld device is provided, comprising: a sensor configured to generate sensor data for determining and tracking a position and orientation of the handheld device during an interactive session of an interactive application presented on a main display, the interactive session being defined for interactivity between a user and the interactive application; a communications module configured to send the sensor data to a computing device, the communications module being further configured to receive from the computing device a spectator video stream of the interactive session that is generated based on a state of the interactive application and the tracked position and orientation of the handheld device, the state of the interactive application being determined based on the interactivity between the user and the interactive application; a display configured to render the spectator video stream. | 01-15-2015 |
20150015487 | INTERACTION WITH AN EXPANDED INFORMATION SPACE ON A CELLULAR PHONE - A method for providing a dynamic perspective-based presentation of content on a cellular phone is provided, comprising: presenting a first portion of a content space on a display screen of the cellular phone; tracking a location of a user's head in front of the display screen; detecting a lateral movement of the user's head relative to the display screen; progressively exposing an adjacent second portion of the content space, from an edge of the display screen opposite a direction of the lateral movement, in proportional response to the lateral movement of the user's head relative to the display screen. | 01-15-2015 |
20150022431 | STRESS TOLERANT MEMS ACCELEROMETER - A Micro-Electro-Mechanical System (MEMS) accelerometer employing a rotor and stator that are both released from a substrate. In embodiments, the rotor and stator are each formed of a continuous metal thin film. A stress gradient in the film is manifested in capacitive members of the rotor and stator as a substantially equal deflection such that a relative displacement between the rotor and stator associated with an acceleration in the z-axis is substantially independent of the stress gradient. In embodiments, the stator comprises comb fingers cantilevered from a first anchor point while the rotor comprises comb fingers coupled to a proof mass by torsion springs affixed to the substrate at second anchor points proximate to the first anchor point. | 01-22-2015 |
20150022432 | SPECIAL GESTURES FOR CAMERA CONTROL AND IMAGE PROCESSING OPERATIONS - An embodiment provides a method, including: receiving at a sensor of an information handling device image input; determining, using one or more processors, a predetermined gesture within the image input; capturing additional gesture input associated with the predetermined gesture; and processing an image rendered on a display device according to the additional gesture input associated with the predetermined gesture. Other embodiments are described and claimed. | 01-22-2015 |
20150022433 | DISPLAY CONTROL IN A DATA PROCESSING DEVICE BASED ON SENSING DEVIATION THEREOF FROM A REFERENCE POSITION - A method includes sensing, through a processor of a data processing device in conjunction with a motion sensor, a deviation of the data processing device from a reference position thereof. The method also includes modifying, through the processor, a display parameter of a display unit of the data processing device and/or display data to be rendered on the display unit in accordance with the sensed deviation. | 01-22-2015 |
20150022434 | Movement-Triggered Action for Mobile Device - In one embodiment, a method includes, by a computing device, receiving sensor data from a sensor on the computing device indicating physical movement of the computing device. The method also includes determining a motion-trigger signal of the sensor data corresponding to a first characteristic of the physical movement of the computing device and a motion-confirm signal of the sensor data corresponding to a second characteristic of the physical movement of the computing device. The method further includes determining whether the motion-trigger signal includes a transition from within a pre-defined threshold band to outside of the pre-defined threshold band and whether the motion-confirm signal is within the pre-defined threshold band. The method also includes initiating a pre-defined action of the computing device when the motion-trigger signal includes the transition from within the pre-defined threshold band to outside the pre-defined threshold band and the motion-confirm signal is within the pre-defined threshold band. | 01-22-2015 |
20150022435 | GAZE-TRACKING EYE ILLUMINATION FROM DISPLAY - A method to drive a pixelated display of an electronic device arranged in sight of a user of the device. The method includes receiving a signal that encodes a display image, and controlling the pixelated display based on the signal to form the display image in addition to a latent image, the latent image being configured to illuminate an eye of the user with light of such characteristics as to be unnoticed by the user, but to reveal an orientation of the eye on reflection into a machine-vision system. | 01-22-2015 |
20150022436 | FOLDABLE DISPLAY DEVICE PROVIDING IMAGE LAYER AND METHOD OF CONTROLLING THE SAME - A method of controlling a foldable display device according to one embodiment of the present specification can include the steps of displaying a first image in a first area, a second image in a second area, and a third image in a third area in a mode that a foldable display device is not folded, wherein the first area is positioned at the center of the foldable display device, the second area is positioned at the left of the first area, and the third area is positioned at the right of the first area, detecting that the second area or the third area of the foldable display device is folded, if the second area is folded, assigning the first image to a first image layer of the first area and assigning the second image of the folded second area to a second image layer of the first area. | 01-22-2015 |
20150022437 | Body Mind Machine Interface and Method - A body mind machine interface has a neural interface grid operably printed or implanted in or on the user, and a sensor pad for receiving conducted electrical currents from the neural interface grid and generating an electrical signal that is transmitted to an interface computer. The interface computer has a computer processor and a computer memory, and an interface program operably installed on the computer memory for processing the electrical signal from the sensor pad and determining a direction from the brain. | 01-22-2015 |
20150022438 | WATCH TYPE MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME - There is disclosed a watch type mobile terminal including a display configured to display first image information, a case to a front surface of which the display is coupled, a strap connected to the case and securely wound around a user's arm, a position sensor configured to sense inclination of the case and variation of a position of the case, and a controller configured to convert a state of the display into an activated state from a deactivated state, when position variation corresponding to a preset activating gesture is sensed by the position sensor, such that the screen of the watch type mobile terminal may be activated without the user pressing an auxiliary button and use convenience may be enhanced accordingly. | 01-22-2015 |
20150022439 | METHOD AND APPARATUS FOR SELECTING BETWEEN MULTIPLE GESTURE RECOGNITION SYSTEMS - A method and apparatus for selecting between multiple gesture recognition systems includes an electronic device determining a context of operation for the electronic device that affects a gesture recognition function performed by the electronic device. The electronic device also selects, based on the context of operation, one of a plurality of gesture recognition systems in the electronic device as an active gesture recognition system for receiving gesturing input to perform the gesture recognition function, wherein the plurality of gesture recognition systems comprises an image-based gesture recognition system and a non-image-based gesture recognition system. | 01-22-2015 |
20150022440 | DISPLAY AND METHOD OF DISPLAYING THREE-DIMENSIONAL IMAGES WITH DIFFERENT PARALLAXES - A display includes multiple pixels, a detecting device and an optical unit. Each of the pixels is configured to display a first image. The detecting device is configured to detect a position of a viewer to generate a position data. The optical unit cooperates with each of the pixels to project the first image to multiple viewable zones, in which an unobserved zone is formed between consecutive two of the viewable zones. Each of the pixels is configured to switch from a first image to a second image while the position data corresponds to the viewer located in the unobserved zone. | 01-22-2015 |
20150022441 | METHOD AND APPARATUS FOR DETECTING INTERFACING REGION IN DEPTH IMAGE - An apparatus for detecting an interfacing region in a depth image detects the interfacing region based on a depth of a first region and a depth of a second region which is an external region of the first region in a depth image. | 01-22-2015 |
20150022442 | ELECTRONIC DEVICE HAVING HINGE STRUCTURE AND OPERATING METHOD THEREOF - An electronic device is provided, which includes a first case frame, a second case frame, a hinge portion, and a processor. The first case frame includes a keypad. The second case frame is installed to be rotatable from the first case frame, and includes a touchscreen. The hinge portion couples the first case frame with the second case frame, and is rotatably installed to rotate the second case frame. The processor determines an operation mode of the electronic device depending on rotation of the hinge portion. | 01-22-2015 |
20150022443 | Process and Apparatus for Haptic Interaction with Visually Presented Data - The invention relates to a system and a method for haptic interaction with visually presented objects. In the process, three-dimensional data of the user and an object are captured and presented in a visual subsystem. At the same time, there is an interaction of the user with a haptic element in a haptic subsystem, wherein the haptic element is designed in such a way that it can imitate the surface characteristics of the object in the collision area of the hand and object. | 01-22-2015 |
20150022444 | INFORMATION PROCESSING APPARATUS, AND INFORMATION PROCESSING METHOD - An information processing apparatus includes processing circuitry that is configured to issue a control command relating to a real object based on a displayed positional relationship between the real object and a predetermined object. The real object is a tangible, movable object, and the predetermined object is at least one of another real object and a virtual object. | 01-22-2015 |
20150022445 | FLEXIBLE DISPLAY DEVICE AND METHOD OF CONTROLLING FLEXIBLE DISPLAY DEVICE - A display device includes a flexible substrate, a display unit including multiple light-emitting elements arranged at the substrate and configured to display an image according to an image signal, a displacement sensor provided to at least one of a front surface and a back surface of the substrate and configured to detect a curved state of the substrate, and a control unit configured to execute a control by which the image is split and displayed in the display unit when a curve of the substrate is detected by the displacement sensor. | 01-22-2015 |
20150029085 | Apparatus and Method Pertaining to the Use of a Plurality of 3D Gesture Sensors to Detect 3D Gestures - A device having at least two 3-dimensional gesture sensors that employ differing gesture-sensing modalities as compared to one another further includes a control circuit that operably couples to both of these 3-dimensional gesture sensors and employs both to detect three-dimensional gestures. By one approach the control circuit employs both sensors in a temporally-overlapping manner to reliably and accurately detect the 3D gesture. As another illustrative example, the control circuit may employ different sensors during different portions of a given 3D gesture to detect those corresponding portions of the 3D gesture. | 01-29-2015 |
20150029086 | BACKLIGHT FOR TOUCHLESS GESTURE DETECTION - A device and method to detect a gesture performed by an object in touch-less communication with the device are described. The device includes two or more ambient light sensors arranged at respective first surface locations of the device, each of the two or more ambient light sensors sensing light intensity at the respective first surface location. The device also includes one or more processors to control one or more light sources at respective second surface locations of the device based on the light intensity sensed by the two or more ambient light sensors, calibrate the two or more ambient light sensors based on the one or more light sources, and detect the gesture based on the light intensity sensed by each of the two or more ambient light sensors. | 01-29-2015 |
20150029087 | METHODS AND SYSTEMS FOR ADJUSTING POWER CONSUMPTION IN A USER DEVICE BASED ON BRAIN ACTIVITY - Methods and systems are disclosed herein for adjusting power consumption and/or sensitivity levels of a user device based on the current brain activity of a user. For example, a media guidance application implemented on a user device may monitor brain activity of a user. In response to determining particular brain activity (e.g., associated with the inactivity of the user), the media guidance application may initiate a stand-by mode in order to reduce power consumption. | 01-29-2015 |
20150029088 | HEAD MOUNTED DISPLAY AND METHOD OF CONTROLLING THEREFOR - A method of controlling a head mounted display (HMD) according to one embodiment of the present specification includes the steps of detecting a first contact contacted with a nose pad of the HMD, detecting a second contact contacted with a temple of the HMD, if the first contact and the second contact are detected, operating in a wearing mode to activate a display unit and an audio input unit, if the first contact is not detected and the second contact is detected, operating in a mute mode to activate the display unit and inactivate the audio input unit, and if the first contact and the second contact are not detected, operating in a non-wearing mode to inactivate the display unit and the audio input unit. | 01-29-2015 |
20150029089 | DISPLAY APPARATUS AND METHOD FOR PROVIDING PERSONALIZED SERVICE THEREOF - A display apparatus is provided, which includes a storage configured to store voice patterns of at least one user and personalization information set with respect to at least one user, a receiver configured to receive a user voice, and a controller configured to detect the personalization information that corresponds to the voice pattern of the user voice from the storage and to control a function of the display apparatus according to the detected personalization information. | 01-29-2015 |
20150029090 | CHARACTER INPUT METHOD AND DISPLAY APPARATUS - A display apparatus is provided. The display apparatus includes: a display configured to display a virtual keyboard; an inputter configured to receive a stroke input on a key on the virtual keyboard; and a controller configured to display a character which corresponds to the key in an input window in response to the stroke input being received, and configured to perform a control operation to suggest at least one character that is likely to follow a character which corresponds to the key and display the at least one suggested character. The at least one suggested character is displayed so as not to overlap a character of a basic key on the virtual keyboard. | 01-29-2015 |
20150029091 | INFORMATION PRESENTATION APPARATUS AND INFORMATION PROCESSING SYSTEM - An information presentation apparatus includes a main body, a detection unit, and a presentation unit. The main body is mounted on a head portion of a user. The detection unit is disposed on a position intersecting a median plane of the user who wears the main body and configured to detect a motion of the head portion of the user. The presentation unit is disposed on the main body and is capable of presenting information switched on the basis of an output from the detection unit to the user. | 01-29-2015 |
20150029092 | SYSTEMS AND METHODS OF INTERPRETING COMPLEX GESTURES - The technology disclosed relates to using a curvilinear gestural path of a control object as a gesture-based input command for a motion-sensing system. In particular, the curvilinear gestural path can be broken down into curve segments, and each curve segment can be mapped to a recorded gesture primitive. Further, certain sequences of gesture primitives can be used to identify the original curvilinear gesture. | 01-29-2015 |
20150029093 | Motion-Based View Scrolling with Proportional and Dynamic Modes - The present invention provides a system and methods for motion-based scrolling of a relatively large contents view on an electronic device with a relatively small screen display. The user controls the scrolling by changing the device's tilt relative to a baseline tilt. The scrolling control can follow a Proportional Scroll mode, where the relative tilt directly controls the screen position over the contents view, or a Dynamic Scroll mode, where the relative tilt controls the scrolling speed. The present invention provides a criterion for automatically selecting the best scrolling mode when the dimensions of the contents view change. The baseline tilt is updated when the screen reaches an edge of the contents view to eliminate the creation of a non-responsive range of tilt changes when the user changes tilt direction. | 01-29-2015 |
20150029094 | USER INTERFACE WITH LOCATION MAPPING - A system includes a processing logic and output logic. The processing logic is configured to receive information identifying points of interest, identify at least one of location or distance information from each of the points of interest to the system, and sort the points of interest based on the location or distance information. The output logic is configured to project information identifying the points of interest onto a surface, wherein the projected information is displayed in accordance with the sorting. | 01-29-2015 |
20150029095 | COMMAND OF A DEVICE BY GESTURE EMULATION OF TOUCH GESTURES - A user-to-machine interface emulates a touch interface on a screen. The interface is configured for operating in a touch emulation mode based on a triggering event. A triggering event may be a rotation around a first axis of an angle higher than a first threshold. Analysis of the amount of rotation around a second axis may be used to determine the number of fingers defining a specific touch gesture. An infinite variety of touch gestures may therefore be emulated by a remote control based on application context, thus allowing for multiple uses of the touch screen machine from a distance. | 01-29-2015 |
20150029096 | IMAGE DISPLAY DEVICE - There is provided an image display device which is kept from wastefully consuming electric power while an eyelid is closed by blinks in a waking state. A signal from an FIR sensor ( | 01-29-2015 |
20150029097 | SCENARIO-SPECIFIC BODY-PART TRACKING - A human subject is tracked within a scene of an observed depth image supplied to a general-purpose body-part tracker. The general-purpose body-part tracker is retrained for a specific scenario. The general-purpose body-part tracker was previously trained using supervised machine learning to identify one or more general-purpose parameters to be used by the general-purpose body-part tracker to track a human subject. During a retraining phase, scenario data is received that represents a human training-subject performing an action specific to a particular scenario. One or more special-purpose parameters are identified from the processed scenario data. The special-purpose parameters are selectively used to augment or replace one or more general-purpose parameters if the general-purpose body-part tracker is used to track a human subject performing the action specific to the particular scenario. | 01-29-2015 |
20150035743 | Wrist Worn Platform for Sensors - Methods and apparatuses for sensors are disclosed. In one example, a sensor system includes a wrist worn apparatus and a plurality of sensors. The wrist worn apparatus includes a communications interface, a user interface, a processor, and a memory including an application to receive a sensor data. The plurality of sensors are configured to send sensor data to the wrist worn apparatus. In one example, each sensor of the plurality of sensors is configured to be worn on a user finger. | 02-05-2015 |
20150035744 | NEAR-EYE OPTIC POSITIONING IN DISPLAY DEVICES - Embodiments are disclosed for adjusting alignment of a near-eye optic of a see-through head-mounted display system. In one embodiment, a method of detecting eye location for a head-mounted display system includes directing positioning light to an eye of a user and detecting the positioning light reflected from the eye of the user. The method further includes determining a distance between the eye and a near-eye optic of the head-mounted display system based on attributes of the detected positioning light, and providing feedback for adjusting the distance between the eye and the near-eye optic. | 02-05-2015 |
20150035745 | HEAD-MOUNT EYE TRACKING SYSTEM - A head-mount eye tracking system including a first light source, a first pupil image capturing device, first and second environmental image capturing devices, a fixing device and an image identification system. The first light source is applied to illuminate a first eye of a user. The first pupil image capturing device is applied to capture a first pupil image of the first eye. The first and the second environmental image capturing devices are respectively applied to capture first and second environmental images in front of the user. The fixing device is mounted to the head of the user to fix the first light source, the first pupil image capturing device, and the first and the second environmental image capturing devices on the head of the user. The image identification system is applied to map the first or the second environmental image to the first pupil image. | 02-05-2015 |
20150035746 | User Interface Device - A user interface device ( | 02-05-2015 |
20150035747 | OPERATING DEVICE AND IMAGE PROCESSING APPARATUS - Disclosed is an operating device including: an operating unit having a plurality of operation buttons; a gaze detection unit configured to detect a gaze of an operator who operates the operating unit; and a judgment unit configured to judge whether an operation for a predetermined operation button of the operating unit is valid or invalid when the operation is received from the operator, wherein in case that the gaze of the operator is not within a predetermined area relating to the predetermined operation button when the operation for the predetermined operation button is received from the operator, the judgment unit judges that the operation is invalid. | 02-05-2015 |
20150035748 | METHOD OF INPUTTING USER INPUT BY USING MOBILE DEVICE, AND MOBILE DEVICE USING THE METHOD - Provided are a method of inputting a user input by using a mobile device and a mobile device using the method. The method of performing operations of the mobile device according to a plurality of input modes, the method being performed by the mobile device, includes operations of determining whether the mobile device is placed on a surface, if the determining of whether the mobile device is placed on the surface indicates that the mobile device is placed on the surface, changing an input mode of the mobile device, and performing a preset operation that corresponds to the changed input mode. | 02-05-2015 |
20150042552 | DEVICE HAVING GAZE DETECTION CAPABILITIES AND A METHOD FOR USING SAME - An electronic device comprising: a display; one or more gaze detection sensors for determining a portion of the display to which a user's gaze is currently directed to; a timer to measure periods of time associated with the user's current gaze at the display; and one or more processors operative to: receive data relating to periods of time measured by the timer and determine therefrom a characteristic rate at which the user shifts his gaze from one portion of the display to another; determine a portion of the display towards which the user's gaze was directed for a period of time longer than a period of time which is expected in accordance with his characteristic rate; identify an object included in the determined portion of the display; retrieve information that relates to the identified object; and enable displaying information which is based on the retrieved information. | 02-12-2015 |
20150042553 | DYNAMIC GPU FEATURE ADJUSTMENT BASED ON USER-OBSERVED SCREEN AREA - An aspect of the present invention proposes a solution to allow a dynamic adjustment of a performance level of a GPU based on the user observed screen area. According to one embodiment, a user's focus in one or more display panels is determined. The GPU that performs rendering for that region and/or display panel will dynamically adjust (i.e., increase) the level of performance in response to the user's focus, whereas all other GPUs (e.g., the GPUs that perform rendering for other regions/display panels) will experience a reduced level of performance. According to such an embodiment, dynamically reducing the performance of GPUs outside of the area of focus can result in any one or more of a significant number of benefits, including lower power consumption rates, less processing, less (frequent) memory accesses, and reduced heat and noise levels. | 02-12-2015 |
20150042554 | METHOD FOR ADJUSTING SCREEN DISPLAYING MODE AND ELECTRONIC DEVICE - A method for adjusting a screen displaying mode and an electronic device suitable for the method are provided; the electronic device has a body sensor. The method includes: determining whether the body sensor has detected a body contact; when the body contact is detected by the body sensor, not adjusting the screen displaying mode of the electronic device; when the body contact is not detected by the body sensor, determining whether to adjust the screen displaying mode of the electronic device according to a tilt status of the electronic device. | 02-12-2015 |
20150042555 | Method and Apparatus for Communication Between Humans and Devices - This invention relates to methods and apparatus for improving communications between humans and devices. The invention provides a method of modulating operation of a device, comprising: providing an attentive user interface for obtaining information about an attentive state of a user; and modulating operation of a device on the basis of the obtained information, wherein the operation that is modulated is initiated by the device. Preferably, the information about the user's attentive state is eye contact of the user with the device that is sensed by the attentive user interface. | 02-12-2015 |
20150042556 | METHOD FOR CONTROLLING ROTATION OF SCREEN PICTURE OF TERMINAL, AND TERMINAL - Embodiments of the present invention disclose a method for controlling display interface rotation of a terminal, including: collecting human face image data of a user; processing the human face image data of the user to acquire human face posture information; and rotating a display interface according to the human face posture information to adjust a direction of the display interface of the terminal. The embodiments of the present invention further disclose a terminal. According to the present invention, a display interface of the terminal may remain in an orthophoric direction of a user, which can improve the visual experience of the user and make the terminal more intelligent. | 02-12-2015 |
20150042557 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM - There is provided an information processing apparatus including a viewpoint position determination unit that determines, based on acquired viewpoint position information regarding a viewpoint position of a user, whether the viewpoint position of the user is included in a viewpoint position range suitable for content, and an object display control unit that, if the viewpoint position of the user is not included in the viewpoint position range suitable for the content, performs display control for displaying a viewpoint guidance object that guides the viewpoint of the user to the viewpoint position range suitable for the content. | 02-12-2015 |
20150042558 | METHOD FOR DETERMINING THE DIRECTION IN WHICH A USER IS LOOKING - The present invention relates to a method for determining the direction in which a user is looking, which includes acquiring images of the eye, in particular by means of an optical sensor, the method including: a) a first processing of the images, yielding information on the orientation of the eye according to the observation of an area of the eye in which the aspect varies with the rotation of the eye; and b) a second processing of the images, yielding information on the kinematics of the eye by comparing at least two consecutive images; method in which information is generated relating to the direction in which the user is looking relative to the head of the user, at least according to the information supplied by the first and second processes. | 02-12-2015 |
20150042559 | Information Processing Method And Electronic Device Thereof, Image Calibration Method And Apparatus, And Electronic Device Thereof - An information processing method applied to an electronic device is provided. The electronic device can make first multimedia data displayed synchronously on a first display unit and a second display unit. The method includes: acquiring a first parameter of the first display unit and a second parameter of the second display unit; acquiring a first operation on the first multimedia data for the second display unit; analyzing the first operation to obtain first coordinates of the first operation; transforming, based on the first parameter and the second parameter, the first coordinates of the first operation into second coordinates of the first operation on the first multimedia data for the first display unit; and performing the first operation based on the second coordinates of the first operation. | 02-12-2015 |
20150042560 | Electronic Device - An electronic device is provided in the disclosure. The electronic device includes a body, a display unit, and a projecting unit; where the body comprises a first surface and a second surface which intersects the first surface, where the second surface supports the body in a standing position on a support surface at a first angle which is not zero degrees between the second surface and the support surface; the display unit is disposed on the first surface and displays content; and the projecting unit is supported by the body and projects content externally when the body stands on the support surface. | 02-12-2015 |
20150042561 | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND RECORDING MEDIUM - An information processing device includes: an image output unit which outputs a screen displayed on a display unit to an image display device; a display control unit which displays a first screen including a user interface screen at least in a part of the screen; a detection unit which detects that the user interface screen is switched from a first state that does not include predetermined information to a second state that includes the information; and an output control unit which outputs a second screen that does not include the user interface screen, instead of the first screen displayed by the display control unit, to the image output unit, if it is detected that the user interface screen is switched from the first state to the second state. | 02-12-2015 |
20150042562 | Image Resizing For Optical Character Recognition In Portable Reading Machine - A reading machine that operates in various modes and includes image correction processing is described. The reading device pre-processes an image for optical character recognition by receiving the image and determining whether text in the image is too large or small for optical character recognition processing by determining that text height falls outside of a range in which optical character recognition software will recognize text in a digitized image. If necessary, the image is resized according to whether the text is too large or too small. | 02-12-2015 |
20150049009 | SYSTEM-WIDE HANDWRITTEN NOTES - An embodiment provides a method, including: ascertaining user input to a display screen forming a predetermined shape associated with system-wide note taking; determining, using one or more processors, user input note data associated with the predetermined shape; and providing, in a predetermined location, a note including the user input note data. Other aspects are described and claimed. | 02-19-2015 |
20150049010 | ORGANIZING DISPLAY DATA ON A MULTIUSER DISPLAY - For organizing display data on a multiuser display, a position module determines a user position from an audible signal. An organization module organizes display data on a multiuser display in response to the user position. | 02-19-2015 |
20150049011 | METHOD AND APPARATUS FOR ENHANCING THREE-DIMENSIONAL IMAGE PROCESSING - A system and method of controlling displayed three-dimensional visual content is disclosed. A signal is received from a user operated adjustment device, the signal being generated by a physical movement of the adjustment device by the user, wherein the signal corresponds to a desired adjustment of the displayed three-dimensional content. A value of an image-generation parameter of the three-dimensional visual content is adjusted corresponding to the signal received from the adjustment device. A physical parameter of a display device that displays the three-dimensional visual content is changed based on the adjusted value of the image-generation parameter in order to implement the desired adjustment of the displayed three-dimensional visual content. | 02-19-2015 |
20150049012 | VISUAL, AUDIBLE, AND/OR HAPTIC FEEDBACK FOR OPTICAL SEE-THROUGH HEAD MOUNTED DISPLAY WITH USER INTERACTION TRACKING - A method, an apparatus, and a computer program product provide feedback to a user of an augmented reality (AR) device having an optical see-through head mounted display (HMD). The apparatus obtains a location on the HMD corresponding to a user interaction with an object displayed on the HMD. The object may be an icon on the HMD and the user interaction may be an attempt by the user to select the icon through an eye gaze or gesture. The apparatus determines whether a spatial relationship between the location of user interaction and the object satisfies a criterion, and outputs a sensory indication, e.g., visual display, sound, vibration, when the criterion is satisfied. The apparatus may be configured to output a sensory indication when user interaction is successful, e.g., the icon was selected. Alternatively, the apparatus may be configured to output a sensory indication when the user interaction fails. | 02-19-2015 |
20150049013 | AUTOMATIC CALIBRATION OF EYE TRACKING FOR OPTICAL SEE-THROUGH HEAD MOUNTED DISPLAY - An apparatus for calibrating an eye tracking system of a head mounted display displays a moving object in a scene visible through the head mounted display. The object is displayed progressively at a plurality of different points (P) at corresponding different times (T). While the object is at a first point of the plurality of different points in time, the apparatus determines whether an offset between the point P and an eye gaze point (E) satisfies a threshold. The eye-gaze point (E) corresponds to a point where a user is determined to be gazing by the eye tracking system. If the threshold is not satisfied, the apparatus performs a calibration of the eye tracking system when the object is at a second point of the plurality of different points in time. The apparatus then repeats the determining step when the object is at a third point of the plurality of different points in time. | 02-19-2015 |
20150049014 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND COMPUTER-READABLE MEDIUM - An information processing apparatus includes a line-of-sight detection unit, a display determination unit, and a first display switching unit. The line-of-sight detection unit detects a line of sight of an operator toward a display apparatus. The display determination unit determines whether or not secret information which is information to be concealed from a person other than the operator is displayed on the display apparatus. The first display switching unit switches information displayed on the display apparatus to another image on the basis of the determination result obtained by the display determination unit when the line-of-sight detection unit does not detect the line of sight. | 02-19-2015 |
20150049015 | DISPLAY DEVICE, IN PARTICULAR FOR MOTOR VEHICLE - A display device, in particular for motor vehicle, includes a projection module and an optical path. The optical path has at least one reflection element, which is designed to reflect at least partially an image originating from the projection module in a normal direction of gaze of a user of the display device. A first configuration of the optical path is applied in a first mode of operation of the display device, and a second configuration of the optical path is applied in a second mode of operation of the display device. In the second mode of operation of the display device, at least a part of the image generated by the projection module is visible to the user of the display device in a magnified manner with respect to the first mode of operation of the display device. | 02-19-2015 |
20150049016 | MULTIMODAL SYSTEM AND METHOD FACILITATING GESTURE CREATION THROUGH SCALAR AND VECTOR DATA - A device and a method facilitating generation of one or more intuitive gesture sets for the interpretation of a specific purpose are disclosed. Data is captured in a scalar and a vector form, which is further fused and stored. The intuitive gesture sets generated after the fusion are further used by one or more components/devices/modules for one or more specific purposes. Also incorporated is a system for playing a game. The system receives one or more actions in a scalar and a vector form from one or more users in order to map the action with at least one pre-stored gesture, to identify the user in control amongst a plurality of users, and to interpret the action of that user for playing the game. In accordance with the interpretation, an act is generated by one or more components of the system for playing the game. | 02-19-2015 |
20150049017 | GESTURE RECOGNITION FOR DEVICE INPUT - A user can make a symbol with their hand, or other such gesture, at a distance from a computing device that can be captured by at least one imaging element of the device. The captured information can be analyzed to attempt to determine the location of distinguishing features of the symbol in the image information. The image information is then compared to hand gesture information stored in, for example, a library of hand gestures for the user. Upon identifying a match, an input to an application executing on the computing device is provided when the image information contains information matching at least one hand gesture with at least a minimum level of certainty. The hand gesture could include a single “static” gesture, such as a specific letter in sign language, for example, or include two or more “static” gestures. The gesture could also include motion, such as hand movement. | 02-19-2015 |
20150049018 | Virtual Window in Head-Mounted Display - Methods and systems involving a virtual window in a head-mounted display (HMD) are disclosed herein. An exemplary system may be configured to: (i) receive head-movement data that is indicative of head movement; (ii) cause an HMD to operate in a first mode in which the HMD is configured to: (a) simultaneously provide a virtual window and a physical-world view in the HMD; (b) display, in the virtual window, a portion of a media item that corresponds to a field of view; (c) determine movement of the field of view; and (d) update the portion of the media item that is displayed in the virtual window; (iii) receive mode-switching input data and responsively cause the HMD to switch between the first mode and a second mode; and (iv) responsive to the mode-switching input data, cause the HMD to operate in the second mode. | 02-19-2015 |
20150054726 | Modifying Information Presented by an Augmented Reality Device - An approach is provided to control information display at an augmented reality device. In the approach, a biometric value is received from a biometric input device. The biometric input device is a device that receives biometric data from a user of the augmented reality device. The received biometric value is compared to a number of previously established biometric input ranges that correspond to the user. Each of the biometric input ranges corresponds to a different display policy. The comparison identifies a selected display policy. The display detail of the augmented reality device is then automatically set according to the selected display policy. | 02-26-2015 |
20150054727 | HAPTICALLY ENABLED VIEWING OF SPORTING EVENTS - A system that generates haptic effects for a sporting event receives sporting event data that includes different types of event data, each type having a corresponding characteristic. The system assigns a different type of haptic effect to each different type of event data, and generates a haptic signal that corresponds to each type of haptic effect. The system then transmits the haptic signal to a haptic output device. | 02-26-2015 |
20150054728 | BIOSIGNAL INTERFACE APPARATUS AND OPERATION METHOD OF BIOSIGNAL INTERFACE APPARATUS - A biosignal interface apparatus includes a sensor configured to detect a target in contact with the sensor, a position identifier configured to identify a position of the sensor on the target, and a controller configured to control an operation mode of the sensor based on the identified position. | 02-26-2015 |
20150054729 | REMOTE DEVICES USED IN A MARKERLESS INSTALLATION OF A SPATIAL OPERATING ENVIRONMENT INCORPORATING GESTURAL CONTROL - Embodiments described herein includes a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal. | 02-26-2015 |
20150054730 | WRISTBAND TYPE INFORMATION PROCESSING APPARATUS AND STORAGE MEDIUM - There is provided a wristband type information processing apparatus including a band unit configured to be worn on a wrist of a user, a projection unit configured to project an image on a hand wearing the band unit, and a control unit configured to control the projection unit in a manner that at least one of an image that is stored in a storage unit and an image that is input from an external apparatus is projected on the hand. | 02-26-2015 |
20150054731 | APPARATUS AND METHOD FOR PROVIDING INFORMATION BY RECOGNIZING USER'S INTENTIONS - A mobile device and operating method provide information by automatically recognizing a user's intention. The mobile device includes a sensor configured to sense a user's eye movement, an event detector configured to detect an event by inferring the user's intention based on the eye movement, and an application function executor configured to execute a function of an application in response to the detected event. The mobile device may further include a display configured to display results of the function execution. | 02-26-2015 |
20150054732 | Controlling Marine Electronics Device - Various implementations described herein are directed to a non-transitory computer readable medium having stored thereon computer-executable instructions which, when executed by a computer, may cause the computer to receive motion data or button input recorded by one or more motion sensors or one or more buttons on a wearable device. The computer may determine that the motion data or button input corresponds to a command for operating a marine electronics device. The computer may perform an action corresponding to the command on the marine electronics device. | 02-26-2015 |
20150054733 | MULTIFUNCTION BUTTON - Aspects of the present invention enable a storage device (e.g., a recordable book, toy, computing device) to be controlled with a single control button that performs multiple functions. Different interactions with the button produce a different control input. In one aspect, the storage device is an audio recording device that can record, lock, unlock, transfer to a separate storage device, or play back one or more audio recordings. These five functions (i.e., recording, locking, unlocking, transferring, and playing back) are initiated or facilitated by depressing a single button located on the audio storage device for different lengths of time or in different patterns. Audio recordings may be played in response to user interactions with the button to help the user interact with the button properly and warn the user of action that is about to be taken. | 02-26-2015 |
20150054734 | HEAD-MOUNTABLE APPARATUS AND SYSTEMS - A head mountable display (HMD) comprises an infrared light source operable to illuminate foreground objects but not background objects greater than a threshold distance from the HMD; one or more cameras operable to capture infrared illuminated images and visible light illuminated images; and an image processor operable to detect, from the infrared illuminated images, foreground objects in the visible light illuminated images. | 02-26-2015 |
20150054735 | INFORMATION PROCESSING APPARATUS, METHOD FOR CONTROLLING INFORMATION PROCESSING APPARATUS, AND STORAGE MEDIUM - An information processing apparatus configured to recognize a touch of a recognition object onto an operation surface based on a proximity state between the operation surface and the recognition object, includes a position detection unit configured to detect an instruction position indicated by the recognition object, and an identification unit configured to identify a position at which the instruction position detected by the position detection unit is estimated to stop moving while the operation surface and the recognition object are located closer to each other than a predetermined distance, as a position touched by the recognition object. | 02-26-2015 |
20150054736 | Modifying Information Presented by an Augmented Reality Device - An approach is provided to control information display at an augmented reality device. In the approach, a biometric value is received from a biometric input device. The biometric input device is a device that receives biometric data from a user of the augmented reality device. The received biometric value is compared to a number of previously established biometric input ranges that correspond to the user. Each of the biometric input ranges corresponds to a different display policy. The comparison identifies a selected display policy. The display detail of the augmented reality device is then automatically set according to the selected display policy. | 02-26-2015 |
20150054737 | Instruction Triggering Method and Device, User Information Acquisition Method and System, Terminal, and Server - The present disclosure provides a method and apparatus for triggering an instruction, methods and systems for obtaining user information, a terminal, and a server. The instruction triggering method includes: detecting a shaking operation of a mobile terminal; and triggering a preset input instruction according to the detected shaking operation. The user information obtaining method includes: when a mobile terminal detects a shaking operation, the mobile terminal sending a user information obtaining request to a server; and the mobile terminal receiving, from the server, user information of a matching user returned according to the user information obtaining request. Another user information obtaining method includes: after a server receives a user information obtaining request triggered by a shaking operation of a mobile terminal, obtaining a user who matches the user information obtaining request and sending user information of the matching user to the mobile terminal. The present disclosure improves the convenience of operations and provides a widely applicable experience of randomly making friends. | 02-26-2015 |
20150054738 | DISPLAY APPARATUS AND CONTROL METHOD - A display apparatus and a control method capable of preventing its user from viewing an image in an improper viewing position are provided. The display apparatus includes: an imaging unit that captures a moving image in a predetermined range with respect to an image display direction; an image analyzer that analyzes the moving image captured by the imaging unit, and calculates a position of a target that should be guided to a proper viewing position; and a display controller that causes a display unit to perform display to guide the target to the proper viewing position when the target position calculated by the image analyzer is at an improper viewing position. | 02-26-2015 |
20150054739 | DISPLAY DIRECTION CONTROL FOR DIRECTIONAL DISPLAY DEVICE - An information processing device connected to a display device capable of varying display contents according to display directions, the information processing device including: a position information acquiring unit to acquire position information on an object within a range from which the display device is viewable; a display direction determining unit to determine, based on the position information, a first display direction in which a first partial area of the display device is viewable from a position of the object and a second display direction in which a second partial area of the display device is viewable from the position of the object; and a display control unit that causes the display device to display contents so that the contents are displayed in the first display direction in the first partial area and the contents are displayed in the second display direction in the second partial area. | 02-26-2015 |
20150061989 | ATTENTION-BASED RENDERING AND FIDELITY - Methods and systems for attention-based rendering on an entertainment system are provided. A tracking device captures data associated with a user, which is used to determine that a user has reacted (e.g., visually or emotionally) to a particular part of the screen. The processing power is increased in this part of the screen, which increases detail and fidelity of the graphics and/or updating speed. The processing power in the areas of the screen that the user is not paying attention to is decreased and diverted from those areas, resulting in decreased detail and fidelity of the graphics and/or decreased updating speed. | 03-05-2015 |
20150061990 | OPHTHALMIC LENS SYSTEM CAPABLE OF INTERFACING WITH AN EXTERNAL DEVICE - The present invention provides an energizable ophthalmic lens system capable of wirelessly interfacing with an external device. The energizable ophthalmic lens system may dynamically interact with a specified external device, wherein a user may operate one or more functionalities within the external device through the energizable ophthalmic lens system. The external device may be able to recognize eye gestures, which may comprise deliberate eye and lid movements. The external device may operate a functionality within the ophthalmic lens system, wherein the operation may be based on information received from the ophthalmic lens system. The ophthalmic lens system may comprise at least one energizable ophthalmic lens. Multiple ophthalmic lenses may be preferable where the functionality of either or both the ophthalmic lens system or the external device may occur based on relative position data or communication between lenses. | 03-05-2015 |
20150061991 | METHOD AND APPARATUS FOR MANIPULATING CONTENT IN AN INTERFACE - A machine implemented method includes sensing entities in first and second domains. If a first stimulus is present and an entity is in the first domain, the entity is transferred from first to second domain via a bridge. If a second stimulus is present and an entity is in the second domain, the entity is transferred from second first domain via the bridge. At least some of the first domain is outputted. An apparatus includes a processor that defines first and second domains and a bridge that enables transfer of entities between domains, an entity identifier that identifies entities in the domains, a stimulus identifier that identifies stimuli, and a display that outputs at least some of the first domain. The processor transfers entities from first to second domain responsive to a first stimulus, and transfers entities from second to first domain responsive to a second stimulus. | 03-05-2015 |
20150061992 | METHOD AND APPARATUS FOR MANIPULATING CONTENT IN AN INTERFACE - A machine implemented method includes sensing entities in first and second domains. If a first stimulus is present and an entity is in the first domain, the entity is transferred from first to second domain via a bridge. If a second stimulus is present and an entity is in the second domain, the entity is transferred from second first domain via the bridge. At least some of the first domain is outputted. An apparatus includes a processor that defines first and second domains and a bridge that enables transfer of entities between domains, an entity identifier that identifies entities in the domains, a stimulus identifier that identifies stimuli, and a display that outputs at least some of the first domain. The processor transfers entities from first to second domain responsive to a first stimulus, and transfers entities from second to first domain responsive to a second stimulus. | 03-05-2015 |
20150061993 | TERMINAL APPARATUS, DISPLAY METHOD, RECORDING MEDIUM, AND DISPLAY SYSTEM - A terminal apparatus provides a highly convenient user interface. The terminal apparatus includes a designated direction detection unit and a display control unit. The designated direction detection unit detects a designated direction, which is a direction in which the terminal apparatus is directed, with respect to a reference direction as a reference of the direction of the terminal apparatus on the basis of a signal from a sensor that outputs a signal indicating the attitude of the terminal apparatus. The display control unit displays an image, in which the direction in which the object is directed with respect to the reference direction is maintained and the object is disposed at the end of a direction corresponding to the designated direction, on a display device. | 03-05-2015 |
20150061994 | GESTURE RECOGNITION METHOD AND WEARABLE APPARATUS - A wearable apparatus includes a user interface, a motion sensor, a microprocessor and a central processing unit (CPU). In an operation mode, the motion sensor senses a current hand movement trajectory (HMT). The microprocessor generates a velocity curve along a coordinate axis according to the current HMT, and samples the velocity curve according to a first predetermined velocity and a second predetermined velocity to output velocity sampling points. The microprocessor further determines whether a matching number between the velocity sampling points and velocity feature points is greater than a threshold. The current HMT matches a predetermined HMT when the matching number is greater than the threshold. The CPU performs a system operation corresponding to the predetermined HMT when the current HMT matches the predetermined HMT. | 03-05-2015 |
20150061995 | PORTABLE EYE TRACKING DEVICE - A portable eye tracker device is disclosed which includes a frame, at least one optics holding member, a movement sensor, and a control unit. The frame may be a frame adapted for wearing by a user. The at least one optics holding member may include at least one illuminator configured to selectively illuminate at least a portion of at least one eye of the user, and at least one image sensor configured to capture image data representing images of at least a portion of at least one eye of the user. The movement sensor may be configured to detect movement of the frame. The control unit may be configured to control the at least one illuminator for the selective illumination of at least a portion of at least one eye of the user, receive the image data from the image sensors, and receive information from the movement sensor. | 03-05-2015 |
20150061996 | PORTABLE EYE TRACKING DEVICE - A portable eye tracker device is disclosed which includes a frame, at least one optics holding member, and a control unit. The frame may be adapted for wearing by a user and include an eyeglass frame having a nose bridge. The at least one optics holding member may include at least one illuminator configured to selectively illuminate at least a portion of at least one eye of the user and at least one image sensor configured to capture image data representing images of at least a portion of at least one eye of the user. The optics holding member may be coupled with the nose bridge. The control unit may be configured to control the at least one illuminator for the selective illumination of at least a portion of at least one eye of the user, and receive the image data from the at least one image sensor. | 03-05-2015 |
20150061997 | WEARABLE WATCH-TYPE TERMINAL AND SYSTEM EQUIPPED WITH THE SAME - A mobile terminal including a main body configured to be worn on a user's wrist; a wireless communication unit configured to wirelessly communicate with a glasses-type terminal worn by the user, said glasses-type terminal including a camera configured to capture an image including operational data for operating an external apparatus; and a controller configured to receive the operational data from the glasses-type terminal, store the operational data in a memory associated with the mobile terminal, and transmit a control signal to the external apparatus to control the external apparatus according to the stored operational data. | 03-05-2015 |
20150061998 | APPARATUS AND METHOD FOR DESIGNING DISPLAY FOR USER INTERACTION - The present invention relates to an apparatus and method for designing a display for user interaction. The proposed apparatus includes an input unit for receiving physical information of a user and a condition depending on a working environment. A space selection unit selects an optimal near-body work space corresponding to the condition received by the input unit. A space search unit calculates an overlapping area between a viewing frustum space, defined by a relationship between a gaze of the user and an optical system of a display enabling a 3D image to be displayed, and the optimal near-body work space selected by the space selection unit. A location selection unit selects a location of a virtual screen based on results of calculation. An optical system production unit produces an optical system in which the virtual screen is located at the location selected by the location selection unit. | 03-05-2015 |
20150061999 | WEARABLE DISPLAY AND METHOD OF CONTROLLING THEREFOR - The present specification relates to a wearable display and a method of controlling therefor, and more particularly, to a method of updating information displayed in the wearable display by recognizing opening and closing of eyes of a user wearing the wearable display. | 03-05-2015 |
20150062000 | HEAD MOUNTED DISPLAY APPARATUS - When a hand of the user is recognized in an image pickup region of a camera, a head mounted display monitors behavior of the hand in the image pickup region. When the hand of the user in the image pickup region reaches an outer peripheral region forming an outer periphery of the image pickup region, a notification is given to the user. | 03-05-2015 |
20150062001 | ELECTRONIC DEVICE AND METHOD OF PROCESSING USER INPUT BY ELECTRONIC DEVICE - A method of processing a user input by an electronic device is provided. The method includes receiving a signal from an auxiliary electronic device, providing application information related to the auxiliary electronic device based on the signal, and identifying a program configured to receive a control signal transmitted from the auxiliary electronic device based on the application information. | 03-05-2015 |
20150062002 | METHOD AND APPARATUS FOR CONTROLLING SCREEN OF MOBILE DEVICE - A method and an apparatus for controlling a screen of a mobile device are provided. The method includes detecting an object from an image captured through an image sensor, detecting movement of a location of the object to move a location of the screen, and moving and displaying the screen according to the detected movement of the location of the object. According to the present disclosure, a screen can be moved and displayed according to movement of an object detected from an image captured by an image sensor. Accordingly, users can easily operate mobile devices of various sizes only with one hand. | 03-05-2015 |
20150062003 | Method and System Enabling Natural User Interface Gestures with User Wearable Glasses - User wearable eye glasses include a pair of two-dimensional cameras that optically acquire information for user gestures made with an unadorned user object in an interaction zone responsive to viewing displayed imagery, with which the user can interact. Glasses systems intelligently signal process and map acquired optical information to rapidly ascertain a sparse (x,y,z) set of locations adequate to identify user gestures. The displayed imagery can be created by glasses systems and presented with a virtual on-glasses display, or can be created and/or viewed off-glasses. In some embodiments the user can see local views directly, but augmented with imagery showing internet provided tags identifying and/or providing information as to viewed objects. On-glasses systems can communicate wirelessly with cloud servers and with off-glasses systems that the user can carry in a pocket or purse. | 03-05-2015 |
20150062004 | Method and System Enabling Natural User Interface Gestures with an Electronic System - An electronic device coupleable to a display screen includes a camera system that acquires optical data of a user comfortably gesturing in a user-customizable interaction zone having a z | 03-05-2015 |
20150062005 | METHOD AND SYSTEM FOR PROVIDING USER INTERACTION WHEN CAPTURING CONTENT IN AN ELECTRONIC DEVICE - A method and system for executing an operation in an electronic device using a camera are provided. Each finger action is associated with an operation. Further, the operation is classified as a processing operation that is executed when capturing content, or a post-processing operation that is executed after capturing the content. The method executes the operation based on the fingerprint of the user. The electronic device comprises a fingerprint reader to read the fingerprint of the user and can be present external to a screen of the electronic device or can be integrated within the screen of the electronic device. | 03-05-2015 |
20150062006 | FEATURE TRACKING FOR DEVICE INPUT - A user can emulate touch screen events with motions and gestures that the user performs at a distance from a computing device. A user can utilize specific gestures, such as a pinch gesture, to designate portions of motion that are to be interpreted as input, to differentiate from other portions of the motion. A user can then perform actions such as text input by performing motions with the pinch gesture that correspond to words or other selections recognized by a text input program. A camera-based detection approach can be used to recognize the location of features performing the motions and gestures, such as a hand, finger, and/or thumb of the user. | 03-05-2015 |
20150062007 | PRIORITY CONTROL FOR DIRECTIONAL DISPLAY DEVICE - An information processing device includes: a unit to acquire position information on a first user and a second user; a display control unit to cause a display device capable of varying display contents according to display directions to display first contents so as to be viewable from a position of the first user and to display second contents so as to be viewable from a position of the second user; a unit to detect that the first user and the second user are in a predetermined positional relationship; and a unit to judge whether or not a user has viewing authority with respect to contents, wherein when the predetermined positional relationship is detected, the display control unit causes the display device to stop display of contents, for which at least one of the first user and the second user does not have viewing authority. | 03-05-2015 |
20150062008 | METHOD AND SYSTEM FOR ACHIEVING MOVING SYNCHRONIZATION IN REMOTE CONTROL AND COMPUTER STORAGE MEDIUM - The present disclosure relates to a method for achieving moving synchronization in remote control, which includes: obtaining a remote control instruction, and extracting a target speed and a remote moment from the remote control instruction; obtaining a local moving state of a controlled object, the local moving state including a local speed and a local moment of the controlled object; calculating a delay time according to the remote moment and the local moment; and moving the controlled object according to the delay time, the target speed, the local speed and a preset synchronization time. In addition, also provided is a system for achieving moving synchronization in remote control and a computer storage medium. The aforementioned method, system and computer storage medium make the moving synchronization smoother. | 03-05-2015 |
20150062009 | INFORMATION PROCESSING APPARATUS, METHOD, AND PROGRAM - An information processing apparatus includes a designation unit which designates an object, a sensing unit which senses a change exceeding a threshold value in at least a part of the object when a user zooms in on the object, and a zoom unit. If a change exceeding the threshold value occurs in at least a part of the object when the user zooms in on the object at the designated zoom rate, the zoom unit zooms in on the object at the zoom rate just below that at which the change occurs; if no change exceeding the threshold value occurs even at the designated zoom rate, the zoom unit zooms in on the object at the designated zoom rate. | 03-05-2015 |
20150070260 | Haptic Conversion System Using Segmenting and Combining - A system is provided that converts an input into one or more haptic effects using segmenting and combining. The system receives an input. The system further segments the input into a plurality of input sub-signals. The system further converts the plurality of input sub-signals into a haptic signal. The system further generates the one or more haptic effects based on the haptic signal. | 03-12-2015 |
20150070261 | Haptic Conversion System Using Frequency Shifting - A system is provided that converts an input into one or more haptic effects using frequency shifting. The system receives an input signal. The system further performs a fast Fourier transform of the input signal. The system further shifts one or more frequencies of the transformed input signal to one or more frequencies within a shift-to frequency range. The system further performs an inverse fast Fourier transform of the frequency-shifted signal, where the inversely transformed signal forms a haptic signal. The system further generates the one or more haptic effects based on the haptic signal. | 03-12-2015 |
20150070262 | CONTEXTUAL ANNOTATIONS OF A MESSAGE BASED ON USER EYE-TRACKING DATA - In one exemplary embodiment, a method includes the step of receiving eye tracking information associated with eye movement of a user of a computing system from an eye tracking system coupled to the computing system. The computing system is in a messaging mode of operation and is displaying an element of a message. Based on the eye tracking information, it is determined that a path associated with the eye movement associates an external object with a portion of the message. Information about the external object is automatically associated with the portion of the message. | 03-12-2015 |
20150070263 | Dynamic Displays Based On User Interaction States - A system and method enabling dynamic interaction between users and displays. Interaction states for a user are determined by tracking user motions and position within the field of view of one or more capture devices. Interaction states are defined by any number of factors, including one or more of a user's body position and body orientation. Once a user occupies an interaction state, an associated application layout is applied to a display. Application layout states may include which application objects are displayed for a given interaction state. Triggering an application state is driven by a transition event and a determination that a user occupies an interaction state. Monitoring user motion and position may be performed continuously, so that changes in interaction states can be determined and corresponding changes to application layout states can be applied to a display, thereby rendering the technology dynamic to user movement. | 03-12-2015 |
20150070264 | ANIMATED DOCUMENT USING AN INTEGRATED PROJECTOR - The embodiments presented herein describe integrating a projector in a book to display images or animations on one or more pages of the book. Specifically, the book may include a projector that is arranged in the form factor of the book. The book may also contain one or more mirrors that reflect an image from the projector onto a desired location on a page in the book. In one embodiment, the image is projected from the rear of the book onto a back side of the page that is opposite the front side of the page facing the user. So long as the material of the page is sufficiently translucent, the image projected on the back side of the page will be visible to the user looking at the front side of the page. | 03-12-2015 |
20150070265 | Systems and Methods for Visual Processing of Spectrograms to Generate Haptic Effects - Systems and methods for visual processing of spectrograms to generate haptic effects are disclosed. In one embodiment, a signal comprising at least an audio signal is received. One or more spectrograms may be generated based at least in part on the received signal. One or more haptic effects may be determined based at least in part on the spectrogram. For example, a generated spectrogram may be a two-dimensional image and this image can be analyzed to determine one or more haptic effects. Once a haptic effect has been determined, one or more haptic output signals can be generated. A generated haptic output signal may be output to one or more haptic output devices. | 03-12-2015 |
20150070266 | Gesture determination method and electronic device thereof - A gesture determination method is utilized for an electronic device. The gesture determination method includes executing a program, determining a gesture content supported by the program to obtain a supporting gesture content, detecting and determining ambient light intensity around the electronic device to generate a light intensity determination result, and deciding between a first sensing process and a second sensing process to sense and determine a gesture for operating the program according to the supporting gesture content and the light intensity determination result. | 03-12-2015 |
20150070267 | MISRECOGNITION REDUCING MOTION RECOGNITION APPARATUS AND METHOD - A misrecognition reducing motion recognition apparatus and method are provided. A photographing unit of the apparatus photographs an image within a vehicle and a controller detects a thermal change within a set region. The controller operates a corresponding device within the vehicle by recognizing a gesture motion from the photographed image in response to determining that the gesture motion exists by the detection of the thermal change. | 03-12-2015 |
20150070268 | INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD - An information processing apparatus includes: a memory, and a processor coupled to the memory and configured to: identify an end position of a last line of a plurality of lines in an object for reading displayed on a display screen, the object being subjected to determination of whether the object has been finished reading by a user based on detecting movement of a gaze position of the user on the display screen, determine a display position at which movement of the gaze position that is greater than or equal to a given distance from the identified end position is detected, and display a display object at the determined display position on the display screen, the display object being a destination of the gaze position after the user has finished reading the object. | 03-12-2015 |
20150070269 | DYNAMIC HAPTIC CONVERSION SYSTEM - A system is provided that dynamically converts an input signal into a haptic signal. The system generates effect objects, where an effect object includes an instruction to perform a haptic conversion algorithm on the input signal to convert the input signal into an output signal, and where an order of the effect objects is defined. The system further receives the input signal. The system further applies the effect objects to the input signal in the defined order, where the output signal of an effect object forms the haptic signal. The system further sends the haptic signal to a haptic output device, where the haptic signal causes the haptic output device to output haptic effects. | 03-12-2015 |
20150070270 | SYSTEMS, ARTICLES, AND METHODS FOR ELECTROMYOGRAPHY-BASED HUMAN-ELECTRONICS INTERFACES - Human-electronics interfaces in which at least two wearable electromyography (“EMG”) devices are operated to control virtually any electronic device are described. A first wearable EMG device is worn on a first part/location of a user's body and a second wearable EMG device is worn on a second part/location of the user's body. Muscle activity is detected by the two wearable EMG devices and corresponding communication signals are transmitted to an electronic device to control functions thereof. The two wearable EMG devices may communicate with one another. This configuration enables a user to perform elaborate gestures having multiple components (e.g., “two-arm” gestures) with each wearable EMG device detecting a different component, as well as separate gestures (e.g., separate “one-arm” gestures) individually detected and processed by each wearable EMG device. | 03-12-2015 |
20150070271 | TECHNIQUES FOR ADJUSTING A POSITION OF A DISPLAY DEVICE BASED ON A POSITION OF A USER - A technique for adjusting a display angle of an electronic apparatus includes capturing, by an image capturing module (of the electronic apparatus), images of a user with respect to a display device (of the electronic apparatus) at a first (reference) position and a second position. A processing module (of the electronic apparatus) provides an indication when the user's displacement in moving from the first position to the second position exceeds a predetermined threshold. A controlling module (of the electronic apparatus) causes a first driving module (of the electronic apparatus) to drive rotation of a first rotating module (of the electronic apparatus) in response to the processing module indicating the displacement has exceeded the predetermined threshold. The first rotating module is coupled to a base unit portion of the electronic apparatus. | 03-12-2015 |
20150070272 | APPARATUS, METHOD AND RECORDING MEDIUM FOR CONTROLLING USER INTERFACE USING INPUT IMAGE - A method of controlling a user interface using an input image is provided. The method includes storing operation executing information of each of one or more gesture forms according to each of a plurality of functions, detecting a gesture form from the input image, and identifying the operation executing information mapped on the detected gesture form to execute an operation according to a function which is currently operated. | 03-12-2015 |
20150070273 | USER INTERFACE BASED ON OPTICAL SENSING AND TRACKING OF USER'S EYE MOVEMENT AND POSITION - Methods, systems, and devices are disclosed for optical sensing and tracking of eye movement. In one aspect, a method for tracking the movement of an eye includes emitting light toward an eye of a user using multiple light sources substantially equally spaced from a photodetector module of a device, receiving at the photodetector module at least a partial retroreflection of the light emitted by each of the multiple light sources retroreflected from the eye, and determining a positional parameter of the eye based on differential values of the at least partial retroreflections corresponding to the multiple light sources. | 03-12-2015 |
20150070274 | METHODS AND SYSTEMS FOR DETERMINING 6DOF LOCATION AND ORIENTATION OF HEAD-MOUNTED DISPLAY AND ASSOCIATED USER MOVEMENTS - The technology described herein allows for a wearable display device, such as a head-mounted display, to be tracked within a 3D space by dynamically generating 6DoF data associated with an orientation and location of the display device within the 3D space. The 6DoF data is generated dynamically, in real time, by combining 3DoF location information and 3DoF orientation information within a user-centered coordinate system. The 3DoF location information may be retrieved from depth maps acquired from a depth sensitive device, while the 3DoF orientation information may be received from the display device equipped with orientation and motion sensors. The dynamically generated 6DoF data can be used to provide 360-degree virtual reality simulation, which may be rendered and displayed on the wearable display device. | 03-12-2015 |
20150070275 | SYSTEMS AND METHODS FOR NAVIGATING A SCENE USING DETERMINISTIC MOVEMENT OF AN ELECTRONIC DEVICE - Systems and methods are provided for scrolling the display of information based on the displacement of the electronic device. An electronic device can include a motion sensing component operative to detect movement of the electronic device (e.g., an accelerometer). The electronic device can display any suitable information, including information that is too large to display at a single instance on the display (e.g., a multi-page text document, or a large image). To view portions of the information that are not initially displayed (e.g., to scroll displayed information), the user can move the electronic device along the plane of the device. As the motion sensing component detects movement, the electronic device can scroll the displayed information to match the detected movement. In some embodiments, the electronic device can detect tilt movements and adjust the displayed information to reflect the tilted display. | 03-12-2015 |
20150070276 | Transparent Electronic Device - A method and system for displaying images on a transparent display of an electronic device. The display may include one or more display screens as well as a flexible circuit for connecting the display screens with internal circuitry of the electronic device. Furthermore, the display screens may allow for overlaying of images over real world viewable objects, as well as a visible window to be present on an otherwise opaque display screen. Additionally, the display may include active and passive display screens that may be utilized based on images to be displayed. | 03-12-2015 |
20150070277 | IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND PROGRAM - Provided is an image processing apparatus including a hand shape recognition unit that performs hand shape recognition on an input image to detect a position and a size of a hand with a specific shape in the input image, a determination region setting unit that sets a region in a vicinity of the hand on the input image as a determination region used to recognize a gesture performed using the hand, based on the position and the size of the hand, and a gesture recognition unit that recognizes the gesture by monitoring movement of the hand to the determination region. | 03-12-2015 |
20150077322 | TRANSLATION AND SCALE INVARIANT FEATURES FOR GESTURE RECOGNITION - Methods and apparatuses of the present disclosure are presented for recognizing a gesture of a gesture object in a plurality of recorded data objects, with the recorded data objects being recorded over time. In some embodiments, a method includes computing at least one set of gesture angles using the plurality of recorded data objects, wherein each of the gesture angles in the at least one set comprises an angle measurement between two positions of the gesture object, the two positions recorded in successive data objects in the plurality of recorded data objects, and recognizing the gesture based on the at least one set of gesture angles. In some embodiments, the method includes recognizing the gesture is based further on comparing the at least one set of gesture angles to a gesture model. | 03-19-2015 |
20150077323 | DYNAMIC OBJECT TRACKING FOR USER INTERFACES - Systems and approaches provide for user interfaces (UIs) that are based on object tracking. For example, the object may be a user's head or face. As the user moves his head or face and/or tilts a computing device, the content displayed on the computing device will adapt to the user's perspective. The content may include three-dimensional (3D) graphical elements projected onto a two-dimensional (2D) plane and/or the graphical elements can be associated with textural shading, shadowing, or reflections that change according to user or device motion to give the user the impression that the user is interacting with the graphical elements in 3D environment. To enhance the user experience, a state of motion of the device can be determined and jitter and/or latency corresponding to the rendering of content can be altered so as to minimize or decrease jitter when the device is stationary and/or to decrease or minimize latency when the device is in motion. | 03-19-2015 |
20150077324 | ORIENTATION ADJUSTABLE MULTI-CHANNEL HAPTIC DEVICE - A system that generates haptic effects on a haptically-enabled device determines an orientation of the haptically-enabled device and obtains one or more haptic effect channels. The system then assigns each of the haptic effect channels to a haptic output device on the haptically-enabled device based on the orientation. | 03-19-2015 |
20150077325 | MOTION DATA BASED FOCUS STRENGTH METRIC TO FACILITATE IMAGE PROCESSING - Apparatuses, systems, media and/or methods may involve facilitating an image processing operation. User motion data may be identified when a user observes an image. A focus strength metric may be determined based on the user motion data. The focus strength metric may correspond to a focus area in the image. Also, a property of the focus strength metric may be adjusted. A peripheral area may be accounted for to determine the focus strength metric. A variation in a scan pattern may be accounted for to determine the focus strength metric. Moreover, a color may be imparted to the focus area and/or the peripheral area. In addition, a map may be formed based on the focus strength metric. The map may include a scan pattern map and a heat map. The focus strength metric may be utilized to prioritize the focus area and/or the peripheral area in an image processing operation. | 03-19-2015 |
20150077326 | OPERATING ENVIRONMENT WITH GESTURAL CONTROL AND MULTIPLE CLIENT DEVICES, DISPLAYS, AND USERS - Embodiments described herein includes a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal. | 03-19-2015 |
20150077327 | INTERACTIVE VEHICLE WINDOW DISPLAY SYSTEM WITH USER IDENTIFICATION - A system for a vehicle includes a user identification subsystem operable to detect a user and an interactive display subsystem operable to generate output for display on a vehicle window in response to detection of the user. A method of operating a system for a vehicle includes detecting a user of the vehicle and generating output for display on a vehicle window visible from at least one of outside and inside the vehicle in response to detection of the user. | 03-19-2015 |
20150077328 | REMOTE CONTROL SYSTEMS AND METHODS FOR PROVIDING PAGE COMMANDS TO DIGITAL ELECTRONIC DISPLAY DEVICES - A remote control system, set forth by way of example and not limitation, includes a remote control device including one or more controls operative to develop button control signals in response to activation by a user. An interface device is responsive to the button control signals and is operative to provide device control signals via a wired connection to an electronic display device. The device control signals are operative to control an application running on the electronic display device. | 03-19-2015 |
20150077329 | EYE TRACKING-BASED USER INTERFACE METHOD AND APPARATUS - A method includes matching a pupil center position obtained from image information taken by a camera and a center position of a UI on a display panel of a terminal, and recognizing the match as a touch on the UI when the match between the pupil center position and the center position of the UI is kept for a predetermined time or more. | 03-19-2015 |
20150077330 | OPERATION SWITCH AND OPERATION DEVICE - The operation device includes a display part; a partition frame member arranged on the display part, and configured to divide the display part into a plurality of display surfaces; an operation button member having transparent operation buttons arranged on the respective display surfaces, and configured to join the operation buttons by integral molding to be able to deform the operation buttons in a pressing direction at respective one sides, functioning as fulcrums, of the display surfaces; switch parts arranged under the operation button member, and pressed by pressing of the operation buttons; and an output part configured to output information indicating whether or not the switch parts are pressed. | 03-19-2015 |
20150077331 | DISPLAY CONTROL DEVICE, DISPLAY CONTROL METHOD, AND PROGRAM - In an illustrative embodiment, a display control device is provided. The display control device includes a display control section to control display of a virtual object according to a positional relationship between an imaging device and a display device. | 03-19-2015 |
20150077332 | DISPLAY INFORMATION COLLECTING DEVICE AND HMI SYSTEM - An HMI display terminal unit includes a motion sensor; an IEEE802.11 I/F; and a CPU that transmits, to a smartphone via the IEEE802.11 I/F, data collected from a control device and HMI display data that shapes the collected data for display, wherein the CPU determines whether an operator has approached on the basis of the detection result of the motion sensor; when an operator has approached, acquires unique information on a smartphone of the operator via the IEEE802.11 I/F and authenticates the smartphone on the basis of the acquired unique information; when authentication is successful, transmits the HMI display data and the collected data; and, when authentication fails, does not transmit the HMI display data and the collected data. | 03-19-2015 |
20150077333 | Method and Apparatus for Processing Menu Layout - The disclosure provides a method and apparatus for processing menu layout, comprising: obtaining a moving direction and moving acceleration value of user equipment; judging whether the moving acceleration value of the user equipment exceeds a preset first threshold; and adjusting the menu layout of the user equipment according to the obtained moving direction of the user equipment in the case that the moving acceleration value of the user equipment exceeds the first threshold, wherein the menu layout is the arrangement of one or more icons on one or more pages. The method and apparatus for processing menu layout solve the problem in the related art that icons must be dragged one by one by hand when moved, which is time-consuming, makes it difficult to determine the arrangement position accurately, and degrades the user experience, thereby improving the user experience. | 03-19-2015 |
20150077334 | INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD - An information processing apparatus, includes: a memory, and a processor coupled to the memory and configured to: obtain a gaze position of a user on a display screen, determine whether an object for reading, displayed on the display screen, has been finished reading by the user, based on a threshold and the number of times where movement of the gaze position indicates a newline, the threshold being set in accordance with the number of lines in the object, and change at least one of the threshold and the number of times in response to detection that the movement of the gaze position becomes movement of a certain distance or greater in a second direction different from a first direction of movement indicating a newline. | 03-19-2015 |
20150077335 | INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD - An information processing apparatus includes: a memory, and a processor coupled to the memory and configured to: calculate a first distance between an end of a first line in an object for reading displayed on a display screen and a beginning of a second line that is next to the first line in the object, and set, in accordance with the first distance, a condition for determining whether the object has been finished reading by a user based on detecting movement of a gaze position of the user on the display screen. | 03-19-2015 |
20150077336 | Methods and Apparatus for Using the Human Body as an Input Device - Described are apparatus and methods for reconstructing a full human skeletal pose or a partial skeletal pose by aggregating and fusing various data from various sensors, and for detecting the occurrence of gestures from poses that occur over a period of time. | 03-19-2015 |
20150077337 | SYSTEM AND METHOD FOR INTERACTIVE VISUALIZATION OF INFORMATION IN AN AIRCRAFT CABIN - An interactive aircraft cabin window display system and a method for interactive visualization of information in an aircraft cabin are described. The display system includes a display assembly DA integratable into an aircraft cabin window, a passenger monitoring assembly PMA, an environment monitoring assembly EMA, and an information visualization assembly IVA. The DA is adapted for displaying variable images on a screen such as a semi-transparent screen integrated in the aircraft cabin window. The PMA may be implemented using for example an eye-tracking camera or a touch-screen and is adapted for detecting a direction into which a passenger is looking or pointing through the aircraft cabin window. The EMA is adapted for acquiring a representation such as an image of an environment outside the aircraft cabin window. The IVA is adapted for visualizing information on the screen at specific locations selected by taking into account the representation acquired by the EMA and taking into account the direction detected by the PMA. Accordingly, the interactive display system may detect which object a passenger is currently looking or pointing at and may interactively provide additional information about this object of interest, thereby improving the passenger's flight experience. | 03-19-2015 |
20150084848 | INTERACTION BETWEEN GENERIC INTERACTION DEVICES AND AN INTERACTIVE DISPLAY - Interaction techniques are described herein involving communications between an interactive display, an interactive system, and at least two generic interaction devices. The interactive system may process relative location information for the generic interaction devices, and the interactive system may cause the interactive display to depict interactions between generic, real-world interaction devices. This may allow for enhanced individual interaction between one or more physical generic interaction devices and a virtualized environment or a virtual world presented by the interactive display. This may also allow for other interactions between a set of generic interaction devices that can be interpreted and presented by the interactive display. | 03-26-2015 |
20150084849 | VEHICLE OPERATION DEVICE - A vehicle operation device includes a vibration sensing unit configured to sense vibration or rotation of a vehicle to generate vibration data; a gesture sensing unit configured to sense a user's hand to generate image data; a gesture recognition unit configured to analyze the image data to recognize a gesture and selectively compensate for the recognized gesture depending on the vibration data; and a control unit configured to perform a control operation corresponding to the gesture. | 03-26-2015 |
20150084850 | HEAD-MOUNTED DISPLAY AND METHOD OF CONTROLLING THE SAME - Disclosed herein are a head-mounted display and a method of controlling the same, more particularly, a method of performing rotation compensation on a captured image based on an angle of rotating a user wearing the head-mounted display and an angle of rotating a camera detached from the head-mounted display. | 03-26-2015 |
20150084851 | ELECTRONIC DEVICE HAVING PLURALITY OF HUMAN-MACHINE OPERATION MODULES - An electronic device having a plurality of human-machine operation modules is disclosed. The electronic device is internally provided with at least one I/O module and a plurality of human-machine operation modules disposed in the electronic device, several human-machine operation modules being connected with the I/O module respectively. | 03-26-2015 |
20150084852 | DISPLAY APPARATUS AND METHOD FOR MOTION RECOGNITION THEREOF - A display apparatus and a method for recognizing a motion thereof are provided. The method includes setting an active area in which the externally input motion is recognizable, wherein the active area corresponds to a portion on a screen of the display apparatus, changing the display apparatus from an overall area motion recognition mode to a partial area motion recognition mode configured to only recognize the externally input motion in the active area, recognizing the externally input motion in the active area, and disabling the partial area motion recognition mode in response to the recognized externally input motion being a disabling motion configured to disable the partial area motion recognition mode. | 03-26-2015 |
20150084853 | Method and System for Mapping a Movement Trajectory of an Emission Light Source to an Application Trajectory Thereof - An objective of the present invention is to provide a method and system for mapping a motion trace of a light-emitting source to an application trace thereof. Herein, an application detection device obtains imaging information of the light-emitting source; detects an input mode of the light-emitting source to determine an application mapping curve corresponding to the input mode; obtains a motion trace of the light-emitting source based on the imaging information; obtains an application trace corresponding to the motion trace by means of the application mapping curve; and outputs the application trace to an external device. Compared with the prior art, the present invention implements adaptively matching application mapping curves and obtaining application traces for different input modes of the light-emitting source, which improves user experience. | 03-26-2015 |
20150084854 | METHOD FOR OPERATING AN OPERATING DEVICE OF A MOTOR VEHICLE - An operating device of a motor vehicle has at least one display unit for displaying changeable information. Gaze detection is used to check whether the operator's gaze is directed at the at least one display unit. An input command in a first group of input commands is not executed if the operator's gaze is not directed at the at least one display unit for a certain time period during the input command. For a second group of input commands, the input command is executed irrespective of gaze detection. | 03-26-2015 |
20150084855 | MOBILE TERMINAL AND METHOD OF CONTROLLING THEREFOR - The present invention relates to a wearable mobile terminal of a glasses form enabling a user to more conveniently use the terminal and a method of controlling therefor. According to at least one embodiment of the present invention, various functions of the wearable mobile terminal can be executed based on a simple and easy gesture of the user. | 03-26-2015 |
20150084856 | COMBINER AND OPERATION DETECTION DEVICE - A combiner that reflects light projected by a projector toward an operator includes: a reflection layer that reflects the light projected by the projector; an electric field generation layer that generates an electric field around the combiner; and an output layer that outputs a voltage according to change of the electric field generated by the electric field generation layer. | 03-26-2015 |
20150084857 | IMAGE DISPLAY DEVICE, METHOD OF CONTROLLING IMAGE DISPLAY DEVICE, COMPUTER PROGRAM, AND IMAGE DISPLAY SYSTEM - An image display device includes a generating unit configured to generate an integrated image including first display regions where at least a part of a plurality of identification images for distinguishing a plurality of external devices connected to the image display device from one another are displayed as a list and a second display region where a display image, which is an image displayed by one external device selected out of the plurality of external devices, is displayed and an image display unit configured to cause a user of the image display device to visually recognize the integrated image. | 03-26-2015 |
20150084858 | DISPLAY DEVICE, CONTENT DISPLAY METHOD, AND A NON-TRANSITORY COMPUTER-READABLE STORAGE MEDIUM - A display device includes: a display unit configured to display an image; a display control unit configured to switch among a plurality of kinds of content to be displayed on the display unit; and a content evaluation unit configured to evaluate an affirmation level of content displayed on the display unit, wherein the display control unit determines content to be displayed on the display unit based on the affirmation level. | 03-26-2015 |
20150084859 | System and Method for Recognition and Response to Gesture Based Input - A method for user identification by using multiple sensing devices configured for sensing at least one characteristic associated with a user's gestures. The method includes the steps of: receiving data from the sensing devices indicative of at least one gesture of a user; identifying at least one gesture and at least one characteristic thereof from the data from each sensing device, using at least one gesture recognition analysis process; and identifying the user according to the identified at least one gesture and characteristics, wherein these steps are carried out via at least one processor of at least one user device. | 03-26-2015 |
20150084860 | SYSTEMS, ARTICLES, AND METHODS FOR GESTURE IDENTIFICATION IN WEARABLE ELECTROMYOGRAPHY DEVICES - Systems, articles, and methods for performing gesture identification with improved robustness against variations in use parameters and without requiring a user to undergo an extensive training procedure are described. A wearable electromyography (“EMG”) device includes multiple EMG sensors, an on-board processor, and a non-transitory processor-readable storage medium that stores data and/or processor-executable instructions for performing gesture identification. The wearable EMG device detects, determines, and ranks features in the signal data provided by the EMG sensors and generates a digit string based on the ranked features. The permutation of the digit string is indicative of the gesture performed by the user, which is identified by testing the permutation of the digit string against multiple sets of defined permutation conditions. A single reference gesture may be performed by the user to (re-)calibrate the wearable EMG device before and/or during use. | 03-26-2015 |
20150084861 | DISPLAY APPARATUS AND METHOD OF CONTROLLING DISPLAY APPARATUS - A display apparatus and a method of controlling the display apparatus are provided. The method includes detecting a user while the display apparatus is in a standby mode, in response to the detecting of the user, activating an area of a display unit to display at least one object, and in response to selection of one of the at least one object, controlling the display apparatus according to the selected object. | 03-26-2015 |
20150084862 | HEAD-MOUNTED DISPLAY, IMAGE DISPLAY SYSTEM, INFORMATION STORAGE DEVICE, AND METHOD FOR CONTROLLING HEAD-MOUNTED DISPLAY - A head-mounted display is worn on the head of the user, and allows the user to observe a display image, the head-mounted display including a display section that displays the display image, and a control section that performs a control process that controls the timing at which the display image is displayed on the display section. The control section performs the control process that causes a display period and a non-display period to repeat alternately, and sets one display period to be equal to or less than 600 ms, the display period being a period in which the display image is displayed on the display section, and the non-display period being a period in which the display image is not displayed on the display section. | 03-26-2015 |
20150091789 | ELECTRONIC DEVICE WITH A HEADS UP DISPLAY - Particular embodiments described herein provide for an electronic device that can include a circuit board coupled to a plurality of electronic components (which may include any type of components, elements, circuitry, etc.). One particular example implementation of an electronic device may include a display portion that includes: a display to be provided in front of an eye of a user; and a lens portion that includes a micro lens array and a convex lens, where the micro lens array and the convex lens cooperate in order to render a virtual image of an object to the user. | 04-02-2015 |
20150091790 | CLASSIFICATION OF GESTURE DETECTION SYSTEMS THROUGH USE OF KNOWN AND YET TO BE WORN SENSORS - An apparatus, a method, and a computer program product for gesture recognition. The apparatus classifies a gesture based on a movement of a body part as detected by a primary sensor. The apparatus determine a reliability level of a secondary sensor and obtains corroborating information associated with the movement of the body part using the secondary sensor when the reliability level satisfies a criterion. The apparatus then confirms or negates the classification of the gesture based on the corroborating information. The secondary sensor may be a sensor already known to the apparatus, i.e., the sensor is currently being worn by the user, or it may be a sensor that is worn by a user at a later time. In the latter case, the apparatus detects for the presence of a new sensor, determines the gesture recognition capabilities of the new sensor and integrates the new sensor into the gesture recognition process. | 04-02-2015 |
20150091791 | SYSTEMS AND METHODS FOR USING IMAGINED DIRECTIONS TO DEFINE AN ACTION, FUNCTION OR EXECUTION FOR NON-TACTILE DEVICES - A system and method for controlling a non-tactile device including a receiving device configured to receive signals corresponding to a user's brain waves or movements, the brain waves or movements corresponding to a series of directional intentions, the intentions defining at least one line pattern, a processor configured to process the at least one line pattern, each of said at least one line patterns associated with an action of the device, and output a control signal to the non-tactile device related to the action. | 04-02-2015 |
20150091792 | DISPLAY APPARATUS AND CONTROL METHOD THEREOF - Apparatuses and methods related to a display apparatus and a control method thereof, are provided. More particularly, the apparatuses and methods relate to a display apparatus and a control method thereof, in which brightness of an area of a screen is adjusted in consideration of an external light source reflected on the screen. | 04-02-2015 |
20150091793 | METHOD FOR CONTROLLING DEVICE ON THE BASIS OF EYEBALL MOTION, AND DEVICE THEREFOR - A method of controlling an operation of a display device using eye movements and a device for performing the method are provided. The method includes receiving eye movement information of a user; receiving blinking information of the user; generating a control command corresponding to the eye movement information and the blinking information; and controlling an operation of the display device based on the generated control command. | 04-02-2015 |
20150091794 | MOBILE TERMINAL AND CONTROL METHOD THEREOF - A mobile terminal including a wireless communication unit configured to provide wireless communication; a camera configured to obtain an image of at least one user; a display configured to display the image obtained by the camera; and a controller configured to extract a gaze direction of the at least one user from the captured image, and display a guide image for guiding the gaze direction based on the extracted gaze direction on the display in proximate relationship with the camera. | 04-02-2015 |
20150091795 | DISPLAY APPARATUS AND METHOD OF CONTROLLING THE SAME - There is provided a display device which includes a display panel, a backlight that has a plurality of light sources and applies light to the display panel, a human body detector configured to detect a human body located around the display panel, a human body information acquisition unit configured to acquire information on the human body, and a controller that controls an operation of a user recognition unit when the human body is detected in a standby mode, performs user recognition based on the acquired human body information, controls an operation of the display panel such that notification information of the user is displayed in a portion of the display panel, and turns on some of the plurality of light sources. According to the invention, for a user who views the display device, local dimming is performed on the backlight, which provides content information desired by the user while utilizing only a portion of the display device. Therefore, it is possible to increase dynamic contrast and reduce power consumption more than with global dimming. Accordingly, the user may enter a personalized mode of a TV, and content information is provided to the user in the standby mode. | 04-02-2015 |
20150091796 | DISPLAY APPARATUS AND CONTROL METHOD THEREOF - A display apparatus includes: a display configured to be bendable; a signal processor configured to process an image signal to display an image on the display; a detector configured to detect a bending state of the display; a camera configured to detect a sightline of a user; and a controller configured to control the signal processor to adjust a displayed state of the image for at least one of a first area and a second area in an entire display area of the display in response to the bending state of the display being detected by the detector, wherein the first area corresponds to the sightline of the user detected by the camera, and wherein the second area is a different area from the first area. | 04-02-2015 |
20150091797 | APPARATUS, SYSTEM, AND METHOD FOR SIMULATING PHYSICAL MOVEMENT OF A DIGITAL IMAGE - An apparatus, system, and method are disclosed for simulating physical movement of a digital image. The apparatus includes an input receiving module, a calculation module, and an output module. The input receiving module is configured to receive a position input identifying a physical unit of measure. The calculation module is configured to correlate the physical unit of measure to a position of an image positioning coordinate. The output module is configured to output the position of the image positioning coordinate. | 04-02-2015 |
20150097766 | ZOOMING WITH AIR GESTURES - An NUI system for mediating input from a computer-system user. The NUI system includes a logic machine and an instruction-storage machine. The instruction-storage machine holds instructions that cause the logic machine to receive data tracking a change in conformation of the user including at least a hand trajectory of the user. If the data show increasing separation between two hands of the user, the NUI system causes a foreground process of the computer system to be displayed in greater detail on the display. If the data show decreasing separation between the two hands of the user, the NUI system causes the foreground process to be represented in lesser detail. | 04-09-2015 |
20150097767 | SYSTEM FOR VIRTUAL EXPERIENCE BOOK AND METHOD THEREOF - Provided are a virtual experience book system and a method for operating the same, the system including a virtual experience book producing device configured to generate 3D virtual image content in consideration of a plurality of objects extracted from text information of a story, and a virtual experience book control device configured to detect a body motion of a user while displaying the 3D virtual image content, and control a represented image of the 3D virtual image content so that a viewpoint is switched or arrangement locations of 3D models are changed according to the detected body motion. | 04-09-2015 |
20150097768 | ENHANCED FIELD OF VIEW TO AUGMENT THREE-DIMENSIONAL (3D) SENSORY SPACE FOR FREE-SPACE GESTURE INTERPRETATION - The technology disclosed relates to enhancing the fields of view of one or more cameras of a gesture recognition system for augmenting the three-dimensional (3D) sensory space of the gesture recognition system. The augmented 3D sensory space allows for inclusion of previously uncaptured regions and points for which gestures can be interpreted, i.e., blind spots of the cameras of the gesture recognition system. Some examples of such blind spots include areas underneath the cameras and/or within 20-85 degrees of a tangential axis of the cameras. In particular, the technology disclosed uses a Fresnel prismatic element and/or a triangular prism element to redirect the optical axis of the cameras, giving the cameras fields of view that cover at least 45 to 80 degrees from tangential to the vertical axis of a display screen on which the cameras are mounted. | 04-09-2015 |
20150097769 | PROGRAMMABLE, INTERACTIVE DISPLAY RECEPTACLE WITH USE MONITORING AND INDEPENDENT ACTIVATION, DEACTIVATION, AND CHANGE CAPABILITIES - A receptacle having a programmable, interactive visual display affixed to a surface of the receptacle. The receptacle includes the visual display, a programmable memory, and a controller. The memory stores data corresponding to one or more display images and/or text, and the controller controls the display for displaying the image/text data from the memory. The receptacle may further include an input mechanism for receiving at least one input, and the controller may control the display of images/text data in response to the input. The memory can also store at least one game or other program, and the controller can execute the game or other program from the memory, operate the game or other program in response to one or more inputs received via the input mechanism, and control the display based on the requirements of the game or other program. The display may include an audio component for producing audible sound. | 04-09-2015 |
20150097770 | METHOD AND APPARATUS FOR CONTROLLING MULTI-EXPERIENCE TRANSLATION OF MEDIA CONTENT - A method or apparatus for controlling a media device using gestures may include, for example, modifying media content to generate first updated media content according to a comparison of first information descriptive of a first environment of the source device to second information descriptive of a second environment of the recipient device, capturing images of a gesture, identifying a command from the gesture, and modifying the first updated media content to generate second updated media content according to the command. Other embodiments are disclosed. | 04-09-2015 |
20150097771 | INFORMATION PROCESSING APPARATUS AND OPERATION METHOD OF INFORMATION PROCESSING APPARATUS - An information processing apparatus includes: an apparatus body including a body having a keyboard and a display unit attached to the body so as to be opened and closed; an acceleration sensor mounted on the apparatus body; a gesture motion determination unit mounted on the apparatus body and determining a reference posture of a holding state of the apparatus body taken when a user executes a given gesture function while holding the apparatus body based on acceleration detected by the acceleration sensor as well as determining a gesture motion executed by the user by detecting posture change of the apparatus body from the reference posture based on acceleration detected by the acceleration sensor; and an operation execution unit mounted on the apparatus body and executing a given operation corresponding to the gesture motion executed by the user based on the determination result in the gesture motion determination unit. | 04-09-2015 |
20150102993 | PROJECTOR-CAMERA SYSTEM WITH AN INTERACTIVE SCREEN - A projector-camera system includes a projector coupled to back project a first image on a translucent diffusing screen. A camera is coupled to capture a second image from a back side of the translucent diffusing screen. The second image includes the first image back projected on the translucent diffusing screen and a shadow of a pointing device cast on a front side of the translucent diffusing screen. The pointing device is on the front side of the translucent diffusing screen and is in close proximity to the translucent diffusing screen. A processing block is coupled to the projector and the camera to generate a third image including the shadow of the pointing device. The processing block is further coupled to activate a command in a main computer coupled to the processing block in response to a relative position of the shadow of the pointing device in the third image. | 04-16-2015 |
20150102994 | SYSTEM AND METHOD FOR MULTI-TOUCH GESTURE DETECTION USING ULTRASOUND BEAMFORMING - Methods, systems, computer-readable media, and apparatuses for gesture detection using ultrasound beamforming are presented. In some embodiments, a method for gesture detection utilizing ultrasound beamforming includes projecting an ultrasound wave parallel to a surface, wherein the ultrasound wave is projected utilizing ultrasound beamforming. The method further includes receiving an ultrasound echo from an object in contact with the surface. The method additionally includes interpreting a gesture based at least in part on the received ultrasound echo. | 04-16-2015 |
20150102995 | AUTOMATIC VIEW ADJUSTMENT - A view adjustment system using information captured by one or more sensors on a client device determines a projection direction for content to be displayed on a display of the client device. Upon determining the projection direction, the view adjustment system transforms the content into a perspective view based on the determined projection direction and prompts the client device to present the content in the perspective view to a user. The view adjustment system may monitor changes in relative position and/or direction of the user with respect to the display, adjust the projection direction, and transform the content to reflect these changes. | 04-16-2015 |
20150102996 | DISPLAY APPARATUS AND POWER-SAVING PROCESSING METHOD THEREOF - A power-saving processing method and a display apparatus are provided. The method includes sensing an external environment of the display apparatus, determining a level of viewing concentration based on a result of the sensing, and displaying content by performing a power-saving processing operation at a level corresponding to the level of viewing concentration among a plurality of levels of the power-saving processing operation. | 04-16-2015 |
20150102997 | 3D INTERACTION APPARATUS, DISPLAY DEVICE INCLUDING THE SAME, AND METHOD OF DRIVING THE SAME - Provided are a three-dimensional (3D) interaction apparatus capable of recognizing a user's motions in a 3D space for performing a 3D interaction function, a display device including the 3D interaction apparatus, and a method of driving the 3D interaction apparatus. The 3D interaction apparatus includes a depth camera which obtains a depth image including depth information of a distance between an object and the depth camera; an active optical device disposed in front of the depth camera and configured to adjust a propagation path of light by refracting incident light so as to adjust a field of view of the depth camera; and a driving unit which controls operation of the active optical device. | 04-16-2015 |
20150102998 | PROJECTION-TYPE PROJECTOR, ANTI-GLARE METHOD, AND PROGRAM FOR ANTI-GLARE - A head detection means | 04-16-2015 |
20150102999 | DISPLAY APPARATUS - A display apparatus including an image module and a micro deflecting array is provided. The image module is configured to provide a plurality of image beams, wherein these image beams contain a plurality of sets of image information of different viewing angles, and the micro deflecting array is disposed on the transmission paths of these image beams. The micro deflecting array has a plurality of micro deflecting units arranged in an array, and these micro deflecting units are grouped into a plurality of micro deflecting groups interlaced with each other, and these micro deflecting groups respectively deflect these image beams to a plurality of directions, and a distribution range of azimuth angles of these directions, with respect to an optical axis of the micro deflecting array, in a direction perpendicular to the optical axis of the micro deflecting array occupies at least a part of 360 degrees. | 04-16-2015 |
20150103000 | EYE-TYPING TERM RECOGNITION - Various embodiments related to entering text into a computing device via eye-typing are disclosed. For example, one embodiment provides a method that includes receiving a data set including a plurality of gaze samples, each gaze sample including a gaze location and a corresponding point in time. The method further comprises processing the plurality of gaze samples to determine one or more likely terms represented by the data set. | 04-16-2015 |
20150109191 | Speech Recognition - Example methods and systems activate and deactivate a voice interface based on gaze directions. A computing device can define a range of voice-activation gaze directions and, in some cases, define a range of social-cue gaze directions, where the range of social-cue gaze directions overlaps the range of voice-activation gaze directions. The computing device can determine a gaze direction. The computing device determines whether the gaze direction is within the range of voice-activation gaze directions. In response to determining that the gaze direction is within the range of social-cue gaze directions, the computing device can activate a voice interface. In response to determining that the gaze direction is not within the range of social-cue gaze directions, the computing device can deactivate the voice interface. | 04-23-2015 |
20150109192 | IMAGE SENSING SYSTEM, IMAGE SENSING METHOD, EYE TRACKING SYSTEM, EYE TRACKING METHOD - An image sensing system comprising: a controller; and an image sensing module. If the image sensing module performs a low resolution image sensing and the controller senses a trigger operation, the controller controls the image sensing module to switch from the low resolution image sensing to a high resolution image sensing. | 04-23-2015 |
20150109193 | USE OF HUMAN INPUT RECOGNITION TO PREVENT CONTAMINATION - Embodiments of a system and method for processing and recognizing non-contact types of human input to prevent contamination are generally described herein. In example embodiments, human input is captured, recognized, and used to provide active input for control or data entry into a user interface. The human input may be provided in variety of forms detectable by recognition techniques such as speech recognition, gesture recognition, identification recognition, and facial recognition. In one example, the human input recognition techniques are used in connection with a device cleaning workflow used to obtain data and human input during cleaning procedures while minimizing cross-contamination between the contaminated device or person and other objects or persons. In another example, the human input recognition techniques are used in connection with a device tracking workflow used to obtain data and human input while tracking interactions with and locations of the contaminated or uncontaminated device. | 04-23-2015 |
20150109194 | INFORMATION PROCESSING DEVICE, CONTROL METHOD THEREOF, AND PROGRAM - An information processing device includes: an input unit that receives input; a detecting unit that detects the posture of the information processing device; and a control unit that controls the operation of the information processing device in accordance with the posture of the information processing device which the detecting unit detects after confirming the presence or absence of a particular input via the input unit. | 04-23-2015 |
20150109195 | INFORMATION PROCESSING TERMINAL AND PROCESSING METHOD THEREFOR - An information processing terminal and a processing method therefor are provided. The information processing terminal comprises an execution component, a manipulation component and a display component, wherein the manipulation component is used as an external manipulation device of the execution component. The execution component also comprises a preprocessing module and an operation conversion module. The preprocessing module is used for preprocessing a control signal generated by each control module of the manipulation component, converting same into a signal corresponding to the control signal input in each application program, and generating a respective preprocessing record for each application program. The operation conversion module is used for converting the control signal generated by each control module of the manipulation component according to the preprocessing record of a certain application program during the running process of the application program. | 04-23-2015 |
20150109196 | GESTURE CONTROL - The present invention relates to devices, systems, and methods for detecting gestures. The devices, systems, and methods use optical shape sensing devices for tracking and monitoring users. This allows unhindered, robust tracking of persons in different settings. The devices, systems, and methods are especially useful in health care institutions. | 04-23-2015 |
20150109197 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM - There is provided an information processing apparatus including an imaging unit configured to shoot an image showing that a phalanx of at least one or more fingers of a hand is in contact with a contacting object, an identification unit configured to identify an input element among a plurality of input elements, the input element corresponding to the phalanx in contact with the contacting object in a captured image from the imaging unit, and an output unit configured to output an identification result from the identification unit to a processing apparatus that performs input processing corresponding to the plurality of input elements. | 04-23-2015 |
20150109198 | PROJECTOR, IMAGE OUTPUT APPARATUS, AND CONTROL METHOD THEREOF - A projector, an image output apparatus, and a control method thereof are provided. In the method, a search signal transmitted by a controller is received, and accordingly at least one user apparatus supporting a wireless protocol is searched for in the surroundings by using the wireless protocol, and a wireless connection is established with each user apparatus. Then, a control signal transmitted by the controller is received, and accordingly one of the at least one user apparatus is selected. An image provided by the selected user apparatus is received through the wireless connection and projected. | 04-23-2015 |
20150109199 | System and Method for Display Management Based on User Attention Inputs - A system and method are provided for managing data being displayed on at least one monitor screen based on monitoring the user's attention in relation to the monitor screen. In one embodiment, upon detecting that the user's attention is leaving at least a portion of a screen, the system may alert the user of such an event. Alternatively, the system could alert the user upon detecting a triggering condition while the user's attention is away from the at least a portion of the screen. The step of alerting the user may include modifying at least a portion of a display on a monitor not being viewed by the user. Additionally, the system may initiate preparation of a report including any data not being viewed by a user during the time period when the user is not viewing a portion of the monitor. | 04-23-2015 |
20150109200 | IDENTIFYING GESTURES CORRESPONDING TO FUNCTIONS - Disclosed herein are a method, electronic device, and non-transitory computer readable medium for detecting gestures. A head gesture is detected. It is identified whether the head gesture corresponds to a function based at least partially on an image pattern, an angular velocity pattern, and an acceleration pattern of the head gesture. The function corresponding to the head gesture is performed, when the head gesture corresponds to a function. | 04-23-2015 |
20150109201 | SEMICONDUCTOR DEVICE - Provided is a device capable of displaying an image with small power consumption by adjusting output of the device in accordance with the physical and mental state of a user, such as tiredness or excitement. An electronic device includes one or a plurality of sensor portions which collects data from a user; a chaotic attractor generation portion which calculates a chaotic attractor by arithmetic processing of the collected data; and a Lyapunov index generation portion which calculates an index indicating the degree to which the chaotic attractor is matched to chaos definition conditions. Output of the electronic device is changed in accordance with the physical and mental state of a user. | 04-23-2015 |
20150109202 | SYSTEMS, ARTICLES, AND METHODS FOR GESTURE IDENTIFICATION IN WEARABLE ELECTROMYOGRAPHY DEVICES - Systems, articles, and methods perform gesture identification with limited computational resources. A wearable electromyography (“EMG”) device includes multiple EMG sensors, an on-board processor, and a non-transitory processor-readable memory that stores data and/or processor-executable instructions for performing gesture identification. The wearable EMG device detects and determines features of signals when a user performs a physical gesture, and processes the features by performing a decision tree analysis. The decision tree analysis invokes a decision tree stored in the memory, where storing and executing the decision tree may be managed by limited computational resources. The outcome of the decision tree analysis is a probability vector that assigns a respective probability score to each gesture in a gesture library. The accuracy of the gesture identification may be enhanced by performing multiple iterations of the decision tree analysis across multiple time windows of the EMG signal data and combining the resulting probability vectors. | 04-23-2015 |
20150109203 | PERFORMING METHOD OF DEVICE CAPABLE OF ADJUSTING IMAGES ACCORDING TO BODY MOTION OF USER - A performing method of a device capable of adjusting images according to a body motion of a user, the device comprising a display unit, at least one camera unit, and a control unit electrically coupled to the display unit and the camera unit, the performing method comprising: capturing at least one motion image of the user by the at least one camera unit; receiving the at least one motion image of the user by the control unit; and adjusting the display unit according to the at least one motion image of the user by the control unit, whereby the images displayed by the display unit move as the user does. | 04-23-2015 |
20150109204 | HUMAN-MACHINE INTERACTION METHOD AND APPARATUS - Embodiments of the present invention disclose a human-machine interaction method and apparatus. The method includes: capturing a line-of-sight direction; determining a change of an angle between a line of sight and a screen, and/or a change of a distance between a user and the screen according to the line-of-sight direction; and performing, according to the change of the angle and/or the change of the distance, a corresponding operation on content displayed on the screen. Corresponding to the foregoing method, the foregoing apparatus includes: a line-of-sight tracking unit, a processing unit, and an executing unit. In the present invention, an action of a user can be determined without depending on gravity sensing, and a corresponding operation can be performed on content displayed on a screen. | 04-23-2015 |
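The method above maps a change in the line-of-sight angle and/or the user-screen distance to an operation on displayed content. A hedged sketch of one such mapping (the operation names and thresholds are invented for illustration):

```python
# Illustrative sketch: choose an operation from line-of-sight changes.
# Thresholds and operation names are hypothetical assumptions.

def choose_operation(angle_delta, distance_delta,
                     angle_step=10.0, distance_step=0.1):
    """Map angle change (degrees) and distance change (meters) to an action."""
    if abs(distance_delta) >= distance_step:      # user moved toward/away
        return "zoom_in" if distance_delta < 0 else "zoom_out"
    if abs(angle_delta) >= angle_step:            # gaze angle tilted
        return "scroll_down" if angle_delta < 0 else "scroll_up"
    return "none"

# Gaze tilted down 12 degrees; distance essentially unchanged.
op = choose_operation(angle_delta=-12.0, distance_delta=0.02)
```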
20150109205 | DISPLAY APPARATUS AND DISPLAY METHOD THEREOF - A display method includes displaying content on a screen and if a non-contact user motion having directivity is input, analyzing a direction of input non-contact user motion and dividing the screen into plural screens according to the directivity of the non-contact user motion. | 04-23-2015 |
20150116196 | LED DISPLAY MODULE, AN LED TV AND AN LED TV SYSTEM - The present invention discloses an LED display module, an LED TV and an LED TV system in which a camera component is embedded in the LED display module. In this way, the LED TV is capable of taking pictures from the middle of the display screen. Because the participants are looking at the display screen, participants at the local and remote ends can make eye contact, which improves the sensory feeling of video interaction and allows more information to be obtained from eye expressions. Because the camera component can take pictures from the front side, the accuracy of captured feature information is improved, and the accurate feature information facilitates accurate subsequent action analysis. As a result, the present invention realizes wider application of the LED TV. | 04-30-2015 |
20150116197 | SYSTEMS AND METHODS FOR DISPLAYING THREE-DIMENSIONAL IMAGES ON A VEHICLE INSTRUMENT CONSOLE - A system includes a gaze tracker configured to provide gaze data corresponding to a direction that an operator is looking. One or more processors are configured to analyze the gaze data to determine whether a display is in a central vision of the operator or whether the display is in a peripheral vision of the operator. The processors are further configured to provide a first type of image data to the display if the display is in the central vision and a second type of image data to the display if the display is in the peripheral vision. The first type of image data includes first three-dimensional (3D) image data that produces a first 3D image when the display is within the central vision. The second type of image data includes second 3D image data that produces a second 3D image when the display is within the peripheral vision. | 04-30-2015 |
20150116198 | DEVICE AND METHOD FOR DISPLAYING MULTIMEDIA CONTENT - Various aspects of a device and a method for displaying multimedia content are disclosed herein. The method is included in a display device. The method determines a first orientation information of a first viewing position with respect to a display screen associated with the display device. The display screen displays the multimedia content having one or more objects. The method displays a mirror image of the one or more objects on the display screen based on the determined first orientation information. | 04-30-2015 |
20150116199 | HEAD MOUNTED DISPLAY AND IMAGING METHOD THEREOF - A head mounted display (HMD) and an imaging method thereof are provided. The HMD comprises a beam splitter, a pico projector, an application processor, an eye image sensor, an adjustment apparatus, an application specific integrated circuit (ASIC), and an eyeglass frame. The application processor controls the pico projector to project a beam. The eye image sensor captures an eye image. The ASIC performs a perspective correction on the eye image to generate a frontal eye image. The ASIC obtains a beam location of the beam and obtains a pupil location according to the frontal eye image. The ASIC according to the beam location and the pupil location controls the adjustment apparatus to adjust the beam splitter. The eyeglass frame carries the beam splitter, the pico projector, the application processor, the eye image sensor and the ASIC. | 04-30-2015 |
20150116200 | SYSTEM AND METHOD FOR GESTURAL CONTROL OF VEHICLE SYSTEMS - A method and system for gestural control of a vehicle system including tracking a motion path of a grasp hand posture upon detecting an initiation dynamic hand gesture in a spatial location associated with the motorized vehicle system, wherein the initiation dynamic hand gesture is a sequence from a first open hand posture to the grasp hand posture, controlling a feature of the vehicle system based on the motion path and terminating control of the feature upon detecting a termination dynamic hand gesture, wherein the termination dynamic hand gesture is a sequence from the grasp hand posture to a second open hand posture. | 04-30-2015 |
20150116201 | METHOD AND APPARATUS FOR MARKING ELECTRONIC DOCUMENT - A method and an apparatus for marking electronic document are provided. An image sequence of a user is captured by an image capturing unit, and an eye tracking procedure, a nod detecting procedure and a blink detecting procedure are executed for analyzing the image sequence. A first position on the electronic document where a current sightline of the user falls when the user performs a first specific action is obtained, and a marking action is executed for marking the electronic document along with a movement trajectory of the user's eyes starting from the first position. A second position on the electronic document where the current sightline falls when the user performs a second specific action is obtained, and the marking action is finished at the second position. | 04-30-2015 |
20150116202 | IMAGE PROCESSING DEVICE AND METHOD, AND PROGRAM - The present technology relates to an image processing device and a method thereof, and a program which can present a more natural stereoscopic image. | 04-30-2015 |
20150116203 | IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND PROGRAM - An apparatus includes a receiver configured to receive from a detector a location of a gazing point of a user on an image, and an image processor configured to perform an adjustment process to adjust visual characteristics of the image based on the location of the gazing point of the user on the image. | 04-30-2015 |
20150116204 | TRANSPARENT DISPLAY VIRTUAL TOUCH APPARATUS NOT DISPLAYING POINTER - The present invention provides a transparent display virtual touch apparatus, worn on a user's face and located in front of an eye of the user, that can be operated precisely and can identify contents regardless of the direction and location of the user. The present invention includes a transparent display portion, worn on the user's face and located in front of an eye of the user, for displaying contents on a display; a first image obtaining portion, attached to one side of the transparent display portion, for capturing a location of the eye of the user; a second image obtaining portion, attached to one side of the transparent display portion, for capturing the body of the user; and a virtual touch processing portion for detecting first space coordinates and second space coordinates using 3D coordinate data calculated from the images captured by the first image obtaining portion and the second image obtaining portion, and for calculating contact point coordinate data for the point on the display surface of the transparent display portion met by a line connecting the first space coordinates and the second space coordinates. | 04-30-2015 |
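The contact-point computation described in the entry above reduces to intersecting the line through two 3D points (eye and body/fingertip coordinates) with the display surface. A minimal sketch, assuming the display lies in the plane z = 0 (that plane, and the sample coordinates, are assumptions for illustration):

```python
# Illustrative sketch: intersect the eye->fingertip line with a display
# plane z = plane_z. The plane placement and coordinates are hypothetical.

def contact_point(eye, fingertip, plane_z=0.0):
    """Return (x, y, z) where the eye->fingertip line meets z = plane_z."""
    ex, ey, ez = eye
    fx, fy, fz = fingertip
    if fz == ez:                      # line parallel to the display plane
        return None
    t = (plane_z - ez) / (fz - ez)    # parameter along the line
    return (ex + t * (fx - ex), ey + t * (fy - ey), plane_z)

# Eye at z = 2 behind the display, fingertip at z = 1 in front of the eye.
point = contact_point((0.0, 0.0, 2.0), (0.5, 0.25, 1.0))
```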
20150116205 | THRESHOLDS FOR DETERMINING FEEDBACK IN COMPUTING DEVICES - The present application is related to a computer for providing output to a user. The computer includes a processor and an input device in communication with the processor. The input device includes a feedback surface and at least one sensor in communication with the feedback surface, the at least one sensor configured to detect a user input to the feedback surface. The processor varies a down-stroke threshold based on a first factor and varies an up-stroke threshold based on a second factor. The down-stroke threshold determines a first output of the computing device, the up-stroke threshold determines a second output of the computing device, and at least one of the first factor or the second factor are determined based on the user input. | 04-30-2015 |
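The separate down-stroke and up-stroke thresholds described above form a hysteresis pair: the input must cross the higher threshold to trigger the first output and fall below the lower one to trigger the second. A hedged sketch (threshold values, the force trace, and the event names are invented):

```python
# Illustrative sketch of a down-stroke/up-stroke threshold pair.
# Values and event names are hypothetical, not the patented design.

def detect_events(forces, down_threshold=1.0, up_threshold=0.5):
    """Emit 'down'/'up' events as the force crosses the two thresholds."""
    events = []
    pressed = False
    for f in forces:
        if not pressed and f >= down_threshold:
            pressed = True
            events.append("down")
        elif pressed and f <= up_threshold:
            pressed = False
            events.append("up")
    return events

# A press-and-release trace: force rises past 1.0, then falls below 0.5.
trace = [0.2, 0.8, 1.2, 0.9, 0.6, 0.4, 0.1]
events = detect_events(trace)
```

The gap between the two thresholds is what prevents a force hovering near a single threshold from chattering between down and up events.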
20150116206 | SCREEN OPERATION APPARATUS AND SCREEN OPERATION METHOD - A screen operation apparatus for enabling operation of a screen by an operator, the screen operation apparatus comprising: a processor configured to operate as an image input unit configured to obtain images of a face of the operator at a predetermined time interval; a focus condition determination unit configured to, using the images obtained by the image input unit, determine whether or not the operator is focusing on the screen; a face direction condition determination unit configured to, using the images obtained by the image input unit, determine whether or not a face direction of the operator satisfies a predetermined condition; and a screen operation unit configured to execute a predetermined screen operation when the face direction condition determination unit determines that the face direction of the operator satisfies the predetermined condition and the focus condition determination unit determines that the operator is focusing on the screen. | 04-30-2015 |
20150116207 | ELECTRONIC DEVICE AND CONTROL METHOD FOR SCREEN THEREOF - The disclosed invention provides an electronic device, which comprises a screen, an imaging lens, a light sensor and a control unit. The imaging lens is provided for capturing images in front of the screen. The light sensor is provided for sensing ambient light. The control unit is provided for determining whether human eyes in the images gaze at the screen. When the control unit determines that the human eyes have gazed at the screen over a predetermined time, it further adjusts the screen to enter an eye protection mode according to the ambient light, in which the eye protection mode comprises an adjustment of brightness, contrast, color or refresh rate of the screen. | 04-30-2015 |
20150116208 | TERMINAL APPARATUS, INFORMATION PROCESSING APPARATUS, AND DISPLAY CONTROL METHOD - A control unit displays, on a display device, a screen according to screen data received from an information processing apparatus. At the same time, the control unit receives a next input operation predicted by the information processing apparatus and screen data according to the predicted next input operation from the information processing apparatus and stores them in a memory. In addition, when, upon detecting a next input operation, the next input operation matches the predicted next input operation, the control unit displays, on the display device, a screen based on the screen data stored in the memory. | 04-30-2015 |
20150116209 | ELECTRONIC DEVICE AND METHOD FOR CONTROLLING BUTTONS OF ELECTRONIC DEVICE - In a method for controlling buttons of an electronic device, a facial image of a user is captured by an image capturing unit installed in the electronic device. The method determines a button of the electronic device which corresponds to a point of focus on the electronic device, based on the facial image. When an audio signal of the user is detected by an audio collection unit installed in the electronic device, a control command is recognized from the audio signal, and a function of the button is activated to perform the required function of the electronic device based on the control command. | 04-30-2015 |
20150116210 | EXHIBITION DEVICE - An exhibition device includes a support stage, a display panel, a position sensor, a camera module, a memory, and a processor. The support stage includes a support surface defining a support area for supporting an exhibition object. The position sensor is aligned with the support area. The position sensor detects whether the exhibition object is picked up. The camera module is located on the display panel and is oriented toward the support area to capture images in real time. The memory stores basic information and detail information of the exhibition object. The processor analyzes the images to determine whether a face feature exists in the images, controls the display panel to display the basic information if the size of the face feature becomes bigger, and controls the display panel to display the detail information when the exhibition object is picked up. | 04-30-2015 |
20150116211 | METHOD AND SYSTEM FOR CONTROLLING EXTERNAL OUTPUT OF A MOBILE DEVICE - A method and system are provided that control an external output function of a mobile device according to control interactions received via the microphone. The method includes activating a microphone according to preset optional information when the mobile device enters an external output mode, performing an external output operation in the external output mode, detecting an interaction based on sound information in the external output mode, and controlling the external output according to the interaction. | 04-30-2015 |
20150116212 | VIEWER-BASED DEVICE CONTROL - A computing device can monitor the gaze direction of people around the device to determine whether any unintended viewers are viewing content displayed on the device. A user can activate a privacy mode of the device, such that when an unintended viewer is detected the device can take an appropriate action. In some cases the device can notify the primary user of the device using audible, visual, or tactile alerts. In other cases, the device can modify the display of content such that the unintended viewer is unable to decipher or view the content, or is otherwise notified of the detection. | 04-30-2015 |
20150116213 | INPUT UNIT - There is provided an input unit adapted for non-contact input manipulation, which permits a user to smoothly accomplish an intended input manipulation. The input unit includes: a position detecting portion for detecting a position of a manipulating object such as a user's hand manipulating the input unit; a position change detecting portion for detecting a change in the position of a point on the manipulating object based on a detection output from the position detecting portion, the point being the closest to the position detecting portion; and an image display section. The position change detecting portion detects the change in the position of the point closest to the position detecting portion in a predetermined area. The image display section changes the display image according to a detection output from the position change detecting portion. | 04-30-2015 |
20150123889 | Electronic Device with Orientation Detection and Methods Therefor - A portable electronic device | 05-07-2015 |
20150123890 | TWO HAND NATURAL USER INPUT - Embodiments are disclosed which relate to two hand natural user input. For example, one disclosed embodiment provides a method comprising receiving first hand tracking data regarding a first hand of a user and second hand tracking data regarding a second hand of the user from a sensor system. The first hand tracking data and the second hand tracking data temporally overlap. A gesture is then detected based on the first hand tracking data and the second hand tracking data, and one or more aspects of the computing device are controlled based on the gesture detected. | 05-07-2015 |
20150123891 | METHODS FOR AUTOMATICALLY ASSESSING USER HANDEDNESS IN COMPUTER SYSTEMS AND THE UTILIZATION OF SUCH INFORMATION - In some embodiments, a system and/or method may assess handedness of a user of a system in an automated manner. The method may include displaying a 3D image on a display. The 3D image may include at least one object. The method may include tracking a position and an orientation of an input device in open space in relation to the 3D image. The method may include assessing a handedness of a user based on the position and the orientation of the input device with respect to at least one of the objects. In some embodiments, the method may include configuring at least a portion of the 3D image based upon the assessed handedness. The at least a portion of the 3D image may include interactive menus. In some embodiments, the method may include configuring at least a portion of an interactive hardware associated with the system based upon the assessed handedness. | 05-07-2015 |
20150123892 | LOCATING METHOD, LOCATING DEVICE, DEPTH DETERMINING METHOD AND DEPTH DETERMINING DEVICE OF OPERATING BODY - A locating method, a locating device, a depth determining method, and a depth determining device of an operating body are provided. The locating method includes following steps: deriving an image that includes an operating body; scanning the image transversely according to each of scan lines of the image, and deriving each of width values of the operating body in each of the scan lines according to the bright dot information in each of the scan lines of the image; calculating a relative variation level between each of the width values in the adjacent scan line sequentially; when the relative variation level is less than a threshold, determining a locating point of the operating body according to one of the scan lines and one of the width values corresponding to the relative variation level. | 05-07-2015 |
20150123893 | REMOTE CONTROLLER FOR MOTION RECOGNITION - The present invention relates to a method for enabling motion recognition without recognizing a hand and a remote controller for motion recognition which can suggest an efficient motion and can be implemented at a low cost in a more stable manner. The remote controller for motion recognition, according to the present invention, comprises: an image acquisition unit for acquiring consecutive images of a hand; a difference image configuration unit for obtaining difference images of the acquired consecutive images; a hand trajectory analysis unit for analyzing a moving trajectory of the hand from the difference images; a motion recognition unit for recognizing a hand motion by analyzing the hand trajectory; and a remote signal transmitter for transmitting a remote signal corresponding to the recognized hand motion such that a device to be controlled can be controlled. | 05-07-2015 |
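The difference-image step above can be sketched as frame differencing over consecutive grayscale frames, with the centroid of changed pixels serving as one point on the hand's trajectory. The frames, sizes, and change threshold below are invented for illustration:

```python
# Illustrative sketch: difference image between consecutive frames,
# then the centroid of changed pixels as a hand-trajectory point.

def difference_image(frame_a, frame_b, threshold=10):
    """Binary mask of pixels whose intensity changed between two frames."""
    return [
        [1 if abs(a - b) > threshold else 0 for a, b in zip(ra, rb)]
        for ra, rb in zip(frame_a, frame_b)
    ]

def centroid(mask):
    """Centroid (x, y) of the changed pixels, or None if nothing moved."""
    points = [(x, y) for y, row in enumerate(mask)
              for x, v in enumerate(row) if v]
    if not points:
        return None
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

frame1 = [[0, 0, 0], [0, 0, 0], [0, 0, 0]]
frame2 = [[0, 0, 0], [0, 200, 0], [0, 0, 0]]   # motion appears at (1, 1)
mask = difference_image(frame1, frame2)
point = centroid(mask)
```

Collecting one centroid per frame pair yields the moving trajectory that the hand trajectory analysis unit would then interpret.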
20150123894 | DIGITAL DEVICE AND CONTROL METHOD THEREOF - Disclosed are a digital device and a control method thereof. The digital device includes a communication unit to transmit/receive a signal with an external device; a gesture sensor unit to sense a gesture with respect to the digital device; and a processor to control the communication unit and the gesture sensor unit, and to provide a first mode corresponding to a first event when occurrence of the first event is detected; transmit a first signal, commanding provision of a second mode corresponding to a second event, to the external device when occurrence of the second event is detected during provision of the first mode; and switch from the first mode to the second mode when detecting a first gesture with respect to the digital device after transmission of the first signal. | 05-07-2015 |
20150123895 | IMAGE DISPLAY SYSTEM, METHOD OF CONTROLLING IMAGE DISPLAY SYSTEM, AND HEAD-MOUNT TYPE DISPLAY DEVICE - An image display system includes a head-mount type display device, and an input device adapted to operate the head-mount type display device, the input device includes a motion detection section adapted to detect a motion of a finger of a user, and the head-mount type display device includes an operation control section adapted to make the user visually recognize a virtual operation section as a virtual image, the virtual operation section being used for operating the head-mount type display device, and corresponding to the motion of the finger detected. | 05-07-2015 |
20150123896 | DATA PROCESSOR AND METHOD FOR DISPLAYING DATA THEREBY - A novel data processor which can display a plurality of images arranged in a predetermined order, a novel method for displaying data, or a novel program is provided. The data processor includes an input/output unit which supplies operation instructions, an arithmetic unit which determines data marked as a starting point according to the operation instructions to generate image data, and a display portion which displays the image data. | 05-07-2015 |
20150123897 | GESTURE DETECTION SYSTEM, GESTURE DETECTION APPARATUS, AND MOBILE COMMUNICATION TERMINAL - A gesture detection system having a gesture detection apparatus to detect a gesture of a user, and a mobile communication terminal that can communicate with the gesture detection apparatus, includes a storage unit to store first gesture data defining the gesture of the user, and audio or visual data associated with the first gesture data; an obtainment unit to obtain second gesture data representing the gesture of the user; a transmission unit to transmit the second gesture data to the mobile communication terminal; a determination unit to determine whether the gesture defined by the first gesture data is the same as the gesture represented by the second gesture data; a selection unit to select the audio or visual data associated with the first gesture data depending on the determination result; and an output unit to output the audio or visual data. | 05-07-2015 |
20150123898 | DIGITAL DEVICE AND CONTROL METHOD THEREOF - Disclosed are a digital device and a control method thereof. The digital device comprising: a communication unit configured to transmit/receive a signal with an external device; a gesture sensor unit configured to sense a gesture with respect to the digital device; and a processor configured to control the communication unit and the gesture sensor unit, wherein the processor is further configured to: provide a first mode corresponding to a first event when occurrence of the first event is detected; transmit a first signal, commanding provision of a second mode corresponding to a second event, to the external device when occurrence of the second event is detected during provision of the first mode; and switch from the first mode to the second mode when detecting a first gesture with respect to the digital device after transmission of the first signal. | 05-07-2015 |
20150130695 | CAMERA BASED AUTO SCREEN ROTATION - Systems and methods may provide for receiving an image of a user of a mobile device and analyzing the image to determine whether a rotation condition is present with respect to the mobile device relative to a face of the user. Additionally, content on a screen of the mobile device may be rotated if the rotation condition is present. In one example, the analysis of the image includes identifying one or more of a head shape and an eye position in the image, wherein the image is received while the user is lying down. | 05-14-2015 |
20150130696 | USE OF LIGHT TRANSMISSION THROUGH TISSUE TO SENSE JOINT FLEXURE - Various embodiments relate to apparatuses and methods of using light transmission through living tissue, such as a finger, to detect the flexure of a joint. Light is introduced into the tissue at one point, passes through the tissue, and exits the tissue at a second point where a sensor receives the light as it exits the tissue. Transmission of light through living tissue such as a finger can be affected by movement of the finger. As the finger flexes and, for example, the joints of the finger change angle, the characteristics of the light exiting the tissue, such as the intensity of the light, can change. These changes in characteristics can be used as an indirect means of determining the flexure of the joint. | 05-14-2015 |
20150130697 | USE OF LIGHT TRANSMISSION THROUGH TISSUE TO DETECT FORCE - Various embodiments relate to apparatuses and methods of using light transmission through compressed living tissue to detect force. Transmission of light through living tissue such as a finger is affected by how much the tissue is compressed, for example by the finger pressing on a surface. Light is introduced into the tissue, passes through the tissue, and a sensor receives the light exiting the tissue. The compression of the tissue can be determined using various characteristics of the received light, such as the light intensity, as determined based at least partly on sensor readings. | 05-14-2015 |
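The force-from-light idea above could be sketched as a calibration from received intensity to estimated force. The abstract only says transmission is affected by compression, so the linear model below (rest intensity, sensitivity, and the assumption that compression reduces intensity) is entirely an invented stand-in for a real device's calibration:

```python
# Hedged sketch: map received light intensity to an estimated force.
# The linear calibration and its constants are hypothetical assumptions.

def estimate_force(intensity, rest_intensity=100.0, sensitivity=0.05):
    """Estimate force (N) from intensity, assuming compression dims the light."""
    drop = rest_intensity - intensity   # assumed: more compression, less light
    return max(0.0, drop * sensitivity)

# Intensity fell from the resting 100.0 to 80.0 as the finger pressed down.
force = estimate_force(80.0)
```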
20150130698 | WEARABLE GLOVE ELECTRONIC DEVICE - A glove electronic device includes a plurality of peripherals, each of which is affixed to an adhesive cover and attachable to a location in at least one of finger portions, a wrist portion and a hand portion of the glove electronic device. Configuration of the plurality of peripherals on the glove electronic device is adaptable during use of the glove electronic device. The device also includes a transceiver for sending information captured by one or more of the plurality of peripherals and for transmitting information to one or more of the plurality of peripherals. The device further includes a processor configured to operate one or more of the plurality of peripherals responsive to a movement associated with the glove electronic device or body signals being sensed from sensors attached or worn on the body. | 05-14-2015 |
20150130699 | HEAD MOUNTED DISPLAY AND METHOD OF CONTROLLING THEREFOR - The present specification relates to a head mounted display and a method of controlling therefor, and more particularly, when a user wearing the head mounted display captures an image, a method of storing a captured image according to whether a preview via an image preview interface exists or not. | 05-14-2015 |
20150130700 | ASSEMBLED ELECTRONIC APPARATUS AND CONTROL METHOD THEREOF - An assembled electronic apparatus and a control method thereof are provided. The assembled electronic apparatus includes a first body and a second body. A second processor shares a partial content of a sensing record generated by a sensing module through a second information sharing module. The first body and the second body are connected with each other through the first connector and the second connector. After being connected with each other, a message is transmitted by one of the first processor and the second processor to another one of the first processor and the second processor, so that a function is executed by the another one of the first processor and the second processor through the corresponding first processor or the corresponding second processor according to the message. The first processor shares a content of the sensing record generated by the sensing module through a first information sharing module. | 05-14-2015 |
20150130701 | ARRANGEMENT FOR PHYSICALLY MOVING TWO DIMENSIONAL, THREE DIMENSIONAL AND/OR STEREOSCOPIC THREE DIMENSIONAL VIRTUAL OBJECTS - An arrangement for moving virtual object images within a real space. The arrangement includes a virtual object image creator. A tangible object in electrical and/or electronic communication with the virtual object image creator such that when a virtual object image is generated by the virtual object image creator, electrical and/or electronic communication between the tangible object and the generated virtual object image is such, that upon movement of the tangible object there is a translation to corresponding movement of the generated virtual object image within the real space. | 05-14-2015 |
20150130702 | INFORMATION PROCESSING APPARATUS, CONTROL METHOD, AND PROGRAM - There is provided an information processing apparatus including an information acquiring unit provided in the information processing apparatus to be worn on a head of a user, the information acquiring unit being configured to acquire facial expression information of the user, and a control unit configured to perform certain control depending on whether or not the facial expression information of the user acquired by the information acquiring unit agrees with facial expression information of a partner. | 05-14-2015 |
20150130703 | System and Method for Dynamic Content Delivery Based on Gaze Analytics - A method of presenting content to a subject based on eye position measurements is provided. The method includes presenting the subject with content. While presenting the content to the subject, one or more of the subject's eye positions is measured. The method further includes continuously performing the operations of generating a variability metric using the one or more measured eye positions and comparing the variability metric with a predetermined baseline to determine an attention state of the subject. Upon detection of a change in the attention state, the presentation of the content is modified. | 05-14-2015 |
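The variability metric above can be sketched as the spread of recent gaze samples around their centroid, compared against a predetermined baseline to classify the attention state. The gaze coordinates, baseline value, and state names below are illustrative assumptions:

```python
# Illustrative sketch: gaze-variability metric vs. a baseline.
# Coordinates, baseline, and state labels are hypothetical.

def variability(positions):
    """Mean distance of gaze samples from their centroid."""
    n = len(positions)
    cx = sum(p[0] for p in positions) / n
    cy = sum(p[1] for p in positions) / n
    return sum(((x - cx) ** 2 + (y - cy) ** 2) ** 0.5
               for x, y in positions) / n

def attention_state(positions, baseline=5.0):
    """'focused' when gaze variability stays below the baseline."""
    return "focused" if variability(positions) < baseline else "wandering"

# Four samples clustered tightly around one screen location.
steady = [(100, 100), (101, 100), (100, 101), (101, 101)]
state = attention_state(steady)
```

Run continuously over a sliding window, a change in the returned state would be the trigger for modifying the presented content.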
20150130704 | FACE TRACKING FOR ADDITIONAL MODALITIES IN SPATIAL INTERACTION - A user device receives an image stream from the user side of the user device and an image stream from a target side of the user device. The user device acquires a coordinate system for the user, acquires its own coordinate system, and relates the two coordinate systems to a global coordinate system. The user device then determines whether the user has moved and/or whether the user device has moved. Movement of the user and/or the user device is used as input modalities to control the user's interactions in the augmented reality environment. | 05-14-2015 |
20150130705 | METHOD FOR DETERMINING LOCATION OF CONTENT AND AN ELECTRONIC DEVICE - A method is provided comprising: detecting, by an electronic device, whether at least a threshold amount of eye-gaze data is collected; selecting a first area in a display of the electronic device based on the eye-gaze data when the threshold amount of eye-gaze data is collected; and displaying a content item in the first area. | 05-14-2015 |
20150130706 | HAPTIC TRIGGER CONTROL SYSTEM - A system that controls a haptic effect experienced at a trigger is provided. The system receives a haptic effect definition including haptic data. The system further receives trigger data including at least one of: a position of a trigger of a peripheral device; or a range of the trigger of the peripheral device. The system further determines whether a trigger condition is reached based on the received trigger data. The system further sends a haptic instruction and the haptic effect definition to the peripheral device when the trigger condition is reached. The system further causes a haptic output device (or multiple haptic output devices) to produce haptic effects that are based on the haptic effect definition at the peripheral device in response to the haptic instruction. | 05-14-2015 |
20150130707 | HAPTIC SPATIALIZATION SYSTEM - A system is provided that controls a haptic effect experienced at a peripheral device. The system receives a haptic effect definition including haptic data. The system further receives spatialization data including: a distance of the haptic effect; a direction of the haptic effect; or a flow of the haptic effect. The system further includes modifying the haptic effect definition based on the received spatialization data. The system further includes sending a haptic instruction and the modified haptic effect definition to the peripheral device. The system further includes causing one or more haptic output devices to produce one or more haptic effects based on the modified haptic effect definition at the peripheral device in response to the haptic instruction. | 05-14-2015 |
20150130708 | METHOD FOR PERFORMING SENSOR FUNCTION AND ELECTRONIC DEVICE THEREOF - An electronic device and a method for performing a plurality of sensor functions are provided. The method includes detecting at least one of a subject brightness and a peripheral brightness with a sensor formed with a plurality of pixels, reading out a preset pixel group of the plurality of pixels, and performing at least one sensor function based on the read out pixel group. | 05-14-2015 |
20150130709 | GESTURE DETECTING DEVICE, GESTURE RECOGNITION DEVICE - An object of the present invention is to enable smart devices such as smartphones to provide new operability to users. A gesture detecting device includes a motion detecting unit and a processor. The processor calculates a deviation value from the output of the motion detecting unit. At the timing when the deviation value exceeds a threshold value during shaking out, the processor obtains a direction determination value for the shake-out, whereas at the timing when the deviation value falls below a threshold value during shaking the smartphone back, the processor obtains a direction determination value for the shake-back. Further, on the basis of the combination of the two direction determination values, the processor detects a gesture of the gesture detecting device. | 05-14-2015 |
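The shake-out/shake-back scheme this abstract describes amounts to hysteresis thresholding on a motion deviation value, with a direction sampled at each threshold crossing. A minimal Python sketch of that idea — the function name, axis labels, units, and threshold are all hypothetical illustrations, not taken from the patent:

```python
import math

def detect_shake_gestures(samples, threshold=1.5):
    """Detect shake-out/shake-back gesture pairs from motion samples.

    samples: list of (ax, ay) accelerometer readings (hypothetical units).
    Returns a list of (out_direction, back_direction) pairs, where each
    direction is the dominant axis and its sign at the threshold crossing.
    """
    gestures = []
    out_dir = None
    above = False            # hysteresis state: currently above threshold?
    for ax, ay in samples:
        deviation = math.hypot(ax, ay)   # deviation value from motion unit
        direction = ('x+' if ax >= 0 else 'x-') if abs(ax) >= abs(ay) \
            else ('y+' if ay >= 0 else 'y-')
        if not above and deviation > threshold:
            above = True
            out_dir = direction                   # sampled at shake-out
        elif above and deviation < threshold:
            above = False
            gestures.append((out_dir, direction))  # sampled at shake-back
    return gestures
```

A gesture is only emitted once the full out-and-back pair has been observed, which is what makes the combination of the two direction values usable as a discriminator.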
20150130710 | STATE-BASED AUXILIARY DISPLAY OPERATION - Described is a technology by which routing of data may be automatically modified based on detected state data of a computing system. For example, user input may be routed from an actuator set to a host computer system when the host computer system is in an online state, or to an auxiliary computing device when the host computer system is offline. State may be determined based on one or more various criteria, such as online or offline, laptop lid position, display orientation, current communication and/or other criteria. The auxiliary display and/or actuator set may be embedded in the host computer system, or each may be separable from it or standalone, such as a remote control or cellular phone. | 05-14-2015 |
20150130711 | HEAD MOUNTED DISPLAY AND METHOD OF CONTROLLING THEREFOR - A method of controlling a head mounted display (HMD) according to one embodiment of the present specification includes performing a first operation, receiving a first voice input through an audio input unit, processing the first voice input with respect to the first operation while a first contact is detected through a first sensor positioned at a nose pad of the HMD, detecting the first contact being released through the first sensor positioned at a nose pad of the HMD, receiving a second voice input through the audio input unit while the first contact is released, and performing a second operation according to the received second voice input. | 05-14-2015 |
20150138061 | SYNCHRONOUS COMMUNICATION SYSTEM AND METHOD - A method and computing system for providing, using one or more computing devices, a synchronous communication session for a plurality of users of a social network. A first video stream of a first user of the plurality of users is rendered within a primary viewing field associated with the synchronous communication session. A swiping gesture is received proximate the primary viewing field. | 05-21-2015 |
20150138062 | Pressure Sensing Via Bone Conduction - Concepts and technologies are disclosed herein for pressure sensing via bone conduction. According to one aspect, a device can receive a modified signal after a signal has propagated through a body of a user and a surface with which the user is in contact. The modified signal can include the signal as modified by the body of the user and the surface. The device can compare the modified signal to a baseline signal. The device can determine, based upon the comparison of the modified signal to the baseline signal, a change between the modified signal and the baseline signal. The device can determine, based upon the change between the modified signal and the baseline signal, a pressure applied by the user to the surface. The pressure can be used for various applications. | 05-21-2015 |
20150138063 | MOTION CONTROL OF A VIRTUAL ENVIRONMENT - An optical flow of depth video of a depth camera imaging a human subject is recognized. An energy field created by motion of the human subject is generated as a function of the optical flow and specified rules of a physical simulation of the virtual environment. The energy field is mapped to a virtual position in the virtual environment. A property of a virtual object in the virtual environment is adjusted based on a plurality of energy elements of the energy field in response to the virtual object interacting with the virtual position of the energy field. | 05-21-2015 |
20150138064 | Systems and Methods for Performing Multi-Touch Operations on a Head-Mountable Device - Embodiments described herein may provide a configuration of input interfaces used to perform multi-touch operations. An example device may involve: (a) a housing arranged on a head-mountable device, (b) a first input interface arranged on either a superior or an inferior surface of the housing, (c) a second input interface arranged on a surface of the housing that is opposite to the first input interface, and (d) a control system configured to: (1) receive first input data from the first input interface, where the first input data corresponds to a first input action, and in response, cause a camera to perform a first operation in accordance with the first input action, and (2) receive second input data from the second input interface, where the second input data corresponds to one or more second input actions on the second input interface, and in response, cause the camera to perform a second operation. | 05-21-2015 |
20150138065 | HEAD-MOUNTED INTEGRATED INTERFACE - A head-mounted integrated interface (HMII) is presented that may include a wearable head-mounted display unit supporting two compact high-resolution screens for outputting right-eye and left-eye images in support of stereoscopic viewing, wireless communication circuits, three-dimensional positioning and motion sensors, and a processing system capable of independent software processing and/or processing streamed output from a remote server. The HMII may also include a graphics processing unit capable of also functioning as a general parallel processing system, and cameras positioned to track hand gestures. The HMII may function as an independent computing system or as an interface to remote computer systems, external GPU clusters, or subscription computational services. The HMII is also capable of linking and streaming to a remote display such as a large-screen monitor. | 05-21-2015 |
20150138066 | GAZE DETECTING APPARATUS AND METHOD - A gaze detection apparatus and method are provided. The method includes storing, by a controller, a reference gaze, a first reference vector, and a second reference vector. An image is then captured by an imaging device that includes user eyes. In addition, the controller detects central points of a pupil and an eyeball from the captured image and generates a gaze vector for one eyeball and a normal vector of a plane at which three central points are formed, when the three central points are detected. A first rotation angle and a second rotation angle are calculated by comparing the gaze vector and the normal vector, respectively, with the first reference vector and the second reference vector and a mean value thereof is calculated. Further, a gaze is then detected by rotating the reference gaze as much as the mean value. | 05-21-2015 |
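The correction step in this gaze-detection abstract — compare the gaze vector and the plane normal against their stored reference vectors, average the two rotation angles, then rotate the stored reference gaze by that mean — can be sketched in 2D. This is a simplified illustration under assumed 2D geometry; the real method works with 3D vectors, and every name below is hypothetical:

```python
import math

def angle_between(v1, v2):
    """Signed 2D angle (radians) from v1 to v2."""
    return math.atan2(v1[0] * v2[1] - v1[1] * v2[0],
                      v1[0] * v2[0] + v1[1] * v2[1])

def rotate(v, theta):
    """Rotate 2D vector v by theta radians."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * v[0] - s * v[1], s * v[0] + c * v[1])

def corrected_gaze(reference_gaze, gaze_vec, normal_vec, ref1, ref2):
    """Rotate the stored reference gaze by the mean of the two
    rotation angles, as the abstract describes (2D simplification)."""
    a1 = angle_between(ref1, gaze_vec)    # first rotation angle
    a2 = angle_between(ref2, normal_vec)  # second rotation angle
    return rotate(reference_gaze, (a1 + a2) / 2.0)
```

Averaging the two independently measured angles is a simple way to damp noise in either the pupil-based or the normal-based estimate.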
20150138067 | METHOD OF TRANSMITTING AND RECEIVING DATA, AND DISPLAY APPARATUS AND POINTING APPARATUS USING THE SAME - Provided is a data transmitting and receiving method in which a display apparatus connectable to a wireless network transmits and receives data to and from a user terminal by using a pointing apparatus. The data transmitting and receiving method includes (a) acquiring position information of a region, pointed by the pointing apparatus, in the display apparatus and (b) transmitting data, corresponding to an object displayed in the region on a screen of the display apparatus, to the user terminal by using the pointing apparatus. | 05-21-2015 |
20150138068 | Methods, Controllers and Computer Program Products for Accessibility to Computing Devices - Methods of providing user accessibility to an electronic device are provided. Methods include receiving a physical input via at least one user input device in a user interface, generating, in the user interface, a sensor output signal responsive to receiving the physical input from the user, and interpreting the sensor output signal as a gesture input signal that is received by the electronic device. The gesture input signal is operative to cause the electronic device to perform a function corresponding to a gesture that is physically applied to the electronic device. | 05-21-2015 |
20150138069 | METHODS, SYSTEMS, AND COMPUTER READABLE MEDIA FOR UNIFIED SCENE ACQUISITION AND POSE TRACKING IN A WEARABLE DISPLAY - Methods, systems, and computer readable media for unified scene acquisition and pose tracking in a wearable display are disclosed. According to one aspect, a system for unified scene acquisition and pose tracking in a wearable display includes a wearable frame configured to be worn by a user. Mounted on the frame are: at least one sensor for acquiring scene information for a real scene proximate to the user, the scene information including images and depth information; a pose tracker for estimating the user's head pose based on the acquired scene information; a rendering unit for generating a virtual reality (VR) image based on the acquired scene information and estimated head pose; and at least one display for displaying to the user a combination of the generated VR image and the scene proximate to the user. | 05-21-2015 |
20150138070 | HEAD-MOUNTED DISPLAY - A head-mounted display includes a mounting unit, a display unit, a sensor unit, and a control unit. The mounting unit is configured to be mountable on a head of a user. The display unit is provided in the mounting unit and capable of providing the user with a field of view of a real space. The sensor unit is provided in a periphery of the display unit, includes a sensor that is capable of detecting a hand of the user that is brought close to or in contact with the sensor due to an action of the user, and performs output based on a detection result of the sensor. The control unit displays an image in the field of view and changes a display state of the image, based on a change in output of the sensor unit. With this configuration, the head-mounted display enables an intuitive operation to be performed using a natural action of the user. | 05-21-2015 |
20150138071 | PROJECTOR AND METHOD OF CONTROLLING PROJECTOR - A projector includes a manipulation detecting unit that performs a manipulation detecting process on the basis of captured image data of an imaging unit, and a correction control unit that performs a distortion correcting process on the basis of the captured image data. An imaging control unit sets an image-capturing resolution of the imaging unit to resolutions different from each other between a case where the manipulation detecting unit performs the manipulation detection process and a case where the correction control unit performs the distortion correcting process. | 05-21-2015 |
20150138072 | DATA PROCESSOR - A novel human interface excellent in operability is provided. Furthermore, a novel data processor excellent in operability is provided. Furthermore, a novel data processor, a novel display device, or the like is provided. An input/output device that receives image data and supplies positional data, and an arithmetic device that supplies the image data and receives the positional data are included. The input/output device includes a first region, a second region, and a bend portion between the first region and the second region. Each of the first region and the second region includes a display portion and a positional data input portion that overlaps the display portion. The arithmetic device includes an arithmetic unit and a storage unit that stores a program to be executed by the arithmetic unit. | 05-21-2015 |
20150138073 | Text Selection Using HMD Head-Tracker and Voice-Command - A joint head tracker and voice command in a headset computer enables hands-free user text selection. The method and system enable an end-user to select sections of text without requiring a mouse cursor control input device. Embodiments include a headset computer, including a processor, configured to display text in a screen view. The computer is further configured to place a first point within the displayed text according to one or more of first head tracking information, first voice commands and first gestures, and place a second point within the displayed text according to second head tracking information, second voice commands and second gestures. The computer is further configured to select a text section between the first point and the second point in response to one or more of third head tracking information, third voice commands and third gestures, and to display the selected text section as a highlighted text section. | 05-21-2015 |
20150138074 | Head Tracking Based Gesture Control Techniques for Head Mounted Displays - A head gesture-based recognition system in Headset Computers (HSC) is disclosed. Notification dialog boxes can be acknowledged by head nodding or ticking movement in the user interface. Question dialog boxes can be answered by head nods or head shakes in the user interface. Head swiping is also a recognizable form of user input through a head tracker of the HSC. Progress indicators and other visual display feedback are utilized. | 05-21-2015 |
20150138075 | RECOGNITION DEVICE, RECOGNITION METHOD, COMPUTER PROGRAM PRODUCT, AND TERMINAL DEVICE - According to an embodiment, a recognition device includes an obtaining unit, a selection action recognizing unit, and an output unit. The obtaining unit is configured to obtain a measured value according to an action of a target for measurement. The measured value is measured by a measuring device attached to a specific body part of the target for measurement. The selection action recognizing unit is configured to, based on an acceleration of the specific body part as obtained from the measured value, recognize that a selection action has been performed for selecting any one target for operations included in a screen area. The output unit is configured to, in a state in which the selection action has been performed, output information about an operation state of the target for operations based on an amount of change in a tilt of the specific body part as obtained from the measured value. | 05-21-2015 |
20150138076 | COMMUNICATION DEVICE AND METHOD OF PROCESSING INCOMING CALL BY FACIAL IMAGE - A communication device able to manage incoming calls by reference to a user's facial expression includes a storage unit, an image capturing unit, an image recognition unit, and an incoming call processing unit. The storage unit stores preset facial images associated with predetermined processing operations. The image capturing unit captures images of the user's face when an incoming call is received, and the image recognition unit matches the captured images against the preset facial images. When at least one of the captured images matches one of the preset facial images, the incoming call processing unit executes the associated predetermined processing operation. | 05-21-2015 |
20150138077 | DISPLAY SYSTEM AND DISPLAY CONTROL DEVICE - According to an embodiment, a display system includes an operation input receiver, a recognizer, an execution controller, and a display controller. The operation input receiver receives an operation input. The recognizer recognizes one or more information processing terminals placed on the operation input receiver. The execution controller executes a command based on the operation input that is performed through the operation input receiver. The display controller displays, on the operation input receiver, an execution result of the command so as to be associated with at least any of the one or more information processing terminals. | 05-21-2015 |
20150138078 | HAND POSE RECOGNITION USING BOOSTED LOOK UP TABLES - Detection and classification of human poses and gestures using a discriminative ferns ensemble classifier is provided. Sample image data in one or more channels includes a human image. A processing device operates on the sample image data using the discriminative ferns ensemble classifier. The classifier has a set of classification tables and matching bit features (ferns), which are developed using a first set of training data and optimized by weighting the tables with an SVM linear classifier configured based on the first or a second set of pose training data. The tables allow computation of a score per pose class for the image in the sample data, and the processor outputs a determination of the pose in the sample depth image data. The determination enables the manipulation of a natural user interface. | 05-21-2015 |
20150138079 | COMPONENT DETERMINATION AND GAZE PROVOKED INTERACTION - According to the invention, a method for changing a display based at least in part on a gaze point of a user on the display is disclosed. The method may include receiving information identifying a location of the gaze point of the user on the display. The method may also include, based at least in part on the location of the gaze point, causing a virtual camera perspective to change, thereby causing content on the display associated with the virtual camera to change. | 05-21-2015 |
20150138080 | ADJUSTING MEDIA DISPLAY IN A PERSONAL DISPLAY SYSTEM BASED ON PERSPECTIVE - A personal display system with which a user may adjust the configuration of displayed media is provided. The personal display system may include an electronic device operative to provide media to a personal display device operative to display the received media. Using one or more optical and digital components, the personal display device may adjust displayed media to overlay features of a theater, thus giving the user of the personal display device the impression of being in the theater. In some embodiments, the personal display device may receive a user selection of a seat in the theater from which to watch the media, and may adjust the media display accordingly. In some embodiments, the personal display device may detect the user's movements using one or more sensors and may adjust the displayed image based on the user's movements. For example, the device may detect a user's head movement and cause the portion of media displayed to reflect the head movement. | 05-21-2015 |
20150145760 | WEARABLE DEVICE FOR WRITING AND DRAWING - An embodiment provides a method, including: detecting, via a wearable information handling device, a user motion; processing, via the wearable information handling device, the user motion into user motion data; determining, using a processor, that the user motion data is handwriting input; and converting, using the processor, the user motion data into a digital handwriting input. Other aspects are described and claimed. | 05-28-2015 |
20150145761 | INFORMATION DISPLAY SYSTEM AUTOMATICALLY ADJUSTING PROJECTION AREA AND METHOD THEREOF - The present invention discloses an information display system that automatically adjusts the projection area, and a method thereof. The method comprises the steps of: using a detection module to capture images of a driver and measure a distance between the detection module and the driver; and recognizing facial features of the driver (including a position of the driver's eyes, a rotation angle of the driver's face, and the direction the driver fixates) to estimate a required displacement of the projection area on the windshield (i.e., the region inside the visual field of the driver); | 05-28-2015 |
20150145762 | POSITION-OF-INTEREST DETECTION DEVICE, POSITION-OF-INTEREST DETECTION METHOD, AND POSITION-OF-INTEREST DETECTION PROGRAM - A distance calculation unit acquires first position information indicating a position of a part of the body of each user shown in an image captured by an imaging device. A user information analysis unit detects, based on the first position information, an operation region in which a user can operate using a part of the body, and identifies a user as an operator candidate when the operation region can be detected for that user. Among the operator candidates, the user information analysis unit identifies as the operator a user whose body part is within the operation region, and identifies the other operator candidates as users who cannot perform operations. This makes it possible to increase operability. | 05-28-2015 |
20150145763 | ELECTRONIC DEVICE - In order to improve the ease of use of an electronic device, the electronic device includes a communication unit capable of communicating with a first device, and an input unit that inputs, via the communication unit, at least one of first information about a specification of the first device and second information about use of the first device by a user. | 05-28-2015 |
20150145764 | OPTICAL PROXIMITY DETECTORS - Described herein are optical proximity detectors, methods for use therewith, and systems including an optical proximity detector. Such optical proximity detectors include an analog front-end and a digital back-end. In certain embodiments, the digital back-end includes a dynamic gain and phase offset corrector, a crosstalk corrector, a phase and magnitude calculator, and a static phase offset corrector. The dynamic gain and phase offset corrector corrects for dynamic variations in gain and phase offset of the analog front-end due to changes in temperature and/or operating voltage levels. The crosstalk corrector corrects for electrical and/or optical crosstalk associated with the analog front-end. The phase and magnitude calculator calculates phase and magnitude values in dependence on the corrected versions of digital in-phase and quadrature-phase signals received from the analog front-end. The static phase offset corrector corrects for a static phase offset of the optical proximity detector. | 05-28-2015 |
20150145765 | POSITIONING METHOD AND APPARATUS - The present invention discloses a positioning method and apparatus, which relate to the field of computer technologies and may simplify the operation of selecting a target application icon or function button, making the operation less time-consuming. According to embodiments of the present invention, a gazed area on a screen is acquired by detecting a screen-gazing operation of the user, and a projection area on the screen to be pressed by the user is acquired by detecting the user's hover gesture operation of pressing the screen. When the gazed area does not include the projection area, the user's hover gesture operation of pressing the screen is re-detected to acquire the projection area on the screen to be pressed by the user; when the gazed area includes the projection area, an application icon or a function button included in the match area is processed. | 05-28-2015 |
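The gating logic in this abstract reduces to a rectangle-containment test: the pressed (projection) area is only acted upon when it lies inside the gazed area; otherwise the hover gesture is re-detected. A minimal sketch under assumed `(x, y, width, height)` screen rectangles — all names and return labels are hypothetical:

```python
def rect_contains(outer, inner):
    """True if rectangle `outer` fully contains rectangle `inner`.
    Rectangles are (x, y, width, height) in screen pixels (hypothetical)."""
    ox, oy, ow, oh = outer
    ix, iy, iw, ih = inner
    return ox <= ix and oy <= iy and ix + iw <= ox + ow and iy + ih <= oy + oh

def select_target(gazed_area, projection_area):
    """Decide whether to activate the pressed target or re-detect the
    hover gesture, following the gating described in the abstract."""
    if rect_contains(gazed_area, projection_area):
        return "activate_target"   # process the icon/button in the match area
    return "redetect_hover"        # projection outside gaze: detect again
```

Requiring gaze agreement before honoring the hover press is what filters out accidental mid-air presses.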
20150145766 | IMAGE DISPLAY APPARATUS AND METHOD OF CONTROLLING IMAGE DISPLAY APPARATUS - A projector includes a projection unit, an external I/F, an image I/F, a connection detecting unit, a manipulation detecting unit, and an instruction information generation unit. The connection detecting unit detects connection in the external I/F and the image I/F. The manipulation detecting unit causes the projection unit to project a detecting image when connection is detected and detects a manipulation performed with respect to the detecting image. The instruction information generation unit executes processing which is based on the detected manipulation. | 05-28-2015 |
20150145767 | ELECTRONIC DEVICE AND DISPLAY METHOD - There is provided an electronic device including a first display and a second display, including: a display controller configured to display a first image on the second display and display a second image and the first image superimposed on the second image on the first display; an input receiver configured to receive an operation input with respect to the first image displayed on the first display; and a processor configured to execute a first processing routine based on the operation input received by the input receiver. | 05-28-2015 |
20150145768 | METHOD AND DEVICE FOR CONTROLLING AN APPARATUS USING SEVERAL DISTANCE SENSORS - A method for controlling an apparatus includes the steps of: determining distance measurements of an object in a first direction, using distance sensors defining between them a second direction different from the first direction; assessing a first inclination of the object relative to the second direction based on the distance measurements; and determining a first command for the apparatus according to the inclination assessment. | 05-28-2015 |
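With two sensors measuring distance along the first direction and spaced along the second, the object's inclination follows from the arctangent of the distance difference over the sensor spacing. A minimal sketch of that geometry — the dead-zone value and command labels are hypothetical illustrations, not from the patent:

```python
import math

def inclination_deg(d1, d2, sensor_spacing):
    """Inclination of the object relative to the sensor baseline,
    from two distance readings in the same units."""
    return math.degrees(math.atan2(d1 - d2, sensor_spacing))

def command_from_inclination(angle_deg, dead_zone=5.0):
    """Map the assessed inclination to a coarse command (hypothetical
    mapping; a real device would define its own command set)."""
    if angle_deg > dead_zone:
        return "tilt_right"
    if angle_deg < -dead_zone:
        return "tilt_left"
    return "neutral"
```

The dead zone keeps sensor noise near level from producing spurious commands.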
20150145769 | POWER MANAGEMENT IN AN EYE-TRACKING SYSTEM - An imaging device adapted to provide eye-tracking data by imaging at least one eye of a viewer, wherein: the imaging device is switchable between at least an active mode and a ready mode; the imaging device is configured to use active eye illumination in the active mode, which enables tracking of a corneal reflection; and the imaging device is configured, in the ready mode, to reduce the illumination intensity below the value it has in the active mode, and to provide eye-tracking data that include eye position but not eye orientation. | 05-28-2015 |
20150145770 | INFORMATION PROCESSING DEVICE, PORTABLE DEVICE AND INFORMATION PROCESSING SYSTEM - To take security into account and increase user friendliness, an information processing device includes: an input unit to which information is input; an extracting unit extracting predetermined words from the information input to the input unit; a classifying unit classifying the words extracted by the extracting unit into first words and second words; and a converting unit converting the first words by a first conversion method and converting the second words by a second conversion method, the second conversion method being different from the first conversion method. | 05-28-2015 |
20150145771 | Transparent Display Field of View Region Determination - A method and system for determining a field-of-view region on a transparent display is provided. The method includes receiving a user image from a user-facing device. Key features of the user image are identified and their attributes are analyzed. An object image of objects is received and a first object is identified. Key features of the first object are identified and their attributes are analyzed. A specified position on the transparent display for displaying a first image associated with the first object is determined, and the first image is displayed at the specified position on the transparent display. | 05-28-2015 |
20150290031 | TOUCHLESS USER INTERFACE FOR OPHTHALMIC DEVICES - An ophthalmic apparatus for laser eye surgery comprising a command recognition unit configured for detecting and recognizing a gesture command and/or voice command of an operator of the ophthalmic apparatus, at least one controlled unit configured for receiving a control signal and configured for changing a state based on the received control signal, and a controller configured for generating a control signal and transmitting the control signal to the at least one controlled unit based on the recognized gesture command and/or voice command. | 10-15-2015 |
20150293585 | SYSTEM AND METHOD FOR CONTROLLING HEADS UP DISPLAY FOR VEHICLE - A technique for controlling a heads-up display (HUD) for a vehicle, capable of directly controlling contents using gaze information received from a camera for gaze tracking and the coordinates of a recognizable specific object such as a hand. The technique includes: tracking a driver's gaze using a camera; when the gaze stares at the HUD, detecting a gaze vector based on the gaze tracking; after detecting the gaze vector, detecting a hand between the camera and the driver's gaze and tracking coordinates of a tip of the hand; matching a final coordinate of the driver's gaze staring at the HUD with a final coordinate of the tip of the hand; and controlling the HUD using the driver's hand. | 10-15-2015 |
20150293593 | IDENTIFYING MOVEMENTS USING A MOTION SENSING DEVICE COUPLED WITH AN ASSOCIATIVE MEMORY - A kinematic system. The system includes a kinematic measurement device having one or more sensors configured to detect a plurality of physical positions of a part of an object. The system also includes an associative memory, in communication with the kinematic measurement device, and comprising a plurality of data and a plurality of associations among the plurality of data, wherein the plurality of data is collected into associated groups, wherein the associative memory is configured to be queried based on at least indirect relationships among the plurality of data. The system also includes a processor, in communication with the associative memory and the kinematic measurement device, and configured to translate a range of coordinate positions of the part of the object to a qualitative description that names the range, wherein the processor is further configured to provide the qualitative description to the associative memory for storage. | 10-15-2015 |
20150293595 | IMAGE DISPLAY DEVICE AND METHOD FOR CONTROLLING SAME - The present invention provides an image display device comprising: a sensing unit for sensing an input gesture of a user; a display unit for outputting visual information among the executed data of an application when the application is executed; a collection unit for collecting control gesture information included in the executed data; and a control unit for executing an event of the application which is included in the executed data and corresponds to the control gesture information when the control gesture information matches the input gesture sensed by the sensing unit while the application is executed. | 10-15-2015 |
20150293598 | METHOD FOR PROCESSING INFORMATION AND ELECTRONIC DEVICE - A method for processing information and an electronic device are provided. The method is applied to a first electronic device having a first sensing unit. The method includes: detecting, by the first electronic device, a first operation of a user via the first sensing unit, and obtaining a first parameter characterizing the first operation; receiving a second parameter, characterizing the first operation, sent by a second electronic device, wherein the second parameter is obtained by a second sensing unit of the second electronic device detecting the first operation, and wherein a detection dimension of the first operation detected by the first electronic device is different from that of the first operation detected by the second electronic device; determining and generating a first instruction based on the first parameter and the second parameter; and executing the first instruction. | 10-15-2015 |
20150293600 | DEPTH-BASED ANALYSIS OF PHYSICAL WORKSPACES - Systems and methods are provided for dynamically performing depth-based analysis of a physical workspace. The system includes a depth camera to generate three dimensional (3D) images, and a controller. The controller is able to acquire a stream of 3D images, calculate distances between the depth camera and objects represented by 3D pixels within the 3D images, identify an increase in distance between the objects and the depth camera, detect a pause, and define a reference surface during the pause. The controller is also able to identify a change in distance between the objects and the depth camera after defining the reference surface, to identify a segment of the current 3D image close to the depth camera, and to determine a gesture location within the current 3D image based on the identified segment. A data set corresponding to the gesture location then adjusts an output of a display. | 10-15-2015 |
20150293610 | OPTICAL NAVIGATION DEVICE AND FAILURE IDENTIFICATION METHOD THEREOF - There is provided a failure identification method of an optical navigation device including the steps of: constructing a fixed noise map according to image frames captured by an image sensor; calculating a feature value of the fixed noise map; identifying whether the fixed noise map is uniform or not according to the feature value; and generating an alert signal when the fixed noise map is non-uniform for indicating failure of the optical navigation device. | 10-15-2015 |
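The four steps listed in this abstract (build a fixed noise map from captured frames, compute a feature value, test the map for uniformity, raise an alert) map naturally onto array code. The sketch below is an illustrative reading only: averaging frames to isolate fixed-pattern noise, using the standard deviation as the feature value, and the threshold of 5.0 are all assumptions, not details taken from the application.

```python
import numpy as np

def fixed_noise_map(frames):
    """Average a stack of captured frames so moving scene content
    cancels and the sensor's fixed-pattern noise remains."""
    return np.mean(np.stack(frames), axis=0)

def feature_value(noise_map):
    """Standard deviation of the map, used here as the uniformity feature."""
    return float(np.std(noise_map))

def check_device(frames, threshold=5.0):
    """True (alert) when the fixed noise map is non-uniform,
    indicating a likely failure of the optical navigation device."""
    return feature_value(fixed_noise_map(frames)) > threshold

rng = np.random.default_rng(0)
# Healthy sensor: frames differ only by zero-mean noise.
good = [100 + rng.normal(0, 1, (8, 8)) for _ in range(50)]
# Failing sensor: a stuck column biases every frame the same way.
bad = [f + np.where(np.arange(8) == 3, 40.0, 0.0) for f in good]
print(check_device(good), check_device(bad))
```

Averaging many frames suppresses scene content and random noise, so any structure that survives in the map is fixed to the sensor itself, which is why a simple dispersion statistic suffices as the uniformity feature here.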
20150293611 | DISPLAY MODULE WITH INTEGRATED PROXIMITY SENSOR - A display module includes an electronic display screen and a sensor operable to detect radiation reflected to a surface of the electronic display screen. A method of detecting an object is also disclosed. | 10-15-2015 |
20150301574 | WEARABLE DEVICE, MASTER DEVICE OPERATING WITH THE WEARABLE DEVICE, AND CONTROL METHOD FOR WEARABLE DEVICE - Provided is a wearable device that is wearable on a body of a user, the wearable device including a display configured to display a screen; a sensor configured to sense a motion of the user; and a controller configured to control the display based on the sensed motion. | 10-22-2015 |
20150301590 | SYSTEM AND METHOD FOR PRODUCING COMPUTER CONTROL SIGNALS FROM BREATH ATTRIBUTES - A method for computing output using a non-contact (invisible) input signal includes acquiring depth data of a scene captured by a depth-capable sensor. The method includes generating a temporal series of depth maps corresponding to the depth data. The method includes generating at least one volumetric attribute from the depth data. The method includes generating an output based on the volumetric attribute to control actions. | 10-22-2015 |
20150301591 | METHOD FOR INPUTTING A CONTROL COMMAND FOR A COMPONENT OF A MOTOR VEHICLE - A method inputs a control command for a component of a motor vehicle. The method involves generating an image sequence of an input object guided by a user in a specified detection region using an imaging device, detecting a change in position of the input object on the basis of the image sequence, and generating a control command for the component of the motor vehicle on the basis of the detected change in position. The imaging device employs at least one infrared-sensitive camera, and the detection region is illuminated using at least one infrared source. | 10-22-2015 |
20150301592 | UTILIZING TOTEMS FOR AUGMENTED OR VIRTUAL REALITY SYSTEMS - An augmented reality display system comprises passable world model data comprising a set of map points corresponding to one or more objects of the real world. The augmented reality system also comprises a processor to communicate with one or more individual augmented reality display systems to pass a portion of the passable world model data to the one or more individual augmented reality display systems, wherein the portion of the passable world model data is passed based at least in part on respective locations corresponding to the one or more individual augmented reality display systems. | 10-22-2015 |
20150301593 | EYE IMAGING IN HEAD WORN COMPUTING - Aspects of the present invention relate to methods and systems for imaging, recognizing, and tracking the eye of a user who is wearing a HWC. Aspects further relate to the processing of images reflected from the user's eye and controlling displayed content in accordance therewith. Aspects further relate to determining health conditions of the user based on eye imaging technologies. | 10-22-2015 |
20150301594 | IMAGE DISPLAY DEVICE AND INFORMATION INPUT DEVICE - According to an illustrative embodiment, an information processing device is provided. The information processing device includes at least one electrode configured to generate at least one signal related to eye blinking of a subject; and a processing circuit configured to receive the at least one signal and detect eye blinking of the subject based on the at least one signal. | 10-22-2015 |
20150301597 | CALCULATION OF AN ANALYTICAL TRAIL IN BEHAVIORAL RESEARCH - Exemplary embodiments provide methods, mediums, and systems for behavioral research. In some embodiments, a simulated environment may be created and displayed. A user may interact with the simulated environment by directing the user's gaze towards different objects in the simulated environment. One or more gaze fields may be calculated to determine which objects the user is viewing. A score may be calculated for the objects in the simulated environment, and the score may be used to display an analytical trail. The score may be dependent on both a first look at an object, in which the user first directs their gaze toward the object, and one or more second looks at the object, in which the user looks away from the object and then returns their gaze to the object. In determining the score, the second looks may be given more weight than the first look. | 10-22-2015 |
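The scoring rule this abstract describes, in which return looks at an object count for more than the first look, can be illustrated with a toy function. The linear weighting by look duration and the specific weights are assumptions made for the sketch, not values from the application.

```python
def object_score(looks, first_weight=1.0, return_weight=2.0):
    """Score a list of look durations (seconds) on one object.
    The first look counts once; each later return look ("second
    look") is weighted more heavily, per the abstract's rule."""
    if not looks:
        return 0.0
    return first_weight * looks[0] + return_weight * sum(looks[1:])

# A user glances at a shelf item for 1.0 s, looks away, then
# returns twice for 0.5 s each: 1.0*1.0 + 2.0*(0.5 + 0.5) = 3.0
print(object_score([1.0, 0.5, 0.5]))
```

Weighting returns more heavily encodes the behavioral-research intuition that coming back to an object signals stronger interest than an initial, possibly incidental, glance.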
20150301599 | EYE TRACKING SYSTEMS AND METHOD FOR AUGMENTED OR VIRTUAL REALITY - An augmented reality display system comprises passable world model data comprising a set of map points corresponding to one or more objects of the real world. The augmented reality system also comprises a processor to communicate with one or more individual augmented reality display systems to pass a portion of the passable world model data to the one or more individual augmented reality display systems, wherein the portion of the passable world model data is passed based at least in part on respective locations corresponding to the one or more individual augmented reality display systems. | 10-22-2015 |
20150301601 | OPERATING DEVICE FOR CONTROLLING A MEDICAL APPARATUS - An operating device ( | 10-22-2015 |
20150301602 | Multi Axis Vibration Unit In Device For Tactile Feedback - A device receives a notification. The notification comprises a first response or a second response. Based on the notification being a first response, the device vibrates along a first direction. Based on the notification being a second response, the device vibrates along a second direction. The device comprises a first motor and a second motor. The first motor is capable of vibrating the device along a first axis. The second motor is capable of moving the device along a second axis. | 10-22-2015 |
20150301606 | TECHNIQUES FOR IMPROVED WEARABLE COMPUTING DEVICE GESTURE BASED INTERACTIONS - Techniques for improved wearable computing device gesture based interactions are described. For example, an apparatus may comprise a band comprising one or more sensors arranged around a circumference of the band to monitor muscle activity and logic, at least a portion of which is in hardware, the logic to detect changes in muscle activity based on signals received from one or more of the one or more sensors and to interpret the detected changes in muscle activity as one or more gestures to control the apparatus. Other embodiments are described and claimed. | 10-22-2015 |
20150301608 | METHOD OF PROVIDING USER INTERACTION WITH A WEARABLE DEVICE AND WEARABLE DEVICE THEREOF - A wearable device and a method of providing user interaction with the wearable device are provided. The method includes receiving one or more signals from at least one of one or more pressure sensors and one or more vibration sensors, obtaining information related to at least one of an orientation of the wearable device on a user hand and the hand on which the wearable device is worn based on the one or more signals received, and performing one or more functions based on the obtained information. | 10-22-2015 |
20150301611 | OBJECT AND MOVEMENT DETECTION - Motions, positions or configurations of, for example, a human hand can be recognised by transmitting a plurality of transmit signals in respective time frames; receiving a plurality of receive signals; determining a plurality of channel impulse responses using the transmit and receive signals; defining a matrix of impulse responses, with impulse responses for adjacent time frames adjacent each other; and analysing the matrix for patterns corresponding to the motion, position or configuration. | 10-22-2015 |
20150301612 | IMAGE PROCESSING DEVICE AND IMAGE DISPLAY DEVICE - The input device and the input system include: a hand detection means which detects the position of the hand; a body part detection means which detects the positions of the user's body parts, such as the face; a relative position calculation means which calculates a relative position of the hand with respect to the body parts from hand position information, being the detection result of the hand detection means, and body part position information, being the detection result of the body part detection means; and a gesture recognition means which recognizes a hand gesture on the basis of a change in the hand position information. When a hand gesture is recognized, the operation of the input device with respect to the user's gesture is changed according to the relative position of the hand with respect to the body parts. | 10-22-2015 |
20150301615 | IMPACT AND CONTACTLESS GESTURE INPUTS FOR DOCKING STATIONS - A docking station configured to mate to an electronic device enables methods of interacting with the electronic device by impacting (e.g., knocking) on a table on which the device and/or the docking station are disposed and by means of contactless gestures. The electronic device may remain in a powered off state while the docking station continuously monitors for user input. The docking station may have a processor that is capable of detecting a user's impact and contactless gesture inputs. | 10-22-2015 |
20150301619 | WEARABLE WIRELESS TONGUE CONTROLLED DEVICES - A wearable device and a system to provide an input for a computing device are disclosed. The device comprises a sensing unit to deliver infrared signals to the facial region of a user and to receive transmitted or reflected signals therefrom, and a processing unit to determine the position or movement of the tongue of the user based on the received infrared signals. The processing unit is configured to provide an input to a computing device based on the determined position or movement of the tongue. The system further comprises a transmitter for wirelessly transmitting the input from the processing unit to the computing device. | 10-22-2015 |
20150301629 | Method and Apparatus for Presenting Information On a Mouse Pad - A system that incorporates the subject disclosure may include, for example, a method that determines a pattern for a mouse pad, wherein the mouse pad comprises an electronic paper display and a processor, and transmits the pattern to the mouse pad to enable the processor to present the pattern on the electronic paper display. Additional embodiments are disclosed. | 10-22-2015 |
20150301672 | DISPLAY APPARATUS AND METHOD OF CONTROLLING THE SAME - A flexible display apparatus includes: a flexible display panel which operates in a plane mode or a curved mode; a sensor unit which measures a direction, a location or a degree of a curvature of the flexible display panel; and a brightness control unit which controls brightness of an area of the flexible display panel based on the direction, the location or the degree of the curvature of the flexible display panel. | 10-22-2015 |
20150301677 | Device management system and method - Provided is a device management system and method. The system includes an information collection module, a core processing module and a management module; the information collection module detects in real time one or more operations of a user, generates operation information of the user based on detected operations of the user, and transmits the operation information of the user to the core processing module; the core processing module parses and condenses the operation information of the user into small models, combines certain small models into a big model representing a habitual behavior of the user, and stores the big model; when a current location and time match those stored in the device management system, the management module manages the devices to which the habitual behavior applies so as to automatically reproduce the habitual behavior. | 10-22-2015 |
20150301797 | SYSTEMS AND METHODS FOR RENDERING USER INTERFACES FOR AUGMENTED OR VIRTUAL REALITY - An augmented reality display system comprises passable world model data comprising a set of map points corresponding to one or more objects of the real world. The augmented reality system also comprises a processor to communicate with one or more individual augmented reality display systems to pass a portion of the passable world model data to the one or more individual augmented reality display systems, wherein the portion of the passable world model data is passed based at least in part on respective locations corresponding to the one or more individual augmented reality display systems. | 10-22-2015 |
20150309561 | STRENGTHENING PREDICTION CONFIDENCE AND COMMAND PRIORITY USING NATURAL USER INTERFACE (NUI) INPUTS - An embodiment provides a method, including: receiving, at a device having at least one input device, a plurality of user inputs within a predetermined time; determining, using a processor of the device, a collective intent based on the plurality of user inputs; said determining comprising mapping at least two of the plurality of user inputs to a common command; and committing, using a processor of the device, an action according to the common command. Other aspects are described and claimed. | 10-29-2015 |
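The claimed mapping of several near-simultaneous NUI inputs onto a common command, then committing an action on that command, can be mocked up as a simple voting scheme. The modality names, the command map, and the two-vote rule are all hypothetical choices for this sketch.

```python
def collective_intent(inputs, window=1.0):
    """Minimal sketch: map each modality's raw input to a candidate
    command and commit the command that at least two inputs within
    the time window agree on (the "collective intent")."""
    # Hypothetical mapping from (modality, raw value) to commands.
    COMMAND_MAP = {
        ("voice", "open that"): "open",
        ("gesture", "point_screen"): "open",
        ("gaze", "icon_focus"): "select",
    }
    votes = {}
    t0 = inputs[0][0]
    for t, modality, value in inputs:
        if t - t0 <= window:
            cmd = COMMAND_MAP.get((modality, value))
            if cmd:
                votes[cmd] = votes.get(cmd, 0) + 1
    common = [c for c, n in votes.items() if n >= 2]
    return common[0] if common else None

inputs = [(0.0, "voice", "open that"),
          (0.4, "gesture", "point_screen"),
          (0.6, "gaze", "icon_focus")]
print(collective_intent(inputs))
```

Requiring agreement between two modalities is one way to realize the "strengthening prediction confidence" idea: a single ambiguous input never commits an action by itself.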
20150309566 | GAZE TRACKING SYSTEM - A gaze tracking system is provided with a user interface that is configured to display content and a camera that is configured to provide a signal indicative of an image of a user viewing the content. The gaze tracking system also includes a controller that communicates with the user interface and the camera. The controller is configured to determine a user interest level in the content displayed on the user interface based on movement of the user over time and to provide updated content to the user interface based on the user interest level. | 10-29-2015 |
20150309569 | USER INTERFACE CONTROL USING GAZE TRACKING - Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for identifying a direction in which a user is looking. In one aspect, a method includes receiving an image of a sequence of images. The image can depict a face of a user. A template image for each particular facial feature point can be compared to one or more image portions of the image. The template image for the particular facial feature point can include a portion of a previous image of the sequence of images that depicted the facial feature point. Based on the comparison, a matching image portion of the image that matches the template image for the particular facial feature point is identified. A location of the matching image portion is identified in the image. A direction in which the user is looking is determined based on the identified location for each template image. | 10-29-2015 |
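The per-feature matching step described here (compare a template cut from the previous frame against portions of the current frame and keep the best-matching location) can be sketched as a brute-force sum-of-squared-differences search. This is illustrative only; a real tracker would restrict the search window around the previous location or use an optimized routine such as OpenCV's `matchTemplate`.

```python
import numpy as np

def match_template(image, template):
    """Return (row, col) of the top-left corner of the image patch
    with the smallest sum of squared differences to the template."""
    ih, iw = image.shape
    th, tw = template.shape
    best, best_pos = None, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            patch = image[r:r + th, c:c + tw]
            ssd = float(np.sum((patch - template) ** 2))
            if best is None or ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

# The previous frame supplies the template (e.g. an eye-corner
# patch); in the current frame the feature has shifted.
prev = np.zeros((12, 12))
prev[4:7, 4:7] = 1.0          # feature at (4, 4) in the old frame
template = prev[4:7, 4:7]
curr = np.zeros((12, 12))
curr[6:9, 7:10] = 1.0         # feature moved to (6, 7)
print(match_template(curr, template))
```

Tracking each facial feature point this way, frame to frame, yields the set of matched locations from which the abstract's gaze direction is then computed.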
20150309579 | LOW-LATENCY GESTURE DETECTION - Low-latency gesture detection is described, for example, to compute a gesture class from a live stream of image frames of a user making a gesture, for example, as part of a natural user interface controlling a game system or other system. In examples, machine learning components are trained to learn gesture primitives and at test time, are able to detect gestures using the learned primitives, in a fast, accurate manner. For example, a gesture primitive is a latent (unobserved) variable describing features of a subset of frames from a sequence of frames depicting a gesture. For example, the subset of frames has many fewer frames than a sequence of frames depicting a complete gesture. In various examples gesture primitives are learnt from instance level features computed by aggregating frame level features to capture temporal structure. In examples frame level features comprise body position and body part articulation state features. | 10-29-2015 |
20150309580 | METHOD AND COMPUTING UNIT FOR FACILITATING INTERACTIONS OF A GROUP OF USERS WITH GESTURE-BASED APPLICATION - Embodiments of the present disclosure provide a method for facilitating interactions, with a gesture-based application, of a group of users. The method comprises identifying, by a computing unit of an interactive device, the group based on information received from sensors associated with the interactive device. Then, an interaction intensity value associated with each of the users is determined. The interaction intensity value is indicative of the level of activity of each of the users. Next, at least one active user among the group of users is identified based on an order of the interaction intensity values. Lastly, gestures of the at least one active user towards the gesture-based application are tracked for facilitating interactions with the gesture-based application. | 10-29-2015 |
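The "order of interaction intensity values" step reduces to ranking users and taking the top of the list. A minimal sketch, with the intensity values given as a plain dictionary (how they are measured is outside this illustration):

```python
def active_users(intensities, k=1):
    """Order users by interaction intensity (highest first) and
    return the top-k as the active users whose gestures the
    application will track."""
    ranked = sorted(intensities.items(), key=lambda kv: kv[1], reverse=True)
    return [user for user, _ in ranked[:k]]

# Bob is gesturing most vigorously, Carol somewhat, Alice barely.
print(active_users({"alice": 0.2, "bob": 0.9, "carol": 0.5}, k=2))
```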
20150309581 | CROSS-USER HAND TRACKING AND SHAPE RECOGNITION USER INTERFACE - Embodiments include vision-based interfaces performing hand or object tracking and shape recognition. The vision-based interface receives data from a sensor, and the data corresponds to an object detected by the sensor. The interface generates images from each frame of the data, and the images represent numerous resolutions. The interface detects blobs in the images and tracks the object by associating the blobs with tracks of the object. The interface detects a pose of the object by classifying each blob as corresponding to one of a number of object shapes. The interface controls a gestural interface in response to the pose and the tracks. | 10-29-2015 |
20150309582 | GESTURE OPERATED WRIST MOUNTED CAMERA SYSTEM - A system and method for capturing media are disclosed. In a first aspect, the system comprises a wristband device that includes at least one sensor and a camera coupled to the wristband device. The camera is controlled by at least one gesture determined using the at least one sensor. In a second aspect, the method comprises providing a wristband device that includes at least one sensor, coupling a camera to the wristband device, determining at least one gesture using the at least one sensor, and controlling the camera by using the at least one gesture. | 10-29-2015 |
20150309583 | MOTION RECOGNIZING METHOD THROUGH MOTION PREDICTION - A motion recognition method is disclosed. The method includes the steps of: detecting a user's motion in real time by using a motion recognition sensor; predicting a motion pattern to be drawn by the detected motion by comparing the detected motion with pre-set pattern information, wherein the motion pattern includes information on a type of a figure to be drawn by the detected motion and an anticipated time of the detected motion to be completed; and executing a control function corresponding to the predicted type of figure at the time when the anticipated time lapses. | 10-29-2015 |
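The prediction step this abstract describes (match the motion detected so far against pre-set pattern information, then report the figure type and the anticipated completion time) can be sketched by comparing a partial trajectory against stored pattern prefixes. The pattern library, the fixed sample period, and prefix matching by squared error are all assumptions for illustration.

```python
import numpy as np

# Hypothetical pre-set pattern library: each figure is a sequence
# of 2D sample points tracing the shape.
PATTERNS = {
    "circle": np.array([(np.cos(t), np.sin(t))
                        for t in np.linspace(0, 2 * np.pi, 32)]),
    "line":   np.array([(t, t) for t in np.linspace(0, 1.4, 32)]),
}

def predict_pattern(partial, sample_period=0.02):
    """Match a partial trajectory against each stored pattern prefix;
    return (figure type, anticipated seconds until completion)."""
    n = len(partial)
    best, best_err = None, float("inf")
    for name, pts in PATTERNS.items():
        err = float(np.sum((pts[:n] - partial) ** 2))
        if err < best_err:
            best, best_err = name, err
    remaining = (len(PATTERNS[best]) - n) * sample_period
    return best, remaining

# Only the first quarter of a circle has been drawn so far.
partial = PATTERNS["circle"][:8]
print(predict_pattern(partial))
```

Once the figure and the remaining time are predicted, the control function bound to that figure can be scheduled to fire when the anticipated time lapses, as the abstract's final step describes.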
20150309585 | GESTURE DETECTION AND COMPACT REPRESENTATION THEREOF - Techniques are described that may be implemented with an electronic device to detect a gesture within a field of view of a sensor and generate a compact data representation of the detected gesture. In implementations, a sensor is configured to detect a gesture and provide a signal in response thereto. An estimator, which is in communication with the sensor, is configured to generate an elliptical representation of the gesture. Multiple coefficients for the compact representation of the gesture can be used to define the ellipse representing the gesture. | 10-29-2015 |
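The idea of defining an ellipse by a handful of coefficients can be sketched with a direct least-squares conic fit to the gesture's sample points. This is one standard way to obtain such coefficients, assumed here for illustration rather than taken from the application.

```python
import numpy as np

def ellipse_coefficients(points):
    """Least-squares fit of the conic a*x^2 + b*x*y + c*y^2 + d*x +
    e*y = 1 to the gesture sample points; the five coefficients form
    the compact representation of the gesture."""
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([x * x, x * y, y * y, x, y])
    coef, *_ = np.linalg.lstsq(A, np.ones(len(points)), rcond=None)
    return coef

# A circular "stir" gesture of radius 2 centred on the origin:
t = np.linspace(0, 2 * np.pi, 50, endpoint=False)
pts = np.column_stack([2 * np.cos(t), 2 * np.sin(t)])
a, b, c, d, e = ellipse_coefficients(pts)
print(np.round([a, b, c, d, e], 3))
```

For this circular gesture the fit recovers a ≈ c ≈ 0.25 with b, d, e ≈ 0, i.e. the conic x²/4 + y²/4 = 1, so the whole gesture is summarized by five numbers instead of fifty points.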
20150309681 | Depth-based mode switching for touchless gestural interfaces - Disclosed are techniques for detecting a gesture performed at a first distance and at a second distance. A first aspect of a target may be manipulated according to the first gesture at the first distance and a second aspect of the target may be manipulated according to the first gesture at the second distance. | 10-29-2015 |
20150310670 | System and Method for Generating a Light Pattern for Object Illumination - A method and system for generating a light pattern. The system may include: a light source providing a diverging light beam; and a single lens element having a first surface with a positive optical power in at least one cross section and a second surface. The second surface is configured to provide a multiplication function of the light beam in that cross section and a predefined intensity light distribution generator in a second cross section. | 10-29-2015 |
20150310790 | DISPLAY DEVICE - Variance in luminance deterioration in a display is suppressed. A display device in which a self-illuminating display element is used to display information comprises a display component in which are disposed a plurality of pixels made up of a plurality of colors of sub-pixels, and a controller for controlling the drive of the display component. The controller records an accumulated luminance for each sub-pixel, and calculates a luminance adjustment amount for each sub-pixel based on the difference between the accumulated luminance values. The controller also detects when no one is viewing the display device, produces a corrected image according to the luminance adjustment amount, and displays the corrected image on the display component while no one is viewing the display device. | 10-29-2015 |
20150316957 | DISPLAY PANEL, METHOD FOR DESIGNING DISPLAY PANEL AND RECORDING MEDIUM THEREOF - A method for designing a display panel. The method includes calculating, for each of a plurality of viewing locations, a difference value between a right side viewing angle of that viewing location and a left side viewing angle of that viewing location, wherein the calculating is performed for each of a plurality of curvatures, and setting, from among the plurality of curvatures, a curvature of the curved display panel based on the difference values. Accordingly, a curved display panel which provides a larger viewing angle than a flat display panel, increases immersion, and decreases viewing-angle distortion may be realized. | 11-05-2015 |
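The curvature-selection loop can be mocked up with a toy geometric model. Everything below is an assumption made for illustration: the panel is approximated as a parabola z = k·x² (so the edges curve toward the viewer), a "viewing angle" is measured between the sight line to a panel edge and the local surface normal, and the candidate curvature with the smallest summed left/right difference over all viewing locations is selected.

```python
import math

def viewing_angle(viewer, edge_x, curvature):
    """Angle (radians) between the sight line from a panel edge to
    the viewer and the edge's surface normal, for a parabolic panel
    z = curvature * x^2 facing a viewer at positive z."""
    vx, vz = viewer
    ez = curvature * edge_x ** 2               # edge pulled toward viewer
    sight = math.atan2(vx - edge_x, vz - ez)   # edge-to-viewer direction
    normal = math.atan2(-2 * curvature * edge_x, 1.0)  # normal tilts inward
    return abs(sight - normal)

def best_curvature(viewers, candidates, half_width=0.6):
    """Pick the candidate curvature minimizing the summed left/right
    viewing-angle difference over all viewing locations."""
    def total_diff(k):
        return sum(abs(viewing_angle(v, +half_width, k)
                       - viewing_angle(v, -half_width, k))
                   for v in viewers)
    return min(candidates, key=total_diff)

# Three seats 2 m from a 1.2 m-wide panel: centre, left, right.
viewers = [(-0.5, 2.0), (0.0, 2.0), (0.5, 2.0)]
print(best_curvature(viewers, [0.0, 0.1, 0.2, 0.3]))
```

In this toy model the off-centre seats see very unequal edge angles on a flat panel, and a moderate curvature equalizes them, which is the effect the design method exploits.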
20150316961 | INFORMATION PROCESSING METHOD AND DEFORMABLE ELECTRONIC DEVICE - The present disclosure discloses an information processing method and a deformable electronic device. The information processing method is applied in a deformable electronic device. The method comprises: acquiring a first deformation parameter corresponding to deformation of the electronic device when the first content is displayed on the display unit provided in the deformable electronic device; determining a control instruction based on the first deformation parameter; and displaying a second content on the display unit based on the control instruction, the first content being different from the second content. | 11-05-2015 |
20150316981 | GAZE CALIBRATION - Calibration of gaze tracking equipment is described, for example, in a desktop computing scenario. In various examples, an explicit calibration phase is carried out, optionally followed by an implicit calibration phase. In examples, the explicit calibration phase comprises requesting and receiving user manual input events associated with specified locations and measuring gaze associated with the manual input events. In examples, the implicit calibration phase is carried out without disturbing other activity of a user in the desktop computing environment, such as operating a graphical user interface. In various examples calibration data is stored in a plurality of buffers and used to control switching between explicit and implicit calibration phases. | 11-05-2015 |
20150316982 | UTILIZING PSEUDO-RANDOM PATTERNS FOR EYE TRACKING IN AUGMENTED OR VIRTUAL REALITY SYSTEMS - An augmented reality display system comprises passable world model data comprising a set of map points corresponding to one or more objects of the real world. The augmented reality system also comprises a processor to communicate with one or more individual augmented reality display systems to pass a portion of the passable world model data to the one or more individual augmented reality display systems, wherein the portion of the passable world model data is passed based at least in part on respective locations corresponding to the one or more individual augmented reality display systems. | 11-05-2015 |
20150316984 | WEARABLE DEVICE AND METHOD OF OPERATING THE SAME - Provided are a wearable device and a method of operating the same. The wearable device includes: a display configured to display content; a user input unit configured to receive a command of a wearer; and a controller configured to: when the content is reproduced in a state in which the wearable device is worn, output, through the display and based on content reproduction related information of the wearable device, information informing the wearer that stopping reproduction of the content is requestable, and control to stop the reproduction of the content in response to a reproduction stop request for the content, received through the user input unit in response to the output information, wherein the content reproduction related information corresponds to information influential on a health of the wearer when the content is reproduced in the state in which the wearable device is worn. | 11-05-2015 |
20150316985 | Systems and Methods for Viewport-Based Augmented Reality Haptic Effects - One illustrative system disclosed herein includes a display configured to receive a display signal and output an image, and an image capture device configured to capture an area image and transmit an image signal. The illustrative system also includes a processor in communication with the image capture device and the display, the processor configured to: receive the image signal; determine a virtual object based in part on the image signal; determine the display signal based in part on the image signal, wherein the display signal includes data associated with the virtual object; determine a haptic effect based at least in part on the virtual object; and transmit a haptic signal associated with the haptic effect. The illustrative system further includes a haptic output device configured to receive the haptic signal and output the haptic effect. | 11-05-2015 |
20150316992 | OBJECT INFORMATION DERIVED FROM OBJECT IMAGES - Search terms are derived automatically from images captured by a camera equipped cell phone, PDA, or other image capturing device, submitted to a search engine to obtain information of interest, and at least a portion of the resulting information is transmitted back locally to, or nearby, the device that captured the image. | 11-05-2015 |
20150316993 | FAST FINGERTIP DETECTION FOR INITIALIZING A VISION-BASED HAND TRACKER - Systems and methods for initializing real-time, vision-based hand tracking systems are described. The systems and methods for initializing the vision-based hand tracking systems image a body and receive gesture data that is absolute three-space data of an instantaneous state of the body at a point in time and space, and at least one of determine an orientation of the body using an appendage of the body and track the body using at least one of the orientation and the gesture data. | 11-05-2015 |
20150316995 | ELECTRONIC DEVICE AND RECORDING MEDIUM - To provide an electronic device that displays an object or conducts a program in accordance with the movement of a display screen. The electronic device includes a display module that displays an object on a display screen; acceleration sensors that acquire data on the movement of the display screen; and a computing portion that performs a computation on the movement of the object. The display screen is provided over a flexible substrate, and the acceleration sensors are provided in corresponding regions into which the display screen is divided. Display is performed such that an object displayed in any one of the regions moves in accordance with data obtained by the acceleration sensor. | 11-05-2015 |
20150316996 | SYSTEMS AND METHODS FOR REMAPPING THREE-DIMENSIONAL GESTURES ONTO A FINITE-SIZE TWO-DIMENSIONAL SURFACE - A method for operating a real-time gesture based interactive system includes: obtaining a sequence of frames of data from an acquisition system; comparing successive frames of the data for portions that change between frames; determining whether any of the portions that changed are part of an interaction medium detected in the sequence of frames of data; defining a 3D interaction zone relative to an initial position of the part of the interaction medium detected in the sequence of frames of data; tracking a movement of the interaction medium to generate a plurality of 3D positions of the interaction medium; detecting movement of the interaction medium from inside to outside the 3D interaction zone at a boundary 3D position; shifting the 3D interaction zone relative to the boundary 3D position; computing a plurality of 2D positions based on the 3D positions; and supplying the 2D positions to control an application. | 11-05-2015 |
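The zone-shifting remap described in this abstract (define a 3D interaction zone around the hand's initial position, shift the zone when the hand crosses its boundary, and map in-zone positions to 2D screen coordinates) can be sketched compactly. The square zone, its 0.3 m size, and the 1920×1080 target are assumptions; only the x/y components are remapped here for brevity.

```python
class InteractionZone:
    """Minimal sketch of a hand-following interaction zone that maps
    in-zone 3D positions to 2D screen coordinates."""
    def __init__(self, center, size=0.3, screen=(1920, 1080)):
        self.cx, self.cy = center[0], center[1]
        self.half = size / 2.0
        self.screen = screen

    def _shift_to_contain(self, x, y):
        # Shift the zone just enough that (x, y) lies on its boundary.
        self.cx += max(0.0, x - (self.cx + self.half)) \
                 + min(0.0, x - (self.cx - self.half))
        self.cy += max(0.0, y - (self.cy + self.half)) \
                 + min(0.0, y - (self.cy - self.half))

    def map_position(self, pos):
        x, y = pos[0], pos[1]
        self._shift_to_contain(x, y)
        u = (x - (self.cx - self.half)) / (2 * self.half)
        v = (y - (self.cy - self.half)) / (2 * self.half)
        return round(u * self.screen[0]), round(v * self.screen[1])

zone = InteractionZone(center=(0.0, 0.0, 0.5))
print(zone.map_position((0.0, 0.0, 0.5)))  # centre of zone -> centre of screen
print(zone.map_position((0.3, 0.0, 0.5)))  # exits right; zone shifts with the hand
```

Shifting the zone instead of clamping the cursor is what keeps the finite 2D surface reachable however far the hand drifts, which is the core of the remapping idea.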
20150316997 | METHOD FOR CAPTURING AND TRANSMITTING MOTION DATA - In a method for capturing and transmitting motion data (s | 11-05-2015 |
20150316998 | TOUCHLESS HUMAN MACHINE INTERFACE - A system and method for receiving input from a user is provided. The system includes at least one camera configured to receive an image of a hand of the user and a controller configured to analyze the image and issue a command based on the analysis of the image. | 11-05-2015 |
20150317830 | ENDOSCOPIC SURGERY ASSISTING SYSTEM AND IMAGE CONTROL METHOD - An endoscopic system according to an embodiment of the present technology includes a head-mounted display, a detector, and a controller. The head-mounted display is worn by an operator. The detector is capable of detecting a motion of the operator. The controller causes each of the plurality of head-mounted displays to individually display an image. The controller includes an endoscopic image acquisition unit capable of obtaining endoscopic image data of an affected area of a patient and an image control unit capable of controlling the endoscopic image data based on an output from each of the plurality of detectors. The controller performs control to display the image based on an output from the image control unit. | 11-05-2015 |
20150323987 | Systems And Methods For Selectably Suppressing Computing Input Events Triggered By Variable Pressure And Variable Displacement Sensors - Systems and methods are disclosed herein that may be implemented to selectably suppress computing input events that are generated for an information handling system based on output signals received from a variable pressure or displacement (VPD) sensor that correspond to one or more pressure or displacement zones defined for the VPD sensor. Using the disclosed systems and methods, computing input events based on sensor output signals from one or more given VPD sensing zones may be selectably suppressed and/or withheld during sensor pressure or displacement changes from further host system processing according to a time delay, e.g., as a function of the elapsed time taken for a user to depress or release a given VPD sensor. | 11-12-2015 |
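The time-delay suppression this abstract describes, where events from a pressure zone are withheld until the pressure change has persisted long enough, behaves like a debounce filter. The sketch below is an assumed minimal form: a press on one zone is reported only if the pressure stays above a threshold for a hold time, and shorter excursions are suppressed.

```python
def filter_zone_events(samples, threshold=0.5, min_hold=0.2):
    """Emit a press event for a pressure zone only if the pressure
    stays above `threshold` for at least `min_hold` seconds; shorter
    excursions are suppressed as accidental input. `samples` is a
    list of (timestamp, pressure) pairs for one zone."""
    events, press_start = [], None
    for t, p in samples:
        if p >= threshold:
            if press_start is None:
                press_start = t
            elif t - press_start >= min_hold:
                events.append(("press", press_start))
                press_start = float("inf")  # already reported
        else:
            press_start = None              # released: reset the timer
    return events

# A 0.05 s blip is suppressed; a sustained press is reported once.
samples = [(0.00, 0.1), (0.10, 0.9), (0.15, 0.1),
           (0.50, 0.8), (0.60, 0.8), (0.75, 0.9), (0.90, 0.2)]
print(filter_zone_events(samples))
```

The same gate run per zone gives the selectable, per-zone suppression the abstract describes: each zone's events pass to the host only after its own hold time elapses.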
20150323989 | SYSTEM AND METHOD FOR COMMUNICATION - Computer based systems, methods, and computer readable media for sensing and decoding a sequence of facial expressions exhibited by an individual sender, in order to determine a message encoded by that sequence. Digital image sensor devices may capture facial images of a sending individual. An enrollment process is used by which an enrolled message is associated with a sequence of facial expressions. A background model is used with relevant expression models to locate a region of interest of the individual's face, which models accommodate the stand-off placement of the image sensors relative to the individual. An expression identifier is used with the expression model to classify captured images as including an enrolled facial expression. The classified images may then be decoded to identify the message sent. | 11-12-2015 |
20150323993 | SYSTEMS AND METHODS FOR PROVIDING HAPTIC FEEDBACK FOR REMOTE INTERACTIONS - A system includes a first electronic device and a second electronic device. The first electronic device includes a sensor configured to sense a property experienced by the first electronic device, and a transmitter configured to transmit a signal based on output from the sensor. The second electronic device is in signal communication with the first electronic device. The second electronic device includes a receiver configured to receive the transmitted signal, a detector configured to determine an object that a user of the second device is focusing on, a processor configured to generate a haptic signal representative of the transmitted signal if it is determined that the object the user is focusing on corresponds with a location of the first electronic device, and a haptic output device configured to receive the haptic signal and generate a haptic effect to the user. | 11-12-2015 |
20150323998 | ENHANCED USER INTERFACE FOR A WEARABLE ELECTRONIC DEVICE - Methods, systems and devices are provided for receiving input in a wearable electronic device from positioning an object near the wearable electronic device. Embodiments include an image sensor receiving an image. An input position of the object near the wearable electronic device may be determined with respect to a frame of reference. The determined input position may be one of a plurality of positions defined by a frame of reference and may be associated with an input value. A visual indication regarding the input value may be provided on a display of the wearable electronic device. At least one of an anatomical feature on the wearer and a received reference input on the anatomical surface may be used to determine the frame of reference. | 11-12-2015 |
20150323999 | INFORMATION INPUT DEVICE AND INFORMATION INPUT METHOD - An information input device has: a display part; an area setting part for setting input instruction areas for an operator to give input instructions; an obtainment part for obtaining a situation of the operator to give the input instructions; and a control part for distinctively arranging a selection area and a decision area in input instruction areas in response to motions of both hands of the operator determined based on information on the obtained situation. The selection area is related to a partial area of an entire display area of a display part and for receiving a selecting operation by the operator in the partial area, and the decision area is for receiving a deciding operation by the operator. | 11-12-2015 |
20150324000 | USER INPUT METHOD AND PORTABLE DEVICE - Provided are method and apparatus of defining at least a portion of a vicinity area of a portable device as an input area and controlling the portable device based on a user input provided on the input area, and the portable device enabling the method, wherein the portable device includes a sensing unit configured to sense a user input in a vicinity area of the portable device, a recognizer configured to recognize a user gesture corresponding to the user input, and an output unit configured to output a control instruction corresponding to the recognized user gesture to control the portable device. | 11-12-2015 |
20150324001 | SYSTEMS AND TECHNIQUES FOR USER INTERFACE CONTROL - Embodiments of systems and techniques for user interface (UI) control are disclosed herein. In some embodiments, a UI control system may determine locations of landmarks on a body of a user of a computing system, determine a pointer based at least in part on the landmark locations, and identify a UI element of a UI of the computing system based at least in part on the pointer. Other embodiments may be described and/or claimed. | 11-12-2015 |
20150324004 | ELECTRONIC DEVICE AND METHOD FOR RECOGNIZING GESTURE BY ELECTRONIC DEVICE - An electronic device and a method for recognizing a gesture by the electronic device are provided. The method includes sensing a change amount of a signal strength received through one or more channels by using a gesture sensor including the one or more channels, generating valid data according to the sensed change amount of the signal strength, recognizing a speed of the gesture according to the generated valid data, and determining the gesture according to the sensed change amount of the signal strength and the generated valid data. | 11-12-2015 |
20150324006 | DISPLAY CONTROL DEVICE - A direction detection portion is attached to a lever switch that is attached to a steering column. The direction detection portion detects in a discriminable manner operations of the lever switch in a rotation direction along an outer circumference of a steering wheel, in a circumferential direction around the axis of the lever switch, in an axial direction of the lever switch and in a forward-backward direction of the steering wheel. A display portion displays image information. A control portion controls display of the display portion and moves the image information displayed on the display portion in a direction according to an operation direction of the lever switch detected by the direction detection portion. | 11-12-2015 |
20150324014 | PORTABLE ELECTRONIC DEVICE AND ROTATION DETECTION METHOD THEREOF - A portable electronic device including a display device, an acceleration sensor and a processing unit is provided. The display device has a display surface. The acceleration sensor has a plurality of detecting sensor arrays for detecting a three-dimensional acceleration, and a shift angle smaller than 90 degrees is formed between at least one of the detecting sensor arrays and the display surface. The processing unit is configured to calculate a tilt angle according to the three-dimensional acceleration. | 11-12-2015 |
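The tilt computation described in 20150324014 can be illustrated with a short sketch: given a three-dimensional acceleration reading, the tilt of the display surface follows from the angle between the gravity vector and the display normal. This is an illustrative reconstruction, not the patent's method; the function name, the axis convention, and the assumption that the device is at rest (so the accelerometer reads only gravity) are all ours.

```python
import math

def tilt_angle(ax, ay, az):
    """Tilt of the display surface from horizontal, in degrees.

    Illustrative assumptions: the device is at rest, so the accelerometer
    reads only gravity, and the z-axis is the display-surface normal.
    """
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0.0:
        raise ValueError("zero acceleration vector")
    # Angle between gravity and the display normal (z-axis), clamped
    # against floating-point drift before acos.
    cos_t = max(-1.0, min(1.0, az / g))
    return math.degrees(math.acos(cos_t))

print(tilt_angle(0.0, 0.0, 9.81))  # device flat: 0.0
print(tilt_angle(0.0, 9.81, 0.0))  # device upright: 90.0
```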
20150324022 | INPUT DEVICE - There is provided an information input device including an operator on which a user performs a sliding operation in a first direction, a first detection unit, disposed at a rear surface of the operator, that detects a position and a pressure of the sliding operation performed by the user in the first direction, a second detection unit, disposed adjacent to the first detection unit and parallel to the first direction at the rear surface of the operator, that detects a position and a pressure of the sliding operation performed by the user in the first direction, and a position measurement unit that measures an instructed position in the first direction based on the slide position detected by the first detection unit or the second detection unit, and measures an instructed position in a second direction based on a difference of the pressures detected by the first detection unit and the second detection unit, respectively. | 11-12-2015 |
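The two-strip measurement in 20150324022 lends itself to a small sketch: the first-direction coordinate comes from whichever strip detects a slide position, while the second-direction coordinate is derived from the pressure difference between the two strips. The function name and the direct pressure-difference-to-coordinate mapping are illustrative assumptions, not the patent's specification.

```python
def instructed_position(pos1, p1, pos2, p2):
    """Derive a 2-D instructed position from two parallel strip sensors.

    pos1/pos2: slide positions along the first direction (either may be
    None if that strip reports no position); p1/p2: pressures on each strip.
    Illustrative: the second-direction coordinate is simply the pressure
    difference (positive means pressing harder toward strip 1).
    """
    x = pos1 if pos1 is not None else pos2
    y = p1 - p2
    return x, y

print(instructed_position(0.4, 2.0, None, 0.5))  # (0.4, 1.5)
```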
20150324049 | SYSTEM AND METHOD FOR OPTIMIZING HAPTIC FEEDBACK - Methods, systems, computer-readable media, and apparatuses for adjusting the manner in which haptic feedback is provided to the user based on physical characteristics of the user. Physical characteristics may include stable physical characteristics that are non-changing with respect to a level of physical activity of the user. Examples of such stable physical characteristics may include age, gender, race, visual impairments and/or other physical characteristics. In some embodiments, the mobile device may adjust the haptic feedback by adjusting the intensity of the haptic feedback, frequency of the haptic feedback, duration for which the haptic feedback is provided to the user and changing a type of haptic feedback provided to the user of the device. | 11-12-2015 |
20150324051 | TOUCH BUTTON - The present invention relates to the technical field of electronic switch control and discloses a touch button, which comprises a panel and a hollow pad module; wherein the panel is provided with a button area, and the pad module is arranged at an edge of the button area of the panel; the touch button further comprises at least one sensor configured to sense a pressure, and the sensor is arranged inside the pad module and abuts against the panel. With such a touch button, the panel can be made of any material, which enlarges the range of applications of the touch button. Moreover, the touch button is durable and attractive in appearance, identifies touch pressures accurately, has high sensitivity, and is convenient in actual use. | 11-12-2015 |
20150324168 | PORTABLE TERMINAL DEVICE AND INFORMATION PROCESSING SYSTEM - This invention is provided with: a memory unit for storing lip movement recognition data; an imaging unit for capturing a video including at least the lip portion of the operator; a lip movement recognition unit for performing a comparison between data representing the movement of the lip portion of the operator obtained from the imaging unit and the lip movement recognition data, and thereby recognizing the operation to execute; and a controller for performing the operation recognized by the lip movement recognition unit. | 11-12-2015 |
20150324562 | USER AUTHENTICATION ON DISPLAY DEVICE - Embodiments are disclosed that relate to authenticating a user of a display device. For example, one disclosed embodiment includes displaying one or more virtual images on the display device, wherein the one or more virtual images include a set of augmented reality features. The method further includes identifying one or more movements of the user via data received from a sensor of the display device, and comparing the identified movements of the user to a predefined set of authentication information for the user that links user authentication to a predefined order of the augmented reality features. If the identified movements indicate that the user selected the augmented reality features in the predefined order, then the user is authenticated, and if the identified movements indicate that the user did not select the augmented reality features in the predefined order, then the user is not authenticated. | 11-12-2015 |
20150331454 | METHOD OF CONTROLLING OUTPUT OF SCREEN OF FLEXIBLE DISPLAY AND PORTABLE TERMINAL SUPPORTING THE SAME - A terminal supporting control of an output of a screen of a flexible display is provided. The terminal includes a flexible display configured such that at least a portion of the flexible display is deflected or folded, a sensor unit for collecting a sensor signal for detecting a location where the flexible display is deflected or folded, and a controller for determining, when an input event is received, a specific area of a functional screen or at least one functional screen from among a plurality of functional screens which are output before the flexible display is deflected or folded as a content viewing area, and for outputting the determined content viewing area in a specific display area of the folded flexible display. | 11-19-2015 |
20150331480 | IMAGING ARRANGEMENT FOR OBJECT MOTION DETECTION AND CHARACTERIZATION - A vision sensor includes a sensor assembly and a dedicated microprocessor. The sensor assembly includes a pixel array and a lens assembly that is optically coupled with the pixel array. The lens assembly has an F#<2, a total track length less than 4 mm, and a field of view of at least +/−20 degrees. The dedicated microprocessor is configured to perform computer vision processing computations based on image data received from the sensor assembly and includes an interface for a second processor. | 11-19-2015 |
20150331481 | METHOD AND SYSTEM FOR INTERACTION BETWEEN PROJECTOR AND CAMERA - A method and system for interaction between a projector and a camera, the method comprising: fixing the positions of a projector and a camera such that the projector and the camera are relatively still, the camera being an infrared camera; determining a first optical axis of the projector and a second optical axis of the camera; allocating a first lens at the intersection of the first optical axis and the second optical axis to process the first optical axis and the second optical axis, such that the first optical axis and the second optical axis are coincident after passing through the first lens. | 11-19-2015 |
20150331482 | DISPLAY DEVICE - According to one embodiment, a display device includes a display unit, an optical unit, and a reflector. The display unit includes a plurality of pixels arranged in a first plane. The display unit emits light including image information. At least a portion of the light emitted by the display unit is incident on the optical unit. The optical unit includes a first optical element. A travel direction of the at least the portion of the light is modified by the first optical element. The reflector reflects the at least the portion of the light modified by the first optical element. A perpendicular direction perpendicular to the first plane is non-parallel to an optical axis of the first optical element. | 11-19-2015 |
20150331483 | SYSTEM AND METHOD FOR SIMULATING A USER PRESENCE - A system and method for simulating a presence between a first user in a first pod at a first location and a second user in a second pod at a second location is provided. A first user suit is disposed in the first pod and a second user suit is disposed in the second pod. A first controller unit is electrically connected with the first pod and a second controller unit is electrically connected with the second pod. At least one input sensor is disposed in each of the first and second pods. Output devices are disposed in the first and second pods. A mainframe is electrically connected with the first and second controller units for receiving inputs from the input sensors and for providing instructions for activating the output devices to simulate the interaction of the first and second users with one another. | 11-19-2015 |
20150331484 | EYE TRACKING LASER POINTER - An embodiment provides a method, including: detecting, using a gaze tracking system of a device, a position of user gaze with respect to a display; sending, through a display interface, display information including an added visual element associated with the position of user gaze; tracking, using the gaze tracking system, change in the position of user gaze with respect to the display; and moving, using a processor, the added visual element responsive to change in the position of user gaze with respect to the display. Other aspects are described and claimed. | 11-19-2015 |
20150331485 | GAZE DETECTION CALIBRATION - Examples relating calibrating an estimated gaze location are disclosed. One example method comprises monitoring the estimated gaze location of a viewer using gaze tracking data from a gaze tracking system. Image data for display via a display device is received and, without using input from the viewer, at least one target visual that may attract a gaze of the viewer and a target location of the target visual are identified within the image data. The estimated gaze location of the viewer is compared with the target location of the target visual. An offset vector is calculated based on the estimated gaze location and the target location. The gaze tracking system is calibrated using the offset vector. | 11-19-2015 |
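The calibration step in 20150331485 reduces to simple vector arithmetic: subtract the estimated gaze location from the target location to obtain the offset vector, then apply that offset to subsequent estimates. A minimal sketch, with 2-D pixel coordinates and the function names assumed, not taken from the patent:

```python
def calibration_offset(estimated, target):
    """Offset vector from an estimated gaze location to the target
    location of the target visual, as (dx, dy) in pixels."""
    return (target[0] - estimated[0], target[1] - estimated[1])

def apply_calibration(estimated, offset):
    """Correct a subsequent gaze estimate using the stored offset."""
    return (estimated[0] + offset[0], estimated[1] + offset[1])

# The viewer's estimated gaze missed the target visual by (12, -12):
off = calibration_offset((100, 210), (112, 198))
print(off)                                 # (12, -12)
print(apply_calibration((300, 400), off))  # (312, 388)
```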
20150331486 | IMAGE PROCESSING DEVICE, IMAGE PROCESSING METHOD AND PROGRAM - Provided is an image processing device including an eye-gaze direction detection unit configured to detect an eye-gaze direction of a user toward an image, an estimation unit configured to estimate a gaze area in the image on the basis of the eye-gaze direction and the image, the eye-gaze direction being detected by the eye-gaze direction detection unit, a chased object detection unit configured to detect a chased object being eye-chased by the user in the image, on the basis of the time-series gaze areas estimated by the estimation unit, a tracking unit configured to search for and track the chased object detected by the chased object detection unit, and an image control unit configured to control an image of the chased object tracked by the tracking unit. | 11-19-2015 |
20150331487 | INFOTAINMENT SYSTEM - Embodiments are disclosed for vehicle systems. An example system for a vehicle includes … | 11-19-2015 |
20150331489 | COLLABORATIVE INTERACTIVE DEVICES - There is disclosed a method of providing a collaborative display, comprising: displaying an image on a first display associated with a first computing device; providing a second computing device having a second display; mapping the coordinates of the second display to the first display; in dependence on a current location of the second display relative to the first display, controlling the display of content on the second display. | 11-19-2015 |
20150331490 | VOICE RECOGNITION DEVICE, VOICE RECOGNITION METHOD, AND PROGRAM - By recognizing visual trigger events to determine start points and/or end points of voice data signals, the negative effects of noise on voice recognition may be significantly minimized. The visual trigger events may be predetermined gestures and/or predetermined postures of a user captured by a camera, which allow a system to appropriately focus attention on a user to optimize the receipt of a voice command in a noisy environment. This may be accomplished through the assistance of visual feedback complementing the voice feedback provided to the system by the user. Since the visual trigger events are predetermined gestures and/or postures, the system may be able to distinguish which sounds produced by a user are voice commands and which sounds produced by the user are noise that is unrelated to the operation of the system. | 11-19-2015 |
20150331492 | METHOD AND APPARATUS FOR IDENTIFYING SPATIAL GESTURE OF USER - Provided is a method and apparatus for identifying a spatial gesture of a user that may recognize a reference image of a user from a three-dimensional (3D) space in which a gesture of the user is performed, divide the 3D space into a plurality of partitioned spaces based on the reference image, and identify the gesture of the user in the plurality of partitioned spaces, based on the reference image. | 11-19-2015 |
20150331493 | Wearable Input Device - A computer input method and device are disclosed. The method and device detect the bending motion of a user's fingers and provide the computer system of an electronic device with an immediate input associated with this bending motion. The electronic device can be a computer, tablet, mobile phone, optical head mounted computer display or the like. The present method and device are used to serve various gaming, entertainment, simulation and medical applications. | 11-19-2015 |
20150331496 | FLEXIBLE DISPLAY APPARATUS AND CONTROLLING METHOD THEREOF - A flexible display apparatus is provided. The flexible display apparatus includes a display configured to display content on a screen, a sensor configured to detect bending of the display from a first form to a second form, and a controller configured to reconstruct the content based on the bending and to display the reconstructed content in a first screen generated in one region of the display when it is determined that the display is restored to the first form. | 11-19-2015 |
20150331668 | NON-CONTACT GESTURE CONTROL METHOD, AND ELECTRONIC TERMINAL DEVICE - A method includes: receiving an A1 gesture motion of a user, where the A1 gesture motion is not in contact with the electronic terminal device; obtaining an A1 control instruction corresponding to the A1 gesture motion, where the A1 control instruction is used to control the electronic terminal device; obtaining an A2 control instruction of the user within a preset delay period, where the preset delay period is less than three seconds and the A2 control instruction is used to control the electronic terminal device; and comparing the A1 control instruction with the A2 control instruction, where if the A1 control instruction is consistent with the A2 control instruction, the electronic terminal device does not perform an operation corresponding to the A2 control instruction; and if the A1 control instruction is inconsistent with the A2 control instruction, the electronic terminal device performs the operation corresponding to the A2 control instruction. | 11-19-2015 |
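The suppression logic in 20150331668 can be sketched as a small state machine: remember the last control instruction and its timestamp, and skip execution of a follow-up instruction that matches it within the preset delay period. The class name, the chosen delay value, and the instruction strings below are illustrative assumptions.

```python
import time

PRESET_DELAY = 2.0  # seconds; the abstract requires less than three

class GestureFilter:
    """Suppress a repeated non-contact gesture instruction.

    A follow-up instruction arriving within the preset delay that is
    consistent with the previous one is treated as a repeat and not
    executed; an inconsistent one is executed. Illustrative sketch only.
    """
    def __init__(self):
        self._last = None  # (instruction, timestamp)

    def handle(self, instruction, now=None):
        now = time.monotonic() if now is None else now
        last = self._last
        self._last = (instruction, now)
        if last and now - last[1] <= PRESET_DELAY and instruction == last[0]:
            return False  # consistent with A1: do not perform again
        return True       # perform the operation

f = GestureFilter()
print(f.handle("volume_up", now=0.0))    # True  (A1 executes)
print(f.handle("volume_up", now=1.0))    # False (A2 matches A1 within delay)
print(f.handle("volume_down", now=1.5))  # True  (A2 differs, executes)
```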
20150332075 | WEARABLE DEVICES FOR COURIER PROCESSING AND METHODS OF USE THEREOF - The disclosed embodiments include wearable devices and methods for performing courier services. In one implementation, the device includes a depth camera for detecting object depths in a field of view, a scanner for decoding visual codes, a speaker for producing audible sounds in response to an electrical signal, memory, and a processor. The processor may execute instructions to detect a scanning event based on a first signal received from the depth camera, determine a scan region associated with the scanning event, provide a second signal to the scanner causing the scanner to decode a visual code located within the scan region, generate scan data based on a third signal received from the scanner, and provide a fourth signal to the speaker causing the speaker to emit a notification sound. The wearable device may also capture signatures, dimension objects, and disable device functions based on time and place restrictions. | 11-19-2015 |
20150332085 | METHOD AND SYSTEM FOR ADAPTIVE ADJUSTMENT OF TERMINAL - The present invention relates to the technical field of smart household electric appliances and discloses a method and system for adaptive adjustment of a terminal. The method comprises the steps of: capturing in real time a current image of a user; evaluating based on the image a current use state of the user and a duration of the use state; performing an automatic control of the terminal based on the use state and its duration. According to the present invention, the user's current use state with respect to an electronic terminal is automatically detected through image identification, and based on the information on the state duration, the electronic terminal is controlled to perform a targeted adjustment, thereby not only saving energy effectively, but also facilitating user operation and enhancing the user experience. | 11-19-2015 |
20150338888 | FOLDABLE DEVICE AND METHOD OF CONTROLLING THE SAME - A foldable device includes: a flexible display configured to display an execution screen of an application; and a controller configured to control the flexible display to display an execution screen of at least one first application on a first surface of the flexible display that is used as a display of the foldable device when the flexible display is folded, and control, in response to the flexible display being unfolded, the flexible display to display the execution screen of the at least one first application and an execution screen of at least one second application related to the at least one first application to be displayed on a second surface of the flexible display that is used as a display of the foldable device when the flexible display is unfolded. | 11-26-2015 |
20150338917 | DEVICE, SYSTEM, AND METHOD OF CONTROLLING ELECTRONIC DEVICES VIA THOUGHT - A method of controlling an electronic device via thought includes: capturing through one or more electrodes, located in proximity to a brain of a user, signals of brainwave activity of said user; analyzing said signals to detect a pattern of brainwave activity of said user; based on the detected pattern, determining that the user thinks about a command that controls an electronic device; and based on said determining, triggering the electronic device to perform said command. | 11-26-2015 |
20150338918 | EVALUATION OF DIGITAL CONTENT USING INTENTIONAL USER FEEDBACK OBTAINED THROUGH HAPTIC INTERFACE - Systems and methods are provided for evaluating the quality of automatically composed digital content based on intentional user feedback obtained through a haptic interface. For example, a method includes accessing intentional user feedback collected by a haptic interface executing on a computing device, wherein the intentional user feedback provides an indication as to a user's reaction toward digital content that the user interacts with on the computing device. The digital content includes content that is automatically generated using content generation rules. The method further includes evaluating a quality of the digital content based on the intentional user feedback, and generating an evaluation report that includes information providing an evaluation of the quality of the digital content. | 11-26-2015 |
20150338919 | PROVIDING HAPTIC OUTPUT BASED ON A DETERMINED ORIENTATION OF AN ELECTRONIC DEVICE - Embodiments of the present disclosure provide a system and method for providing haptic output for an electronic device. In certain embodiments, a type of haptic output is provided based on a determined orientation, position, and/or operating environment of the electronic device. Specifically, the electronic device may receive input from one or more sensors associated with the electronic device. Once the input from the one or more sensors is received, an orientation, position and/or operating environment of the electronic device is determined. Based on the determined orientation of the electronic device, a type of haptic output is selected and provided. | 11-26-2015 |
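The selection step in 20150338919 amounts to a lookup from the determined orientation to a haptic output type. A minimal sketch, where the specific mapping, the orientation labels, and the (type, intensity) representation are all assumptions, not taken from the abstract:

```python
def select_haptic(orientation):
    """Map a determined device orientation to a haptic output type.

    Returns an illustrative (type, intensity) pair; the concrete mapping
    below is hypothetical, chosen only to show the selection mechanism.
    """
    table = {
        "face_up": ("vibration", 1.0),
        "face_down": ("vibration", 0.3),    # muffled against a surface
        "in_pocket": ("strong_pulse", 1.0),
    }
    # Unknown orientations fall back to a default output.
    return table.get(orientation, ("vibration", 0.5))

print(select_haptic("in_pocket"))  # ('strong_pulse', 1.0)
print(select_haptic("sideways"))   # ('vibration', 0.5)
```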
20150338924 | INFORMATION PROCESSING APPARATUS AND METHOD OF CONTROLLING THE SAME - A distance measuring unit measures, from an image picked up by an image pickup unit, a distance to a tip portion of a finger. When the measured distance is less than a reference distance, a CPU displays an identification mark at a position corresponding to the finger on a display image, and takes in a locus of the finger as a handwritten character/figure. When the measured distance is the reference distance or more, the CPU hides the identification mark, and does not take in the locus of the finger as the handwritten character/figure. | 11-26-2015 |
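The distance gating in 20150338924 can be sketched as a single threshold test: below the reference distance, the fingertip position is appended to the handwritten locus and the identification mark is shown; at or above it, the point is discarded and the mark is hidden. The threshold value and all names are illustrative assumptions.

```python
REFERENCE_DISTANCE = 30.0  # mm; illustrative value, not from the patent

def process_fingertip(distance, position, strokes):
    """Gate handwriting capture on the measured fingertip distance.

    Returns True when the identification mark should be displayed and
    the point has been taken into the handwritten locus, False when the
    mark should be hidden and the point discarded.
    """
    if distance < REFERENCE_DISTANCE:
        strokes.append(position)
        return True
    return False

strokes = []
print(process_fingertip(12.0, (40, 55), strokes))  # True: point recorded
print(process_fingertip(45.0, (60, 70), strokes))  # False: point discarded
print(strokes)  # [(40, 55)]
```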
20150338925 | CAUSING GESTURE RESPONSES ON CONNECTED DEVICES - A system and method to cause associated connected devices to perform gestures in response to user interactions with a connected device paired with the associated connected devices. The method includes receiving a gesture signal indicating a user interaction with a first connected device, identifying one or more connected devices associated with the first connected device, and, based on the user interaction, generating and transmitting a response signal to cause the one or more associated connected devices to perform a specified gesture signifying the user interaction with the first connected device. | 11-26-2015 |
20150338926 | WEARABLE DEVICE AND METHOD OF CONTROLLING THE SAME - Disclosed are a wearable device and a method of controlling the wearable device. The wearable device includes a sensor unit configured to detect at least one movement of the wearable device, and a control unit configured to determine a state of the wearable device based on the at least one detected movement, and to control, based on the determined state, the wearable device to operate in a first mode of performing a preset function of the wearable device or in a second mode of not performing the preset function. | 11-26-2015 |
20150338999 | MANAGING SENSORY INFORMATION OF A USER DEVICE - External mobile device sensors may be provided that are configured to manage sensory information associated with motion of objects external to the mobile device. In some examples, the object motion may be detected independent of contact with the device. In some examples, a device may include a screen with a first sensor (e.g., a touch sensor). The device may also include at least a second sensor external to the screen. Instructions may be executed by a processor of the device to at least determine when an object is hovering over a first graphical user interface (GUI) element of the screen. Additionally, in some cases, a second GUI element may be provided on the screen such that the second GUI element is rendered on the screen adjacent to a location under the hovering object. | 11-26-2015 |
20150339021 | CONTROL OF A HOST STATION THROUGH MOVEMENT OF A MOVING DEVICE - A method for controlling a host station for at least one moving device, including the following steps: determining a current position of the at least one moving device relative to an interaction surface of the host station using at least one electrical signal induced by at least one inductive magnetic field in at least one electric circuit of the host station associated with the interaction surface; recording the current position in a position history; using the history to compare a change of the position of the at least one moving device with at least one reference change diagram, the reference change diagram being associated with at least one reference movement; and activating a command of the host station on the basis of the reference movement, if the change corresponds to the reference movement. | 11-26-2015 |
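The history comparison in 20150339021 can be approximated by comparing successive position deltas against those of a reference movement within a tolerance; this is a deliberate simplification of the "reference change diagram", and all names and the tolerance are assumptions.

```python
def matches_reference(history, reference, tol=1.0):
    """Compare a recorded position history against a reference movement.

    Both arguments are lists of (x, y) points. The comparison is on
    position *changes*: each recorded delta must match the corresponding
    reference delta within the tolerance, so the absolute start point of
    the movement does not matter. Illustrative simplification only.
    """
    if len(history) != len(reference):
        return False
    dh = [(b[0] - a[0], b[1] - a[1]) for a, b in zip(history, history[1:])]
    dr = [(b[0] - a[0], b[1] - a[1]) for a, b in zip(reference, reference[1:])]
    return all(abs(x1 - x2) <= tol and abs(y1 - y2) <= tol
               for (x1, y1), (x2, y2) in zip(dh, dr))

# A rightward swipe recorded with small jitter matches the reference:
print(matches_reference([(0, 0), (5, 0.3), (10, 0)],
                        [(2, 0), (7, 0), (12, 0)]))  # True
```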
20150339100 | ACTION DETECTOR, METHOD FOR DETECTING ACTION, AND COMPUTER-READABLE RECORDING MEDIUM HAVING STORED THEREIN PROGRAM FOR DETECTING ACTION - An action detector that detects an action of a limb includes a processor. The processor is configured to execute a process of extracting, as time-series data, a cepstrum coefficient of vibration generated by the action of the limb; generating time-division data by time-dividing the time-series data; and classifying a basic unit of the action corresponding to each piece of the time-division data on the basis of the cepstrum coefficient included in the time-division data. | 11-26-2015 |
20150339805 | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING SYSTEM, AND INFORMATION PROCESSING METHOD - An imaging device includes a first camera and a second camera and shoots the same object under different shooting conditions. A shot-image data acquirer of an image analyzer acquires data of two images simultaneously shot from the imaging device. A correcting section aligns the distributions of the luminance value between the two images by carrying out correction for either one of the two images. A correction table managing section switches and generates a correction table to be used according to the function implemented by the information processing device. A correction table storage stores the correction table showing the correspondence relationship between the luminance values before and after correction. A depth image generator performs stereo matching by using the two images and generates a depth image. | 11-26-2015 |
20150342377 | Pillow for displaying imagery and playing associated audio - A patient comfort pillow has an outer surface that carries an image that is meaningful to a patient, and has an interior space carrying an acoustic driver that acoustically outputs an audio recording associated with the image and to which the patient can listen in response to physical handling directed at the image by the patient. The patient comfort pillow may include a pouch made of transparent material on the outer surface to hold a photograph bearing the image for viewing through the transparent material. Alternatively, the image may be printed on or sewn into the outer surface. The patient comfort pillow may include a microphone carried within the interior space to record the audio recording. Other enhancements are disclosed, such as an antenna to wirelessly receive the audio recording from another device. | 12-03-2015 |
20150346489 | LifeBoard - Series Of Home Pages For Head Mounted Displays (HMD) That Respond to Head Tracking - To assist with hands-free computing, the Head Mounted Display or Headset Computer utilizes a series of user-configurable Home Pages that contain the shortcuts and widgets the user wants, navigated through one or more head movements. This allows the user to design a user interface environment which presents the desired information in the desired order. | 12-03-2015 |
20150346810 | Generating And Providing Immersive Experiences To Users Isolated From External Stimuli - An immersive experience system provides interactive content to a user while isolating the user from external stimuli in an isolation chamber. The user floats in a high-density suspension liquid maintained at the user's body temperature while consuming the interactive content which can provide audio, video and tactile inputs to the user. The user can interact with the immersive experience system via different modalities such as eye movements or gestures or via providing thought input through a visual cortex thought recorder comprised in the immersive experience system. | 12-03-2015 |
20150346811 | DISPLAY SYSTEM, DISPLAY APPARATUS, DISPLAY CONTROL APPARATUS, DISPLAY METHOD, AND PROGRAM - A display system has a display apparatus configured to be attached to a head of the user, and a display control apparatus connected to the display apparatus. The display control apparatus has a detection part that detects a tilt of the user, and a transmitting part that transmits a first tilt image representing the tilt to the display apparatus, and the display apparatus has a receiving part that receives the first tilt image from the display control apparatus, and a display part that displays the first tilt image. | 12-03-2015 |
20150346812 | METHODS AND APPARATUS FOR RECEIVING CONTENT AND/OR PLAYING BACK CONTENT - Content delivery and playback methods and apparatus are described. The methods and apparatus are well suited for delivery and playback of content corresponding to a 360 degree environment and can be used to support streaming and/or real time delivery of 3D content corresponding to an event, e.g., while the event is ongoing or after the event is over. Portions of the environment are captured by cameras located at different positions. The content captured from different locations is encoded and made available for delivery. A playback device selects the content to be received based on a user's head position. | 12-03-2015 |
20150346814 | GAZE TRACKING FOR ONE OR MORE USERS - One or more techniques and/or systems are provided for gaze tracking of one or more users. A user tracking component (e.g., a depth camera or a relatively lower resolution camera) may be utilized to obtain user tracking data for a user. The user tracking data is evaluated to identify a spatial location of the user. An eye capture camera (e.g., a relatively higher resolution camera) may be selected from an eye capture camera configuration based upon the eye capture camera having a view frustum corresponding to the spatial location of the user. The eye capture camera may be invoked to obtain eye region imagery of the user. Other eye capture cameras within the eye capture camera configuration are maintained in a powered down state to reduce power and/or bandwidth consumption. Gaze tracking information may be generated based upon the eye region imagery, and may be used to perform a task. | 12-03-2015 |
20150346818 | SYSTEM AND METHOD FOR DETECTING MICRO EYE MOVEMENTS IN A TWO DIMENSIONAL IMAGE CAPTURED WITH A MOBILE DEVICE - A system and method for using a mobile device to capture an image of eyes of a person and calculate a location or coordinates of one or each of the irises in a first frame and a location or coordinate of the same iris in a later frame. A vector may be calculated between the location or coordinate of a first iris in a first frame and a location or coordinate of the first iris in a second frame. A second eye vector may be calculated between the location or coordinate of a second iris in a first frame and such second iris in a second frame. An angle between the first iris vector and the second iris vector may be calculated. Such angle may be used as an indication of a physiological problem. | 12-03-2015 |
20150346820 | Radar-Based Gesture-Recognition through a Wearable Device - This document describes techniques and devices for radar-based gesture-recognition through a wearable device. The techniques enable an easy-to-use input interface through this wearable radar device, in contrast to small or difficult-to-use input interfaces common to wearable computing devices. Further, these techniques are not limited to interfacing with wearable computing devices, but may aid users in controlling various non-wearable devices, such as to control volume on a stereo, pause a movie playing on a television, or select a webpage on a desktop computer. | 12-03-2015 |
20150346824 | Electronic Devices with Low Power Motion Sensing and Gesture Recognition Circuitry - A user may provide input to a handheld electronic device by making gestures with the handheld electronic device. The input gestures may be used to generate remote control signals for external equipment. The electronic device may include circuitry for detecting a user's movement of the handheld electronic device. The circuitry may include an accelerometer operable in normal and low power modes and a processor operable in sleep and wake modes. Motion data gathered in low power mode may be stored in a dedicated low power mode data storage structure. Motion analysis circuitry may detect a user's movement of the electronic device based on a standard deviation of the motion data. Upon detecting a user's movement of the handheld electronic device, the processor may be woken up to analyze the motion data stored in the low power mode data storage structure and to confirm the user's movement of the electronic device. | 12-03-2015 |
20150346827 | Mobile Control Unit and Method for Providing a Gesture Control of a Mobile Control Unit - A mobile control unit having an activation element, in which the mobile control unit is designed for performing a gesture control and the gesture control can be activated and/or deactivated by means of the activation element. The mobile control unit has a display device, wherein the display device is designed for indicating an activation state of the gesture control. Furthermore, a method for providing a gesture control of a mobile control unit, wherein the mobile control unit comprises at least an activation element and a display device. | 12-03-2015 |
20150346828 | GESTURE CONTROL METHOD, GESTURE CONTROL MODULE, AND WEARABLE DEVICE HAVING THE SAME - A gesture control method, a gesture control module, and a wearable device having the same. The gesture control method includes the following steps: executing a setting procedure, including: capturing a hand image; identifying a plurality of identification points from the hand image; recording at least one relative distance between each identification point and other identification points; and executing a controlling procedure, including: determining whether a quantity of the identification points or their relative distances have changed; and if yes, generating a corresponding command according to changes in the quantity of the identification points or their relative distances. | 12-03-2015 |
20150346829 | CONTROL METHOD OF ELECTRONIC APPARATUS HAVING NON-CONTACT GESTURE SENSITIVE REGION - A control method of an electronic apparatus is provided. The electronic apparatus has a non-contact gesture sensitive region. The control method includes: identifying at least one object type of at least one non-contact object within the non-contact gesture sensitive region in a plurality of object types; determining respective numbers of non-contact objects corresponding to the identified at least one object type; detecting motion information of the at least one non-contact object within the non-contact gesture sensitive region; recognizing a non-contact gesture corresponding to the at least one non-contact object according to the identified at least one object type, the respective numbers of non-contact objects and the motion information; and enabling the electronic apparatus to perform a specific function according to the non-contact gesture. | 12-03-2015 |
20150346830 | CONTROL METHOD OF ELECTRONIC APPARATUS HAVING NON-CONTACT GESTURE SENSITIVE REGION - A control method of an electronic apparatus is provided. The electronic apparatus includes a display surface, and provides a gesture sensitive region near the display surface. The control method includes the following steps: determining motion information of a non-contact object around the electronic apparatus, wherein the non-contact object moves between an inside and an outside of the gesture sensitive region to generate the motion information; recognizing a non-contact gesture corresponding to the non-contact object according to the motion information; and enabling the electronic apparatus to perform a specific function according to the non-contact gesture. | 12-03-2015 |
20150346832 | METHODS AND APPARATUS FOR DELIVERING CONTENT AND/OR PLAYING BACK CONTENT - Content delivery and playback methods and apparatus are described. The methods and apparatus are well suited for delivery and playback of content corresponding to a 360 degree environment and can be used to support streaming and/or real time delivery of 3D content corresponding to an event, e.g., while the event is ongoing or after the event is over. Portions of the environment are captured by cameras located at different positions. The content captured from different locations is encoded and made available for delivery. A playback device selects the content to be received based on a user's head position. | 12-03-2015 |
20150346835 | PROCESSING OF GESTURE-BASED USER INTERACTIONS USING VOLUMETRIC ZONES - Systems and methods for processing gesture-based user interactions within an interactive display area are provided. The display of one or more virtual objects and user interactions with the one or more virtual objects may be further provided. Multiple interactive areas may be created by partitioning an area proximate a display into multiple volumetric spaces or zones. The zones may be associated with respective user interaction capabilities. A representation of a user on the display may change as the ability of the user to interact with one or more virtual object changes. | 12-03-2015 |
20150346836 | METHOD FOR SYNCHRONIZING DISPLAY DEVICES IN A MOTOR VEHICLE - A method synchronizes two display devices in a motor vehicle. The first display device already displays a first data set which comprises a first display content relating to a predetermined thematic context. A sensor device, in particular a PMD camera, detects, as a control gesture of a user, a free movement in space of a body part of the user. A signal which is then generated by the sensor device and which describes the control gesture is transmitted to a gesture recognition device which generates a control signal in accordance with the determined control gesture. In accordance with the control signal, a control device determines the predetermined thematic context of the determined data set. Using the defined predetermined thematic context, an additional data set which includes a second display content of the predetermined thematic context is provided. This additional data set is displayed on the second display device. | 12-03-2015 |
20150346843 | MULTI-FUNCTIONAL PORTABLE DEVICE CONTROLLED BY EXTERNAL INFORMATION - A multi-functional portable device including a display module driven by an actuator and configured to be able to display one or more display functions simultaneously or sequentially. A control unit is configured to receive outside information and control the display module according to said one or more display functions, based on the received outside information. The device also includes a computer medium including portions of code for a software application having a plurality of display functions, each of the display functions being configured to be executed in the multifunctional device. | 12-03-2015 |
20150347821 | DISPLAY SCREEN CONTROLLING APPARATUS IN MOBILE TERMINAL AND METHOD THEREOF - A method for controlling a display screen of a mobile terminal includes acquiring a first image and a second image distinguished from the first image, using a cut-off filter provided in the camera attached to a front of the mobile terminal to cut off one of R (Red), G (Green) and B (Blue) signals, recognizing an object comprising a user's face or gesture based on the first image and second image, and controlling on/off of the display screen provided in the mobile terminal based on the result of the recognition. | 12-03-2015 |
20150348495 | WEARABLE DEVICE AND METHOD OF CONTROLLING THE SAME - Disclosed are a wearable device and method of controlling the same, which output more suitable image information on the basis of a state in which the wearable device is covered. The wearable device includes a wearable device body, a display unit provided in the wearable device body, a sensing unit included in the wearable device body, and configured to sense a region, in which the display unit is covered by an object, and a region in which the display unit is not covered, and a control unit configured to display first image information in a predetermined first region when the region in which the display unit is not covered corresponds to the predetermined first region, and display second image information in a predetermined second region when the region in which the display unit is not covered corresponds to the predetermined second region, as a result in which a portion of the display unit is covered by the object. | 12-03-2015 |
20150352438 | INFORMATION PROCESSING SYSTEM, NON-TRANSITORY COMPUTER-READABLE STORAGE MEDIUM HAVING STORED THEREIN INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING APPARATUS, AND INFORMATION PROCESSING METHOD - An information processing system includes an information processing unit configured to move a virtual camera in a virtual space within a predetermined movable range in response to a movement and/or a posture of a controller. The information processing unit changes a correspondence relation between the movement and/or the posture of the controller and a capturing direction and/or a viewing point position of the virtual camera after the movement of the virtual camera reaches a limit of the movable range due to the movement of the controller in a predetermined direction. | 12-10-2015 |
20150353005 | THIN OVERHEAD CONSOLE - An overhead console for a vehicle having a lighting assembly. The lighting assembly may include multiple light sources configured to emit light toward a show surface of the overhead console. Each light source may include a light-emitting diode. The overhead console may also include a capacitive sensing input device, a sunroof control device, a haptic feedback device, a directional lighting device, and so forth. The overhead console has a thickness of less than or equal to approximately 15 millimeters. | 12-10-2015 |
20150355457 | COMPACT ANCHOR FOR EMS DISPLAY ELEMENTS - This disclosure provides systems, methods and apparatus for shutter-based EMS light modulators controlled by electrode actuators that include a compact anchor. The compact anchor includes four sides. Each of the four sides includes a lower wall, while only three of the sides include a lower shelf, upper wall and eave. That is, a first side includes a lower wall having an upper surface that is substantially the same thickness as the material forming the lower wall. | 12-10-2015 |
20150355709 | WEARABLE DEVICE AND METHOD OF CONTROLLING THEREFOR - A method of controlling a wearable device according to one embodiment of the present specification can include the steps of displaying content on a display unit of the wearable device, sensing a tilt angle of the wearable device and providing a control interface providing control of the content. And, the step of providing the control interface can include the steps of mapping the control interface to the ground based on the sensed tilt angle and a state of the wearable device and displaying the mapped control interface on the display unit. | 12-10-2015 |
20150355717 | SWITCHING INPUT RAILS WITHOUT A RELEASE COMMAND IN A NATURAL USER INTERFACE - User input in the form of image data is received from a user via a natural user interface. A vector difference between an adjustment start position and a current position of the user input is calculated. The vector difference includes a vector position and a vector length. The vector position is compared to stored rail data, and the vector length is compared to a stored threshold length. The rail data describes a plurality of virtual rails associated with an application. Based on the comparisons, the user input is matched to one of the plurality of virtual rails and a notification describing the matching is provided to the application. The application, thereupon, transitions from a first command to a second command corresponding to the matching virtual rail without receiving any explicit termination gesture for the first command from the user. | 12-10-2015 |
20150355718 | PREEMPTIVE MACHINE LEARNING-BASED GESTURE RECOGNITION - A system and method for detecting a viewing gesture with respect to a wrist-worn device employ a logistic-regression model to pre-learn gesture metrics for on events. An output model is produced for deployment on a consumer device, allowing real-time gesture detection with high accuracy and low latency. | 12-10-2015 |
20150355719 | METHOD AND SYSTEM ENABLING CONTROL OF DIFFERENT DIGITAL DEVICES USING GESTURE OR MOTION CONTROL - A system for controlling operation of digital devices according to gestures of its user comprising: at least one gesture indicating hardware; at least one imaging device for capturing at least one first array of images of the at least one gesture indicating hardware; and a processor for processing the captured stream of images for; identifying at least one second array of images of the at least one gesture indicating hardware from the at least one first array of images captured by the at least one imaging device; determining at least one geometrical characteristic of bodies present in the at least one second array of images and their variation to construct at least one motion path of the at least one gesture indicating hardware; determining at least one key coordinate point representing at least one motion path; and generating at least one operating instruction to control at least one operation of at least one digital device. Noise cancellation during gesture detection is carried out by evaluating errors as a function of relative distances, boundary distances, and characteristics of various bodies identified beyond a threshold image size. Another approach to removing noise in gesture detection is to fix the emitting device in a still environment, identify still bodies other than the emitter, and thereby remove those still objects while capturing the emitting device in motion. | 12-10-2015 |
20150355720 | USER INTERACTION WITH WEARABLE DEVICES - Particular embodiments described herein provide for an electronic device that can be configured to determine that an unobtrusive gesture has been received on a first electronic device and send a signal to a second electronic device in response to the unobtrusive gesture. The first electronic device can also be configured to receive a signal from the second electronic device, determine an unobtrusive output in response to the signal, and generate an unobtrusive notification in response to the received signal. In an example, the first electronic device is a part of jewelry worn by a user. | 12-10-2015 |
20150355736 | A CONTROL INPUT SYSTEM - A control input system comprising: an input unit comprising, a ferromagnetic material medium, a non-invasive detachable attachment means for attaching the ferromagnetic medium to the hand of the user; a control unit comprising, a control input area defining the input interface for interaction with the ferromagnetic medium, a sensor unit capable of registering the position of the ferromagnetic medium of the input unit relative to the control unit and providing at least one sensor output signal, a signal processor for converting the at least one sensor output signal to control output signals representing the position of the ferromagnetic material relative to the control input, a signal transmitter for transmitting processed signals to an external device, and a housing for enclosing the sensor unit, the signal processor and/or the signal transmitter. | 12-10-2015 |
20150360711 | Steering Wheel with Data Transmission via a Finger Navigation Module - A steering wheel for a motor vehicle is disclosed. The steering wheel includes an optical finger navigation module, at least one further operating element, a steering wheel electronic system and a databus for data transmission between the optical finger navigation module and the steering wheel electronic system. The at least one further operating element is directly connected to the optical finger navigation module via a signal line. The optical finger navigation module is designed in such a way as to receive a signal from the at least one further operating element and relay it on to the steering wheel electronic system via the databus. In this way, no dedicated lines from the operating element to the steering wheel electronic system are necessary. Thus, the construction space for the installation of a device consisting of an optical finger navigation module and a further operating element in a steering wheel is reduced. | 12-17-2015 |
20150362855 | ORGANIC LIGHT EMITTING DEVICE - Provided is an organic light emitting device, including: a lower electrode; an upper electrode and a substrate having an electrode pad portion, in which when an angle formed by a slant in section of a film end in a region between the lower electrode and the electrode pad portion and a surface of the substrate is given as θ | 12-17-2015 |
20150362951 | THEMATIC COORDINATION OF AN INFORMATION DEVICE AND A WEARABLE BAND - Embodiments relate generally to a portable information device and associated wearable band and, more particularly, to thematic coordination of a portable information device and associated wearable band. One example embodiment includes a portable electronic information device comprising a display and a communication interface configured to receive identification from an attached wearable band—the received identification is used to determine one or more thematic elements intended to correspond with at least one visual attribute of a surface of the wearable band, and one or more of these thematic elements are presented on the information device display. Another example embodiment includes a wearable band comprising an interface configured for electrically coupling to a portable electronic information device—communication over the interface is used by the information device, the wearable band, or both to instantiate thematic coordination between the wearable band and the information device. Additionally, a further example embodiment includes a method for a portable electronic information device to obtain and utilize, from a remote server, information associated with one or more thematic elements intended to correspond with a coupled wearable band. | 12-17-2015 |
20150362988 | SYSTEMS AND METHODS FOR USER INDICATION RECOGNITION - Various methods and apparatuses are provided to determine an occurrence of a first user indication and determine a plurality of Points of Interest (POI) corresponding to the first user indication, and to determine an occurrence of a second user indication and responsively determine a narrowed POI from the plurality of POIs. An action corresponding to the narrowed POI and the first and/or second user indication is determined and effected. | 12-17-2015 |
20150362989 | DYNAMIC TEMPLATE SELECTION FOR OBJECT DETECTION AND TRACKING - Object tracking, such as may involve face tracking, can utilize different detection templates that can be trained using different data. A computing device can determine state information, such as the orientation of the device, an active illumination, or an active camera to select an appropriate template for detecting an object, such as a face, in a captured image. Information about the object, such as the age range or gender of a person, can also be used, if available, to select an appropriate template. In some embodiments instances of templates can be used to process various orientations, while in other embodiments specific orientations, such as upside down orientations, may not be processed for reasons such as rate of inaccuracies or infrequency of use for the corresponding additional resource overhead. | 12-17-2015 |
20150362990 | DISPLAYING A USER INPUT MODALITY - An aspect provides a method, including: receiving, from a sensor, gaze tracking data associated with a user's eye; determining, using a processor, a location, on an information handling device, associated with the user's gaze based upon the gaze tracking data; identifying, using a processor, content associated with the location determined; and displaying, using a display, a user gaze based input modality based upon the content identified. Other aspects are described and claimed. | 12-17-2015 |
20150362993 | Systems and Methods for Foley-Style Haptic Content Creation - One illustrative system disclosed herein includes an audio input device configured to detect an audio input and transmit an audio signal associated with the audio input. The illustrative system also includes a haptic trigger device which includes a sensor, wherein the sensor is configured to transmit a sensor signal associated with an event. Further, the illustrative system includes a processor in communication with the audio input device and the haptic trigger device, the processor configured to: receive the audio signal; receive the sensor signal substantially simultaneously to receiving the audio signal; record the audio signal to an audio track; and insert a haptic effect marker into a haptic track based in part on the sensor signal, wherein the haptic track and the audio track are associated with a video. | 12-17-2015 |
20150362994 | MOBILE DEVICE WITH MOTION CONTROLLING HAPTICS - A haptically enabled device includes a haptic output device used to control motion. The haptically enabled device determines a desired motion, and then generates a haptic effect on the haptic output device to cause the desired motion. | 12-17-2015 |
20150363000 | METHOD AND APPARATUS FOR CONTROLLING A SYSTEM VIA A SENSOR - A saturation profile is defined in a processor as saturation across at least a substantial portion of a sensor's field of view. A saturation response is defined in the processor, including an executable instruction. An input is sensed with a sensor and communicated to the processor, and compared to the saturation profile. If the input satisfies the saturation profile, the saturation response is executed. Saturation may include maximum sensor values, minimum sensor values, invalid sensor values, error sensor values, uniform sensor values, and/or other uninformative or “null data” input, as valid input for system commands, etc. | 12-17-2015 |
20150363003 | SCALABLE INPUT FROM TRACKED OBJECT - A computing device ( | 12-17-2015 |
20150363004 | IMPROVED TRACKING OF AN OBJECT FOR CONTROLLING A TOUCHLESS USER INTERFACE - A computing device ( | 12-17-2015 |
20150363045 | VIRTUAL INPUT SYSTEM - For a user having a user input actuator, a virtual interface device, such as for a gaming machine, for determining actuation of a virtual input by the input actuator is disclosed. The device comprises a position sensing device for determining a location of the user input actuator and a controller coupled to the position sensing device, the controller determining whether a portion of the user input actuator is within a virtual input location in space defining the virtual input. | 12-17-2015 |
20150363069 | DISPLAY CONTROL - A display control unit ( | 12-17-2015 |
20150364109 | ARCHITECTURES FOR INPUT TRACKING - A tracking architecture is provided that enables data for gestures and head positions to be provided to both native and non-native clients on a computing device. A pipeline component can obtain the raw image data and sensor data and synchronize that data to be processed to determine, for example, location and/or motion data that may correspond to device input. The data can be processed by separate components, such as an event publisher and an event provider, each capable of filtering the location, motion, and/or raw sensor data to generate a set of event data. The event data then can be published to registered listeners or provided in response to polling requests. Head coordinates, gesture data, and other such information can be passed through one or more interface layers enabling the data to be processed by a non-native client on the device. | 12-17-2015 |
20150364113 | METHOD FOR CONTROLLING FUNCTION AND ELECTRONIC DEVICE THEREOF - A method for operating an electronic device is provided. The method includes connecting to a Head Mounted Device (HMD), receiving an input through the HMD while the HMD is connected, and in response to the received input, performing a function corresponding to the received input. | 12-17-2015 |
20150366383 | RECEPTACLE WITH A DISPLAY - A coffee mug comprises an energy harvesting element, such as a peltier element, for harvesting electric energy from the temperature difference between said liquid and an ambient of the receptacle. In addition the mug comprises a data processing device, a display device and a data communication device for receiving data wirelessly from an outer application system. The energy harvesting element is configured to supply electric energy to said data communication device, data processing device and display device. The data processing device is then configured to receive said data via said data communication device and control said display device to display said received data. | 12-24-2015 |
20150366479 | BIOELECTRODE, AND METHOD AND APPARATUS FOR PROCESSING BIOSIGNAL USING THE SAME - A bioelectrode including a plate, a first electrode disposed on a first side of the plate, and a second electrode disposed on the first side of the plate and separate from the first electrode. The bioelectrode further includes a first guard portion disposed on a second side of the plate, a second guard portion disposed on the second side of the plate and separate from the first guard portion, and a preamplifier configured to output a voltage signal based on a biosignal measured between the first electrode and the second electrode. | 12-24-2015 |
20150370290 | ELECTRONIC APPARATUS, METHOD, AND STORAGE MEDIUM - According to one embodiment, a portable electronic apparatus including a contactless communication module configured to execute contactless communication with an external device is provided. The electronic apparatus includes a first detector, a second detector, and a controller. The first detector is configured to detect contactless communication with the external device. The second detector is configured to detect an orientation of the electronic apparatus. The controller is configured to execute control according to the detected orientation when the contactless communication with the external device is detected. | 12-24-2015 |
20150370319 | APPARATUS AND METHOD FOR CONTROLLING THE APPARATUS BY A USER - An apparatus and a method for controlling the same by a user are suggested. The suggested apparatus comprises: a detecting unit for detecting a first event executed by a user in relation to a plurality of devices including the apparatus; a control unit for generating a second event according to the first event detected by the detecting unit; an output unit for presenting the second event to the user, wherein the control unit generates a command for executing the first event on the apparatus as a function of a response of the user to the second event detected by the detecting unit. The suggested apparatus and method are suitable to provide a second event, before executing a command corresponding to a first event from the user, for the user to confirm the intention of the first event. | 12-24-2015 |
20150370322 | METHOD AND APPARATUS FOR BEZEL MITIGATION WITH HEAD TRACKING - The present disclosure presents methods and apparatuses for operating a multi-display device to mitigate the effects of image interruption due to bezels between individual display devices. For example, a method of operating a video device includes generating a bezel-corrected image which spans a plurality of display devices, the bezel-corrected image including masked image pixels, wherein the masked image pixels are associated with a bezel of at least one of the plurality of display devices. Such example methods may further include detecting a head position change of a user and displaying one or more of the masked image pixels on at least one of the plurality of display devices based on the head position change. | 12-24-2015 |
20150370323 | USER DETECTION BY A COMPUTING DEVICE - In some embodiments, an electronic device optionally identifies a person's face, and optionally performs an action in accordance with the identification. In some embodiments, an electronic device optionally determines a gaze location in a user interface, and optionally performs an action in accordance with the determination. In some embodiments, an electronic device optionally designates a user as being present at a sound-playback device in accordance with a determination that sound-detection criteria and verification criteria have been satisfied. In some embodiments, an electronic device optionally determines whether a person is further or closer than a threshold distance from a display device, and optionally provides a first or second user interface for display on the display device in accordance with the determination. In some embodiments, an electronic device optionally modifies the playing of media content in accordance with a determination that one or more presence criteria are not satisfied. | 12-24-2015 |
20150370324 | ADJUSTING CONTENT DISPLAY ORIENTATION ON A SCREEN BASED ON USER ORIENTATION - A system for adjusting content display orientation on a screen is disclosed. The system may include a processor that may detect both eyes and a body part of a user that is proximal to one or more of the user's eyes. The system may then determine an eye gaze plane based on the positions of the first and second eyes of the user. The eye gaze plane may be determined by identifying a first line of sight extending from the first eye and a second line of sight extending from the second eye. Additionally, the eye gaze plane may bisect a center of the first eye and a center of the second eye of the user. Once the eye gaze plane is determined, the system may adjust the orientation of content displayed on a display device based on the eye gaze plane and on the position of the body part. | 12-24-2015 |
20150370325 | Context-Aware Self-Calibration - A method for context-aware self-calibration includes measuring, for a plurality of time segments, at least one feature of at least one biosignal for each of at least one channel. Each biosignal is created in response to a user imagining an intended direction for each time segment. An object is moved along an actual decoded direction determined by an output of a decoder configured to correlate, for each time segment, the at least one feature to the intended direction. The decoder self-calibrates to minimize, for each time segment, an error between the actual decoded direction and the intended direction inferred subsequent to the respective time segment. | 12-24-2015 |
20150370326 | SYSTEMS, ARTICLES, AND METHODS FOR WEARABLE HUMAN-ELECTRONICS INTERFACE DEVICES - Systems, articles, and methods for wearable human-electronics interfaces are described. A wearable human-electronics interface device includes a band that in use is worn on an appendage (e.g., a wrist, arm, finger, or thumb) of a user. The band carries multiple sensors that are responsive to vibrations. The sensors are physically spaced apart from one another on or within the band. The band also carries an on-board processor. The sensors detect vibrations at the appendage of the user when the user performs different finger tapping gestures (i.e., tapping gestures involving different individual fingers or different combinations of fingers) and provide corresponding vibration signals to the processor. The processor classifies the finger tapping gesture(s) based on the vibration signals and an on-board transmitter sends a corresponding signal to control, operate, or interact with a receiving electronic device. The sensors include inertial sensors, digital MEMS microphones, or a combination thereof. | 12-24-2015 |
20150370327 | VIRTUAL INPUT DEVICE AND VIRTUAL INPUT METHOD - A virtual input device comprises: a signal collection unit including a bioelectrical sensor for collecting bioelectrical signals and an acceleration sensor for collecting acceleration signals; a signal preprocessing unit for performing preprocessing for the bioelectrical signals and the acceleration signals collected by the signal collection unit; a signal segmentation unit for performing segmentation processing for the preprocessed bioelectrical signals and acceleration signals so as to obtain a plurality of gesture segments; a feature extracting unit for extracting feature values from the bioelectrical signals and the acceleration signals for respective gesture segments; a feature combination unit for combining feature values extracted by the feature extracting unit to form a combined feature vector; a gesture recognition unit for performing gesture recognition based on the combined feature vector; and a character mapping unit for obtaining characters corresponding to the recognized gesture according to a predetermined mapping relationship between characters and gestures. | 12-24-2015 |
20150370328 | A DISPLAY APPARATUS PROVIDING TACTILE FUNCTIONALITY - An apparatus comprising: at least two actuators configured to provide a force to move a display assembly component at least at two separate locations of the display assembly component such that at least one of the at least two separate locations of the display assembly component has a displacement based on at least one actuation input; at least one sensor configured to determine the displacement of the display assembly component, wherein the sensor is configured to provide a feedback signal; and a control unit configured to control at least one of the at least two actuators based on the at least one actuation input to the control unit and the feedback signal. | 12-24-2015 |
20150370331 | SURFACING RELATED CONTENT BASED ON USER INTERACTION WITH CURRENTLY PRESENTED CONTENT - A method and system are provided for presenting, at an electronic device with one or more processors, a media content item on an electronic display. The electronic device detects a user action with a respective portion of the media content item, wherein the user action does not include explicit selection of the respective portion of the media content item. In response to detection of the user action, the electronic device identifies additional content to present based on the content included in the respective portion of the media content item. The electronic device then simultaneously presents the additional content and the media content item on the electronic display. | 12-24-2015 |
20150370333 | SYSTEMS, DEVICES, AND METHODS FOR GESTURE IDENTIFICATION - Systems, devices, and methods adapt established concepts from natural language processing for use in gesture identification algorithms. A gesture identification system includes sensors, a processor, and a non-transitory processor-readable memory that stores data and/or instructions for performing gesture identification. A gesture identification system may include a wearable gesture identification device. The gesture identification process involves segmenting signals from the sensors into data windows, assigning a respective “window class” to each data window, and identifying a user-performed gesture based on the corresponding sequence of window classes. Each window class exclusively characterizes at least one data window property and is analogous to a “letter” of an alphabet. Under this model, each gesture is analogous to a “word” made up of a particular combination of window classes. | 12-24-2015 |
20150370334 | DEVICE AND METHOD OF CONTROLLING DEVICE - Provided are a device and a method of controlling the device. The method includes obtaining information regarding type of a hovering input unit configured to transmit a hovering input to the device; and displaying a user interface corresponding to the obtained information regarding type of the hovering input unit from among a plurality of user interfaces related to an application executed on the device. | 12-24-2015 |
20150370336 | Device Interaction with Spatially Aware Gestures - Described is a system and technique for providing the ability for a user to interact with one or more devices by performing gestures that mimic real-world physical analogies. More specifically, the techniques described herein limit conscious gesturing for a computer component by camouflaging computer-recognizable gestures within manipulations of physical objects. | 12-24-2015 |
20150370337 | APPARATUS AND METHOD FOR CONTROLLING INTERFACE - In an apparatus and method for controlling an interface, a user interface (UI) may be controlled using information on a hand motion and a gaze of a user without separate tools such as a mouse and a keyboard. That is, the UI control method provides more intuitive, immersive, and unified control of the UI. Since a region of interest (ROI) sensing the hand motion of the user is calculated using a UI object that is controlled based on the hand motion within the ROI, the user may control the UI object in the same manner and with the same feel regardless of the distance from the user to a sensor. In addition, since positions and directions of view points are adjusted based on a position and direction of the gaze, a binocular 2D/3D image based on motion parallax may be provided. | 12-24-2015 |
20150370341 | Electronic Apparatus And Display Control Method Thereof - An electronic apparatus and a display control method are described. The display control method includes determining whether an external input device is connected with the electronic apparatus; controlling the display component to display the graphic user interactive interface with a first display mode in a state where the external input device is not connected with the electronic apparatus; and controlling the display component to display the graphic user interactive interface with a second display mode in a state where the external input device is connected with the electronic apparatus, wherein the layout of operable objects in the graphic user interactive interface displayed with the first display mode and the layout of operable objects in the graphic user interactive interface displayed with the second display mode are different. | 12-24-2015 |
20150370346 | MAGNETIC CONTROLLER FOR DEVICE CONTROL - Systems, methods and apparatus for using a magnetic controller to control a device. In one aspect, a system includes a magnetic controller external to a device, the magnetic controller including: a magnetic device for altering a surrounding magnetic field of a device; one or more input actuators, each operatively coupled to the magnetic device and that when actuated cause the magnetic device to alter the surrounding magnetic field according to a predefined change associated with the input actuator; and a model executable by the device and that models as device inputs the differences in the surrounding magnetic field of the device caused by the actuation of the one or more input actuators. | 12-24-2015 |
20150370529 | USER INTERFACE FOR MANIPULATING USER INTERFACE OBJECTS WITH MAGNETIC PROPERTIES - The present disclosure relates to user interfaces for manipulating user interface objects. A device, including a display and a rotatable input mechanism, is described in relation to manipulating user interface objects. In some examples, the manipulation of the object is a scroll, zoom, or rotate of the object. In other examples, objects are selected in accordance with simulated magnetic properties. | 12-24-2015 |
20150375399 | USER INTERFACE FOR MEDICAL ROBOTICS SYSTEM - An exemplary illustration of a user interface for a medical robotics system may include multiple light sources configured to illuminate a gesture within a field of view. The user interface may further include multiple cameras, which have a field of view and are configured to generate a detection signal in response to detecting the gesture within the field of view. The user interface can also have a controller configured to generate a command signal based on the detection signal. The command signal may be configured to actuate an instrument driver, a display device, a C-arm, or any combination thereof to perform a function mapped to the gesture. | 12-31-2015 |
20150378430 | METHOD AND APPARATUS FOR CONTROLLING PRESENTATION OF MEDIA CONTENT - A method that incorporates teachings of the subject disclosure may include, for example, determining a viewing orientation of a first viewer in a viewing area from a plurality of images captured from the viewing area during a presentation of media content, determining whether an attentiveness level of the first viewer during the presentation of the media content is below a threshold by correlating the viewing orientation with the presentation of the media content, and performing an operation associated with the presentation of the media content at a display, where the operation is selected from a first viewer profile associated with the first viewer according to the viewing orientation responsive to determining that the attentiveness level is below the threshold. Other embodiments are disclosed. | 12-31-2015 |
20150378433 | DETECTING A PRIMARY USER OF A DEVICE - A device may identify a user among multiple individuals detected by comparing a physiological condition, such as a heart rate, of the multiple detected individuals as detected by a camera of the device with a physiological condition of a user of the device using different sensors. A heart rate may be detected by a camera by monitoring blood flow to an individual's face. A heart rate may be detected by a motion sensor by monitoring vibrations of the device that are in an expected frequency range. If a heart rate of a user matches a heart rate of an individual seen by a camera, that individual may be determined to be the user of the device. The position of the individual may be used to then render a user interface. | 12-31-2015 |
20150378434 | MULTIMODAL HAPTIC EFFECT SYSTEM - A device to output a haptic effect includes a haptic effect generator comprising one or more microdroplets of a fluid configured to output a haptic effect, and a substrate configured to control movement of the one or more microdroplets of fluid. The device further includes an actuator coupled to the haptic effect generator configured to exert one or more forces on the substrate to cause the one or more microdroplets of fluid to output the haptic effect. | 12-31-2015 |
20150378439 | OCULAR FOCUS SHARING FOR DIGITAL CONTENT - A position within displayed digital content that a user is ocularly focused on (e.g., where within displayed content the user is looking) may be determined. Digital content comprising a visual indication of the position may be rendered. The visual indication of the position may be displayed on the same display that the user is looking at and/or a different display. In some embodiments, the position may be determined based on data generated by a sensor physically attached to the user. Additionally or alternatively, the position may be determined based on data generated by a stationary computing device comprising a sensor configured to track changes in ocular position of the user. In some embodiments the digital content may comprise digital images and/or video (e.g., broadcast content, on-demand content, images and/or video associated with a computer application, or the like). | 12-31-2015 |
20150378440 | Dynamically Directing Interpretation of Input Data Based on Contextual Information - Technologies are described herein for dynamically directing an interpretation of input data based on contextual information associated with a virtual environment. According to one aspect of the disclosure, a computing device and a camera operate in concert to capture and interpret gestures of a human target to control a virtual skeleton, which may be visually represented as an avatar. Embodiments disclosed herein utilize filtering parameters in the interpretation of input data representing a state of the human target to generate output data that is used to direct the virtual skeleton and/or the avatar. The filtering parameters may be dynamically adjusted during runtime based on contextual information and other factors to dynamically change the way input data is interpreted. Dynamic adjustment of the filtering parameters during runtime may allow for an interpretation of input data that is more accurately aligned with a scenario presented in the virtual environment. | 12-31-2015 |
20150378441 | INPUT APPARATUS - An input apparatus includes: a light source that emits detection light toward a first direction, and raster scans a detection interface defined in space, with the detection light; a first light sensor that is disposed closer to the light source than the detection interface, and detects first reflected light which is the detection light that has been reflected; a second light sensor that detects second reflected light which is the detection light that has been reflected off an instructing body that has entered a detection region extending from the detection interface toward the light source; and a control unit that detects a coordinate value of the instructing body using received-light data obtained by the first light sensor and the second light sensor receiving light at the same timing. | 12-31-2015 |
20150378444 | HAND POINTING ESTIMATION FOR HUMAN COMPUTER INTERACTION - Hand pointing has been an intuitive gesture for human interaction with computers. A hand pointing estimation system is provided, based on two regular cameras, which includes hand region detection, hand finger estimation, two views' feature detection, and 3D pointing direction estimation. The technique may employ a polar coordinate system to represent the hand region, and tests show a good result in terms of the robustness to hand orientation variation. To estimate the pointing direction, Active Appearance Models are employed to detect and track, e.g., 14 feature points along the hand contour from a top view and a side view. Combining two views of the hand features, the 3D pointing direction is estimated. | 12-31-2015 |
20150378446 | METHOD FOR SELECTING AN ENTRY FOR AN APPLICATION USING A GRAPHICAL USER INTERFACE - A method for executing an application through a user interface displayed by an electronic device includes the steps of displaying on the user interface at least one application icon, an application icon being associated with an application; displaying on the user interface at least one input icon, an input icon being associated with an input mode for one or more applications, one of the at least one application icon or the at least one input icon being part, on the user interface, of a list of icons movable along a first direction of the user interface; capturing one or more user inputs along the first direction to move the list of icons in the first direction; detecting a user input linking an input icon with an application icon to select respectively the associated input mode and application; and executing the selected application using the selected input mode. | 12-31-2015 |
20150378452 | ELECTRONIC DEVICE, CONTROL METHOD, AND STORAGE MEDIUM STORING CONTROL PROGRAM - According to an aspect, an electronic device includes a first surface, a second surface, a display unit arranged on the first surface, a notification unit arranged on the second surface, an attitude detecting unit, and a control unit. The attitude detecting unit detects whether the attitude of the electronic device is a first attitude, in which the first surface faces upward in the vertical direction, or a second attitude, in which the first surface faces downward in the vertical direction. When the attitude detecting unit detects the second attitude, the control unit disables displaying on the display unit and enables the notification unit to give a notification. | 12-31-2015 |
20150378547 | ENHANCED EXPERIENCE - Providing an enhanced experience is described. Information obtained based on a location of a controller device on a controller surface is used to provide enhancing information. | 12-31-2015 |
20150378662 | Display Switching Method, Information Processing Method And Electronic Device - A wearable electronic device and a display switching method thereof are provided. The method includes: turning on a first display apparatus of the wearable electronic device to place the first display apparatus in a working state; rendering a first image in a first display region of the first display apparatus with a first display effect by the first display apparatus; obtaining first parameter information related to an input operation, by a sensing apparatus of the wearable electronic device; judging whether a second display apparatus of the wearable electronic device is to be turned on or not, according to the first parameter information; and rendering a second image in a second display region of the second display apparatus with a second display effect by the second display apparatus, when the second display apparatus is to be turned on to place the second display apparatus in the working state. | 12-31-2015 |
20160004079 | CONTROL DEVICE, HEAD-MOUNT DISPLAY DEVICE, PROGRAM, AND CONTROL METHOD FOR DETECTING HEAD MOTION OF A USER - It is possible to provide a technique for accurately performing an operation desired by a user. A user's head operation is identified according to information detected by a head motion detection unit. A process desired by the user is executed according to an angular velocity of the head motion. Moreover, the technique uses a control unit which can accurately execute an operation by the user's head operation without reflecting the return motion of the user's head in the process. The control unit executes a process for a start and an end of each process corresponding to the detected angular velocity according to a predetermined threshold value. | 01-07-2016 |
20160004303 | EYE GAZE TRACKING SYSTEM AND METHOD - An eye gaze tracking system is disclosed. The system includes a gaze data acquisition system including a plurality of light sources and a plurality of image sensors. The plurality of light sources are arranged to emit light to a head of the user, and the plurality of image sensors are configured to receive the light. In an embodiment, the system further includes a gaze tracking module including an ocular feature extraction module, a point of regard (POR) calculation module and a POR averaging module. The ocular feature extraction module is configured to process the gaze data and to extract ocular features, and is configured to determine a confidence value associated with an accuracy of the extracted features. The POR calculation module is configured to determine a POR from the ocular features. The POR averaging module is configured to determine an average POR. | 01-07-2016 |
20160004310 | FORCE FEEDBACK SYSTEM INCLUDING MULTI-TASKING GRAPHICAL HOST ENVIRONMENT - A force feedback system provides components for use in a force feedback system including a host computer and a force feedback interface device. An architecture for a host computer allows multi-tasking application programs to interface with the force feedback device without conflicts. One embodiment of a force feedback device provides both relative position reporting and absolute position reporting to allow great flexibility. A different device embodiment provides relative position reporting, allowing maximum compatibility with existing software. Information such as ballistic parameters and screen size sent from the host to the force feedback device allows accurate mouse positions and graphical object positions to be determined in the force feedback environment. Force feedback effects and structures are further described, such as events and enclosures. | 01-07-2016 |
20160004317 | CONTROL METHOD AND ELECTRONIC DEVICE - A control method and an electronic device are provided in this application. The control method includes: determining a first operating area where an operator performs a gesture operation according to sensing information obtained by the at least two sensing units, where sensing information from various spatial areas is obtained by the at least two sensing units, and the first operating area includes at least one spatial area of the various spatial areas; and generating and executing a first control instruction corresponding to the first operating area based on at least a configuration relationship between an operating area and an instruction, where the first control instruction is different from a second control instruction which is generated when the operator performs a gesture operation in a second operating area. | 01-07-2016 |
20160004318 | SYSTEM AND METHOD OF TOUCH-FREE OPERATION OF A PICTURE ARCHIVING AND COMMUNICATION SYSTEM - A method of controlling a PACS image viewer using a system comprising one or more sensors configured to interpret muscle electrical activity as hand gestures is described. The method comprises accepting user input from the one or more sensors comprising hand motion, vector, and gesture information, sending such information to the processor according to a frame rate, and translating such information into a virtual input by the processor according to a set of computer-executable instructions, wherein the virtual input is configured to control a PACS image viewer. The virtual input simulates one or more key strokes, mouse clicks, or cursor movements and allows a physician to use a PACS image viewer without using any hand-operated equipment such as a mouse, trackpad, or keyboard. In this way, the physician may scroll through images using the PACS image viewer while maintaining a sterile environment. | 01-07-2016 |
20160004319 | APPARATUS AND METHOD FOR RECOGNIZING A MOVING DIRECTION OF GESTURE - An apparatus and method to recognize a moving direction of a gesture are provided. A final moving direction of the gesture is determined using the number of intersecting points based on output values from one or more sensors disposed in up, down, left, and right directions and a code of an accumulated sum. Thus, a moving direction of a target (hand) that moves over the sensors is recognized more accurately. The apparatus to recognize a moving direction of a gesture includes first to fourth sensors disposed at positions that are north, south, west, and east of a center; and a controller configured to identify a number of intersecting points based on output values of the first sensor, the second sensor, the third sensor and the fourth sensor and to estimate a moving direction of the gesture according to the number of the intersecting points. | 01-07-2016 |
20160004321 | INFORMATION PROCESSING DEVICE, GESTURE DETECTION METHOD, AND GESTURE DETECTION PROGRAM - It is possible to operate a device while reducing erroneous recognition by the device even when a line of sight does not face the device in an accurate direction. An information processing device that detects a motion of a user includes a detecting unit that detects the motion of the user and a face direction of the user from an image photographed by an imaging unit, a detected motion determining unit that determines whether or not the motion detected by the detecting unit is an operation on the information processing device based on the face direction of the user and position information of the information processing device stored in a storage unit, and a display content control unit that causes a display unit to reflect the motion when the detected motion determining unit determines that the motion detected by the detecting unit is the operation on the information processing device. | 01-07-2016 |
20160004323 | METHODS, SYSTEMS, AND APPARATUSES TO UPDATE SCREEN CONTENT RESPONSIVE TO USER GESTURES - In one embodiment, an electronic device to be worn on a user's forearm includes a display and a set of one or more sensors that provide sensor data. In one aspect, a device may detect, using sensor data obtained from a set of sensors, that a first activity state of a user is active. The device may determine, while the first activity state is active, that the sensor data matches a watch check rule associated with the first activity state. Responsive to the detected match, the device may cause a change in visibility of the display. | 01-07-2016 |
20160004386 | GESTURE RECOGNITION DEVICE AND GESTURE RECOGNITION METHOD - A gesture recognition device includes a processor; and a memory which stores a plurality of instructions, which when executed by the processor, cause the processor to execute: acquiring, on a basis of an image of an irradiation region irradiated with projector light, the image being picked up by an image pickup device, first color information representative of color information of a hand region when the projector light is not irradiated on the hand region and second color information representative of color information of the hand region when the projector light is irradiated on the hand region; and extracting, from the image picked up by the image pickup device, a portion of the hand region at which the hand region does not overlap with a touch region irradiated with the projector light on a basis of the first color information and extracting a portion of the hand region. | 01-07-2016 |
20160004403 | APPARATUS AND METHOD FOR PROCESSING SCROLL INPUT IN ELECTRONIC DEVICE - A terminal device and method are disclosed herein. The terminal device includes an input unit for detecting a scroll input, and a controller for executing the method, which includes analyzing the scroll input to detect a scroll step indicating an amount to be scrolled and a direction of movement indicating a scroll direction, and controlling a display of the terminal device to display scrolling of data to a portion of the data corresponding to the scroll step and the scroll direction. | 01-07-2016 |
20160009411 | VISUAL SEARCH ASSISTANCE FOR AN OCCUPANT OF A VEHICLE | 01-14-2016 |
20160011411 | ANTI-PEEPING DEVICE AND CONTROL METHOD | 01-14-2016 |
20160011654 | INTERFACING APPARATUS AND USER INPUT PROCESSING METHOD | 01-14-2016 |
20160011656 | COMPUTING DEVICE WITH FORCE-TRIGGERED NON-VISUAL RESPONSES | 01-14-2016 |
20160011657 | System and Method for Display Enhancement | 01-14-2016 |
20160011658 | SYSTEMS AND METHODS OF EYE TRACKING CALIBRATION | 01-14-2016 |
20160011662 | ELECTRONIC DEVICE, METHOD, AND COMPUTER PROGRAM PRODUCT | 01-14-2016 |
20160011664 | HAPTIC NOTIFICATIONS UTILIZING HAPTIC INPUT DEVICES | 01-14-2016 |
20160011665 | OPERATIONAL FEEDBACK WITH 3D COMMANDS | 01-14-2016 |
20160011668 | 3D GESTURE RECOGNITION | 01-14-2016 |
20160011669 | GESTURE RECOGNITION SYSTEMS AND DEVICES | 01-14-2016 |
20160011673 | WIRELESS POSITIONING APPROACH USING TIME-DELAY OF SIGNALS WITH A KNOWN TRANSMISSION PATTERN | 01-14-2016 |
20160011677 | ANGLE-BASED ITEM DETERMINATION METHODS AND SYSTEMS | 01-14-2016 |
20160011776 | SYSTEM AND METHOD FOR CONTROLLING DIVIDED SCREENS INDEPENDENTLY THROUGH MAPPING WITH INPUT DEVICES | 01-14-2016 |
20160011840 | VIDEO IMMERSION INDUCING APPARATUS AND VIDEO IMMERSION INDUCING METHOD USING THE APPARATUS | 01-14-2016 |
20160013396 | VIBRATION DETECTING APPARATUS AND MOBILE DEVICE INCLUDING THE SAME | 01-14-2016 |
20160018639 | HEADS-UP DISPLAY WITH INTEGRATED DISPLAY AND IMAGING SYSTEM - Embodiments of an apparatus comprising a light guide including a proximal end, a distal end, a display positioned near the proximal end, an ocular measurement camera positioned at or near the proximal end to image ocular measurement radiation, a proximal optical element positioned in the light guide near the proximal end and a distal optical element positioned in the light guide near the distal end. The proximal optical element is optically coupled to the display, the ocular measurement camera and the distal optical element, and the distal optical element is optically coupled to the proximal optical element, an ambient input region and an input/output optical element. Other embodiments are disclosed and claimed. | 01-21-2016 |
20160018885 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM - An information processing apparatus includes circuitry configured to provide a user interface to a user, determine whether a predetermined object is present in an input recognition area of a sensor, and determine a region of interest within the user interface. The circuitry is further configured to determine, while the predetermined object is determined to be present in the input recognition area, whether the region of interest within the user interface changes. The circuitry is further configured to perform an operation based on whether the region of interest is determined, while the predetermined object is determined to be present in the input recognition area, to change. | 01-21-2016 |
20160018888 | INDICATION OF EYE TRACKING INFORMATION DURING REAL-TIME COMMUNICATIONS - Embodiments disclosed herein provide methods, systems, and computer readable storage media for indicating eye tracking information during a real-time communication session. In a particular embodiment, a method provides receiving first eye tracking information captured by a first computing system operated by a first user during the communication session with a second computing system operated by a second user, wherein the first eye tracking information represents a first location on a display of the first computing system to where eyes of the first user are directed. The method further provides determining a second location on a display of the second computing system that corresponds to the first location and instructing the display of the second computing system to display a first indication of the second location in real-time with the communication session. | 01-21-2016 |
20160018892 | HAND MOTION-CAPTURING DEVICE WITH FORCE FEEDBACK SYSTEM - This disclosure includes a hand motion-capturing device with a force feedback system. In some embodiments the device includes a base, a microcontroller connected to the base, a thumb sensor module and four-finger sensor modules each electrically connected to the microcontroller. In some embodiments, the device may include five-link rods that interconnect the thumb sensor module to the base and each of the four-finger sensor modules to the base. In some embodiments, the device includes a thumb force feedback system adapted and configured to receive a human thumb and a four-finger force feedback system adapted and configured to receive an index finger, a middle finger, a ring finger, and a little finger. The thumb force feedback system and the four-finger force feedback system may each be movably connected to respective link rods and the thumb sensor module and four-finger sensor modules, respectively. | 01-21-2016 |
20160018896 | MULTI-PROCESS INTERACTIVE SYSTEMS AND METHODS - A multi-process interactive system is described. The system includes numerous processes running on a processing device. The processes include separable program execution contexts of application programs, such that each application program comprises at least one process. The system translates events of each process into data capsules. A data capsule includes an application-independent representation of event data of an event and state information of the process originating the content of the data capsule. The system transfers the data messages into pools or repositories. Each process operates as a recognizing process, where the recognizing process recognizes in the pools data capsules comprising content that corresponds to an interactive function of the recognizing process and/or an identification of the recognizing process. The recognizing process retrieves recognized data capsules from the pools and executes processing appropriate to contents of the recognized data capsules. | 01-21-2016 |
20160018897 | THREE-DIMENSIONAL USER INTERFACE DEVICE AND THREE-DIMENSIONAL OPERATION PROCESSING METHOD - A three-dimensional user interface device includes a coordinate setting unit setting a three-dimensional coordinate space (3DCS) on the basis of a line-of-sight image, a virtual data generation unit generating three-dimensional area data representing a transparent virtual three-dimensional operation area (V3DOA) arranged in an arbitrary position in the 3DCS in a state in which at least a boundary of the area is visible, a display processing unit displaying a V3DOA represented by the generated three-dimensional area data by using a visible space in the 3DCS corresponding to a space shown on the line-of-sight image as a display reference, and an operation detection unit detecting an operation performed by the operator with the specific region in the V3DOA on the basis of the three-dimensional position acquired with respect to the specific region of the operator in the 3DCS and a position of the V3DOA in the 3DCS. | 01-21-2016 |
20160018898 | RAISE GESTURE DETECTION IN A DEVICE WITH PREHEATING OF A PROCESSOR - A wearable computing device can detect device-raising gestures. For example, onboard motion sensors of the device can detect movement of the device in real time and infer information about the spatial orientation of the device. Based on analysis of signals from the motion sensors, the device can detect a raise gesture, which can be a motion pattern consistent with the user moving the device's display into his line of sight. In response to detecting a raise gesture, the device can activate its display and/or other components. Detection of a raise gesture can occur in stages, and activation of different components can occur at different stages. | 01-21-2016 |
20160018900 | WAKING A DEVICE IN RESPONSE TO USER GESTURES - A wearable computing device can detect device-raising gestures. For example, onboard motion sensors of the device can detect movement of the device in real time and infer information about the spatial orientation of the device. Based on analysis of signals from the motion sensors, the device can detect a raise gesture, which can be a motion pattern consistent with the user moving the device's display into his line of sight. In response to detecting a raise gesture, the device can activate its display and/or other components. Detection of a raise gesture can occur in stages, and activation of different components can occur at different stages. | 01-21-2016 |
20160018901 | ENABLING DATA TRACKING WITHOUT REQUIRING DIRECT CONTACT - A system and method for activating and deactivating a device that is capable of performing monitoring functions such as the recording of physical parameters such as heart rate, time exercised, or capable of recording using a video camera, and may be referred to as a Tracker, wherein the system and method enables a user to quickly activate or deactivate a function by simple movement of a signaling device near the Tracker, wherein the Tracker may be toggled on and off whenever the signaling device is moved adjacent to the Tracker, and wherein toggling may be indicated by an audible, visual and/or haptic indicator. | 01-21-2016 |
20160019697 | DEVICE DISPLAY PERSPECTIVE ADJUSTMENT - This invention relates to a system, method and computer program product for displaying a picture on a device display comprising: identifying a person from a new face in a device camera image; identifying a reference eye separation distance of the identified person; calculating a distance and angle of the new face from the device based on the reference eye separation and an image eye separation; applying a perspective transformation on the picture based on the distance and angle of the new face; and displaying the transformed picture on the device display. | 01-21-2016 |
20160025971 | EYELID MOVEMENT AS USER INPUT - Embodiments related to eyelid tracking on a computing device are disclosed. For example, one disclosed embodiment provides a head-mounted computing device comprising an image sensor positioned to acquire an image of an eyelid when worn on a head, a logic system, and a storage system. The storage system comprises instructions stored thereon that are executable by the logic system to capture image data of an eyelid via the image sensor, track a movement of the eyelid via the image data, track an eyelid state based upon the captured image data of the eyelid, and take an action on the computing device based upon the eyelid state. | 01-28-2016 |
20160025981 | SMART PLACEMENT OF VIRTUAL OBJECTS TO STAY IN THE FIELD OF VIEW OF A HEAD MOUNTED DISPLAY - An HMD device is configured to check the placement of newly introduced objects in a virtual reality environment such as interactive elements like menus, widgets, and notifications to confirm that the objects are significantly present within the user's field of view. If the intended original placement would locate the object outside the field of view, the HMD device relocates the object so that a portion of the object is viewable at the edge of the HMD display closest to its original placement. Such smart placement of virtual objects enables the user to readily discover new objects when they are introduced into the virtual reality environment, and then interact with the objects within a range of motions and/or head positions that is comfortable to support a more optimal interaction and user experience. | 01-28-2016 |
20160026214 | WEARABLE FLEXIBLE INTERFACE WITH INTERLOCKING MODULES - A wearable flexible interface with interlocking modules includes a substrate integrated into a wristband, and also each of a processor, memory and power source disposed on the substrate, as well as a data bus. A multiplicity of different modules each are coupled to the data bus, each including firmware and a flexible display. Finally, a display controller is disposed on the substrate and coupled to each of the processor, memory, power source and each of the modules by way of the data bus. The display controller includes program code enabled to selectively direct a display of a display characteristic of a particular one of the modules either in a corresponding flexible display of the particular one of the modules, or in a single composite display formed by aggregating the flexible displays of all of the modules. | 01-28-2016 |
20160026237 | SCREEN CHANGING DEVICE, SCREEN CHANGING METHOD, AND SCREEN CHANGING PROGRAM - A screen changing device includes: an operation log organization unit | 01-28-2016 |
20160026241 | System and Method for Monitoring Habits of User of Electronic Device - The invention includes a method for monitoring habits of a user of an electronic device. The method has steps of providing a presetting module for capturing and storing characteristics of the registered user in a presetting mode. The registered user characteristics include: a) a first horizontal distance, the first horizontal distance being a first reference distance between two facial features of the registered user; b) a first vertical distance, the first vertical distance being a second reference distance between two facial features of the registered user; c) a second horizontal distance, the second horizontal distance being a third reference distance between two non-facial features of the registered user's body below the registered user's head, the two non-facial features capturable by the camera's view during use of the electronic device; and d) a second vertical distance, the second vertical distance being a fourth reference distance between the non-facial features and a facial feature of the registered user. | 01-28-2016 |
20160026243 | 3D-Volume Viewing by Controlling Sight Depth - A medical image data processing method for determining a set of medical image data to be displayed, the data processing method being constituted to be executed by a computer and comprising the following steps: a) acquiring medical image data comprising three-dimensional medical image information describing an anatomical structure, and displaying the medical image information; b) acquiring navigation display feature data comprising navigation display feature information describing at least one graphical navigation feature; c) displaying the navigation display information simultaneously with the medical image information; d) acquiring viewing direction data comprising viewing direction information describing a spatial relationship of a viewing direction of a user relative to the position of the at least one graphical navigation feature; e) determining, based on the viewing direction data, image information subset data comprising image data subset information describing a subset of the medical image information to be selected for display. | 01-28-2016 |
20160026244 | GUI DEVICE - A GUI (Graphical User Interface) device includes a projection unit that projects an image on a plurality of aerial screens overlapped in a predetermined gaze direction, a detection unit that detects a position of an instruction unit in an aerial region, and a selection unit that selects any one of the plurality of aerial screens as an operation object according to a motion of the detected instruction unit. | 01-28-2016 |
20160026248 | Haptic Communication System, Method and Device - A haptic communication system, comprising two haptic input/output devices connected to a computer network, said haptic input/output devices each comprising: at least one actuator in a housing, wherein said actuator comprises a motor, a carrier being moved in said at least two opposite directions by said motor, and a touch member having an outer surface, said touch member being arranged to be moved at predetermined intervals in at least two opposite directions relative to said housing, such that a user can feel said movement by touching said outer surface; wherein computer software is arranged to use relative pressure, relative location and the difference between the current relative location and at least the relative location determined in the previous interval, of two corresponding actuators in said two devices to calculate two next relative locations of outer surfaces, one for each of said actuators, and to communicate said next relative locations to the respective actuators; and wherein said actuators are arranged to move said outer surfaces instantly to the calculated and communicated next relative locations. | 01-28-2016 |
20160026252 | SYSTEM AND METHOD FOR GESTURE RECOGNITION - Various aspects of a system and a method for gesture recognition may comprise a server communicatively coupled to a plurality of devices. The server receives a plurality of gesture profiles associated with each of the plurality of devices. The server determines a base gesture for a pre-defined gesture based on the received plurality of gesture profiles. The base gesture encapsulates a plurality of variations of the pre-defined gesture for a plurality of users associated with the plurality of devices. | 01-28-2016 |
20160026255 | SYSTEMS AND METHODS FOR PROXIMITY SENSOR AND IMAGE SENSOR BASED GESTURE DETECTION - Systems, methods, and non-transitory computer-readable media are disclosed. For example, a dual sensor control device is disclosed that includes at least one processor for receiving information from a proximity sensor and an image sensor. The processor may be configured to receive first data from the proximity sensor while the image sensor is in a first state, determine, using the first data, a presence of an object in proximity to the proximity sensor. The processor may also be configured to output, based on the determined presence of the object in proximity to the proximity sensor, a signal to the image sensor to cause the image sensor to enter a second state, different from the first state. The processor may also be configured to receive second data from the image sensor in the second state, and output at least one of a message and a command associated with the second data. | 01-28-2016 |
20160026263 | PHYSICAL SURFACE INTERACTION - In accordance with an example aspect of the present invention, there is provided an apparatus comprising at least one receiver configured to receive an identifier of an accessory and sensor information, at least one processing core configured to obtain, based at least in part on the sensor information, a first location of the apparatus on a physical surface and a second location in a virtual space, the first location corresponding to the second location via a mapping, and to cause transmission of the identifier of the accessory and information identifying the second location. | 01-28-2016 |
20160026264 | DIRECT THREE-DIMENSIONAL POINTING USING LIGHT TRACKING AND RELATIVE POSITION DETECTION - A computing system for direct three-dimensional pointing includes at least one computing device, and a pointing/input device including at least one light source and a motion sensor module for determining absolute and relative displacement of the pointing/input device. At least one imaging device is configured for capturing a plurality of image frames each including a view of the light source as the pointing/input device is held and/or moved in a three-dimensional space. Two or more imaging devices may be provided. A computer program product calculates at least a position and/or a motion of the light source in three-dimensional space from the plurality of sequential image frames and from the pointing/input device absolute and relative displacement information, and renders on the graphical user interface a visual indicator corresponding to the calculated position and/or the motion of the light source. Methods for direct three-dimensional pointing and command input are also described. | 01-28-2016 |
20160026423 | MOUNTABLE DISPLAY DEVICES - The present disclosure provides display devices and methods. A display device can include a visual curvilinear display mounted on a support member. A user may display or project media through the visual curvilinear display according to a display and/or location preference or schedule of the user. | 01-28-2016 |
20160026426 | IMAGE DISPLAY DEVICE AND METHOD OF CONTROLLING THE SAME - Disclosed herein are an image display device and a method of controlling the image display device. The image display device includes an image acquirer configured to acquire a user image, an image outputter configured to display the user image, a communicator configured to perform communication with a mobile terminal, and a controller configured to perform a real time image display operation. The real time image display operation includes displaying the user image in real time and transmitting image data processed from the user image to the mobile terminal when a real time image display command is input. | 01-28-2016 |
20160026431 | MOBILE DEVICE AND CONTROLLING METHOD THEREFOR - The present invention relates to a mobile device and a method for controlling the same. In particular, the present invention relates to a mobile device and a method for controlling the same, which provides response data to a voice command by variously setting a feedback scheme for providing the response data depending on the way that a user uses the mobile device. | 01-28-2016 |
20160026434 | SYSTEM AND METHOD FOR CONTINUOUS MULTIMODAL SPEECH AND GESTURE INTERACTION - Disclosed herein are systems, methods, and non-transitory computer-readable storage media for processing multimodal input. A system configured to practice the method continuously monitors an audio stream associated with a gesture input stream, and detects a speech event in the audio stream. Then the system identifies a temporal window associated with a time of the speech event, and analyzes data from the gesture input stream within the temporal window to identify a gesture event. The system processes the speech event and the gesture event to produce a multimodal command. The gesture in the gesture input stream can be directed to a display, but is remote from the display. The system can analyze the data from the gesture input stream by calculating an average of gesture coordinates within the temporal window. | 01-28-2016 |
20160027375 | WEARABLE FLEXIBLE INTERFACE WITH INTERLOCKING MODULES - A wearable flexible interface with interlocking modules includes a substrate integrated into a wristband, and also each of a processor, memory and power source disposed on the substrate, as well as a data bus. A multiplicity of different modules each are coupled to the data bus, each including firmware and a flexible display. Finally, a display controller is disposed on the substrate and coupled to each of the processor, memory, power source and each of the modules by way of the data bus. The display controller includes program code enabled to selectively direct a display of a display characteristic of a particular one of the modules either in a corresponding flexible display of the particular one of the modules, or in a single composite display formed by aggregating the flexible displays of all of the modules. | 01-28-2016 |
20160027403 | DISPLAY CONTROL APPARATUS, DISPLAY CONTROL METHOD, DISPLAY CONTROL SYSTEM, AND STORAGE MEDIUM - A display control apparatus, which executes control required to display a moving image, received via a network, on a display unit, the apparatus comprising: an input unit configured to input an operation instruction required to operate the display control apparatus; and a control unit configured to execute control required to display, on the display unit, a moving image at a frame rate depending on whether or not processing according to the operation instruction input by the input unit is executed. | 01-28-2016 |
20160033993 | Wearable Device And Electric Storage Equipment - A wearable device and an electric storage equipment are provided according to the present application. The wearable device includes a transparent window, a frame, a controller, and a sensor. A parameter of the wearing state of the wearable device can be acquired by the sensor, and a state of the frame can be controlled by the controller according to the parameter. When the wearable device is worn by the user, the frame is controlled to be in a first state to allow the frame to be invisible, and when the wearable device is not worn by the user, the frame is controlled to be in a second state to allow the frame to be visible or identifiable. The electric storage equipment may allow a transmittance of a set area to be switchable, thus, articles inside a casing can be observed without opening the door. | 02-04-2016 |
20160034030 | REFLECTION-BASED CONTROL ACTIVATION - An electronic device detects an object in a field of view of a camera and determines a position of a reflection of the object in a reflective surface of the electronic device as viewed from a defined observation point. When the electronic device detects selection input from a user, a control of the electronic device is activated based on an alignment between a location on a reflective surface of the electronic device and the reflection of the object on the reflective surface. | 02-04-2016 |
20160034032 | WEARABLE GLASSES AND METHOD OF DISPLAYING IMAGE VIA THE WEARABLE GLASSES - Provided are wearable glasses including a display and a method of operating the same. The wearable glasses include: a display configured to display an image; a sensor configured to acquire wear state information representing a state in which a user currently wears the wearable glasses, while the image is being displayed on the display; and a processor configured to determine an inclination of the wearable glasses with respect to the user based on the acquired wear state information, and to adjust, based on the determined inclination, the image that is displayed on the display. | 02-04-2016 |
20160034035 | ACCELERATION SENSE PRESENTATION APPARATUS, ACCELERATION SENSE PRESENTATION METHOD, AND ACCELERATION SENSE PRESENTATION SYSTEM - An acceleration sense presentation apparatus includes a vibrator group including a plurality of vibrators; an input unit to which information having directivity is inputted; and a vibration control unit for continuously switching a vibrator to be vibrated based on the information having directivity. | 02-04-2016 |
20160034036 | OLED MULTI-USE INTELLIGENT CURTAIN AND METHOD - An intelligent architectural curtain system includes a foldable, retractable curtain which has two fabric portions separated by a flexible OLED display screen portion, which is used when deployed against the wall of an interior space. A controllable motor is used to controllably retract and deploy the foldable, retractable curtain to change between a folded state and a taut state. A sensor generates a control signal for actuating the motor in response to detecting at least one of a user gesture and reception of a remote control signal. A processor includes circuitry used to respond to the control signal by actuating the motor to automatically retract or deploy the foldable, retractable curtain and display a predetermined image from an image source on the flexible OLED display screen when the foldable, retractable curtain is deployed. | 02-04-2016 |
20160034037 | DISPLAY DEVICE AND CONTROL METHOD THEREOF, GESTURE RECOGNITION METHOD, AND HEAD-MOUNTED DISPLAY DEVICE - The present invention provides a display device and a control method thereof, a gesture recognition method and a head-mounted display device. The display device of the present invention comprises a display unit, an open type head-mounted display, an image acquisition unit and a gesture recognition unit. The control method thereof comprises steps of: providing, by the open type head-mounted display, a virtual 3D control picture to a user; acquiring, by the image acquisition unit, an image of action of touching the virtual 3D control picture by the user; judging, by the gesture recognition unit, a touch position of the user in the virtual 3D control picture according to the image acquired by the image acquisition unit, and sending a control instruction corresponding to the touch position to a corresponding execution unit. The present invention may be used for controlling a display device. | 02-04-2016 |
20160034038 | INTERACTIVE RECOGNITION SYSTEM AND DISPLAY DEVICE - An interactive recognition system and a display device provided with the interactive recognition system, in the technical field of interactive control. The interactive recognition system comprises: a sensing unit ( | 02-04-2016 |
20160034041 | WEARABLE DEVICE AND METHOD OF OPERATING THE SAME - Disclosed is a wearable device including a sensor that detects a movement of a peripheral object, a display unit that displays a plurality of items and displays a focus on at least one of the plurality of items, and a processor that controls the display unit to move the focus onto an item at a position corresponding to a moving direction of the peripheral object. | 02-04-2016 |
20160034043 | BUTTONLESS DISPLAY ACTIVATION - In one example, a method includes determining, by a first motion module of a computing device and based on first motion data measured by a first motion sensor at a first time, that the computing device has moved, wherein a display operatively coupled to the computing device is deactivated at the first time; responsive to determining that the computing device has moved, activating a second motion module; determining, by the second motion module, second motion data measured by a second motion sensor, wherein determining the second motion data uses a greater quantity of power than determining the first motion data; determining a statistic of a group of statistics based on the second motion data; and responsive to determining that at least one of the group of statistics satisfies a threshold, activating the display. | 02-04-2016 |
20160034044 | METHOD AND APPARATUS FOR CONTROLLING MULTI-EXPERIENCE TRANSLATION OF MEDIA CONTENT - A method or apparatus for controlling a media device using gestures may include, for example, modifying media content to generate first updated media content according to a comparison of first information descriptive of a first environment of the source device to second information descriptive of a second environment of the recipient device, capturing images of a gesture, identifying a command from the gesture, and modifying the first updated media content to generate second updated media content according to the command. Other embodiments are disclosed. | 02-04-2016 |
20160034047 | FLEXIBLE DEVICE AND INTERFACING METHOD THEREOF - A foldable device includes a sensor configured to sense an unfolding motion of the foldable device, a display configured to display a layout in which a representation of at least one object varies according to the sensed unfolding motion, and a controller configured to control the display of the layout so that the representation of the at least one object corresponds to the sensed unfolding motion. | 02-04-2016 |
20160034051 | AUDIO-VISUAL CONTENT NAVIGATION WITH MOVEMENT OF COMPUTING DEVICE - Methods and apparatus for navigating audio-visual content on a computing device are provided. Embodiments of the system allow a user of the device to navigate the audio-visual content through an application interface using a movement of the device in various directions. A motion detection component built in the device can detect the movement of the device and the detected motion can be translated into one of commands saved in a database. The command causes the application interface to display an updated audio-visual content reflecting the command, which is associated with a particular movement of the device. In some embodiments, the updated audio-visual content can be shared with other computing devices in connection with each other. | 02-04-2016 |
20160034141 | DELAY OF DISPLAY EVENT BASED ON USER GAZE - Methods and systems of delaying the execution of a display event based on a detected user gaze are provided. Display events may be generated and executed to change a user interface of a display. For example, an autocorrect algorithm can automatically replace a typed word with a corrected word in a text field, generating a display event that causes the corrected word to be displayed instead of the typed word. Such a display event may be executed as soon as possible after its generation. However, a gaze detection device can obtain information that indicates a user is not looking at the typed word on the display. In such a situation, it may be more intuitive to delay the execution of the display event until the gaze information indicates that the user is looking at the typed word. | 02-04-2016 |
20160034244 | DYNAMIC MERCHANDISING COMMUNICATION SYSTEM - Provided herein are display systems and units, including those configured for dynamic communication in a physical location, such as in retail settings. Also included herein are methods for dynamically displaying product information in a physical location, such as a retail setting. | 02-04-2016 |
20160034251 | DISPLAY DEVICE, METHOD OF CONTROLLING DISPLAY DEVICE, AND PROGRAM - A head mounted display device is used by being mounted on a body of a user and includes an image display unit through which outside scenery is transmitted and which displays an image such that the image is visually recognizable together with the outside scenery. Further, the head mounted display device includes a right headphone and a left headphone outputting a sound. Further, the head mounted display device includes a target detection unit that detects a target of the user in the visual line direction; a distance detection unit that detects a distance between the detected target and the user; and an information output control unit that controls the output of the sound of the right headphone and the left headphone according to the detected distance. | 02-04-2016 |
20160035310 | DISPLAY DEVICE AND ITS WORKING METHOD - The disclosure provides a display device comprising a display unit; a support unit for supporting the display unit; a plurality of sensors, each of which is used for sensing a signal emitted by a signal source to identify the distance between the signal source and the sensor; a positioning unit for determining the position of the signal source according to a plurality of identified distances; and a control unit for adjusting the form and/or azimuth of the display unit according to the determined position of the signal source. The display device of the disclosure can automatically regulate the form and/or azimuth of the display screen according to the position of a viewer or a signal source such as an infrared remote controller, thereby realizing the best viewing angle and the best viewing effect for the user. | 02-04-2016 |
20160041384 | WAVEGUIDE EYE TRACKING EMPLOYING VOLUME BRAGG GRATING - A transparent waveguide, which is for use in tracking an eye that is illuminated by infrared light having an infrared wavelength, includes a volume Bragg grating type of input-coupler adapted to receive infrared light having the infrared wavelength and couple the received infrared light into the waveguide. The volume Bragg grating includes a lower boundary and an upper boundary that is closer to the output-coupler than the lower boundary. A k-vector angle of the volume Bragg grating at the lower boundary is greater than a k-vector angle at the upper boundary, with k-vector angles of the volume Bragg grating between the lower and upper boundaries gradually decreasing as distances decrease between grating planes of the volume Bragg grating and the upper boundary. Additionally, the volume Bragg grating preferably has an angular bandwidth that is no greater than 5 degrees. | 02-11-2016 |
20160041581 | FLEXIBLE DISPLAY SCREEN SYSTEMS AND METHODS - A flexible display screen system and method includes an article that has a flexible material configured to cover at least a portion of a person's anatomy or at least a portion of an object. At least one flexible display screen is secured on the flexible material. A computer system is configured to provide image information for controlling a display of images on the flexible display screen. The computer system is configured to control the display of images on the flexible display screen in response to a sensor signal. The sensor may sense flexing of the flexible display screen or flexible material. Alternatively or in addition, the flexible display screen or flexible material may be controlled to flex in response to the images displayed on the flexible display screen. | 02-11-2016 |
20160041612 | Method for Selecting an Information Source from a Plurality of Information Sources for Display on a Display of Smart Glasses - A method for determining an information source from a plurality of information sources for display on smart glasses is disclosed. In one embodiment, the method includes determining a spatial orientation of the smart glasses, determining from a plurality of criteria, which are respectively related to the spatial orientation of the smart glasses, a criterion which is fulfilled by the determined spatial orientation of the smart glasses, assigning each of the plurality of criteria an information source from the plurality of information sources, and selecting the information source assigned to the determined criterion. | 02-11-2016 |
20160041613 | Method for Selecting an Information Source for Display on Smart Glasses - A method for selecting an information source from a plurality of information sources for display on a display of smart glasses is disclosed. In one embodiment, the method includes determining an orientation of the smart glasses with respect to a head of a wearer of the smart glasses, and selecting an information source from the plurality of information sources based at least in part on the determined orientation of the smart glasses with respect to the head of the wearer of the smart glasses. | 02-11-2016 |
20160041614 | SYSTEM AND METHOD OF INDUCING USER EYE BLINK - According to an exemplary embodiment, a blink device includes: a blink detector configured to detect a blink of a user from an image photographed by a camera; a period measurer configured to measure a time period between consecutive blinks of the user and determine whether the measured time period exceeds a reference time period; a real-time image capture unit configured to capture a real-time image of the user from the photographed image; and an image processor configured to perform the following if the measured time period exceeds the reference time period: generate from the real-time image a blink image for inducing the user to blink, delay the display of the blink image by a delay period from when the user's blink is detected, and display the blink image after the delay period. | 02-11-2016 |
20160041615 | INFORMATION PROCESSING APPARATUS, FOCUS DETECTION METHOD, AND INFORMATION PROCESSING SYSTEM - An information processing apparatus includes: a shutter configured to block extraneous light which enters an eye to be examined; an irradiation unit configured to irradiate a marker onto the eye with infrared light; a photographing unit configured to photograph the marker which is projected onto a fundus of the eye to be examined when the shutter is closed; and a processing unit configured to detect a focus based on an image of the marker photographed by the photographing unit. | 02-11-2016 |
20160041617 | Radar-Based Gesture Recognition - This document describes techniques using, and devices embodying, radar-based gesture recognition. These techniques and devices can enable a great breadth of gestures and uses for those gestures, such as gestures to use, control, and interact with computing and non-computing devices, from software applications to refrigerators. The techniques and devices are capable of providing a radar field that can sense gestures from multiple actors at one time and through obstructions, thereby improving gesture breadth and accuracy over many conventional techniques. | 02-11-2016 |
20160041620 | INPUT APPARATUS, DEVICE CONTROL METHOD, RECORDING MEDIUM, AND MOBILE APPARATUS - An input apparatus for controlling an information terminal includes a gesture detecting unit that detects a rotational movement of a wrist of a user about a lower arm of the user and an output unit that outputs a control command for controlling a device to be controlled to the device to be controlled on the basis of a rotational direction of the detected rotational movement. If the gesture detecting unit detects a first rotational movement in a first rotational direction and thereafter detects a second rotational movement in a second rotational direction that is opposite to the first rotational direction, the output unit outputs a second control command corresponding to the second rotational direction without outputting a first control command corresponding to the first rotational direction. | 02-11-2016 |
20160041623 | Sessionless pointing user interface - A method, including receiving, by a computer, a sequence of three-dimensional maps containing at least a hand of a user of the computer, and identifying, in the maps, a device coupled to the computer. The maps are analyzed to detect a gesture performed by the user toward the device, and the device is actuated responsively to the gesture. | 02-11-2016 |
20160041628 | FLYING USER INTERFACE - This invention describes a special type of drone called a “Flying User Interface” device, comprising a robotic projector-camera system, an onboard digital computer connected to the Internet, sensors, and a hardware interface for sticking to any surface such as walls, ceilings, etc. The computer further consists of other subsystems, devices, and sensors such as an accelerometer, compass, gyroscope, flashlight, etc. The drone flies from one place to another, detects a surface, and sticks to it. After successfully sticking, the device stops all its rotors and projects or augments images, information, and user interfaces on the nearby surfaces. The user interface may contain applications, information about the object being augmented, and information from the Internet. The user can interact with the user interface using commands and gestures such as hand, body, feet, voice, etc. | 02-11-2016 |
20160042714 | DISPLAY SYSTEM, IMAGE COMPENSATION METHOD AND NON-TRANSITORY COMPUTER READABLE STORAGE MEDIUM THEREOF - A display system, an image compensation method and a non-transitory computer readable storage medium thereof are provided. The display system includes a flexible panel, a prediction unit, a compensation unit, an image synthesis unit and a control unit. The prediction unit predicts a prediction angle of the flexible panel in a final time. The compensation unit generates a first compensation image according to an initial display angle of the flexible panel in an initial time, and generates a second compensation image according to the prediction angle. The image synthesis unit synthesizes a first display image according to the first compensation image and the second compensation image. The control unit selectively substitutes the first display image for an image displayed on the flexible panel in the final time. | 02-11-2016 |
20160048027 | VIRTUAL REALITY EXPERIENCE TIED TO INCIDENTAL ACCELERATION - Incidental acceleration is used to provide an entertainment experience, where incidental acceleration is defined in one sense as acceleration that is not controlled by a user and is in many cases present because of the user's presence in a traveling vehicle, such as an airplane, train, or car. Such acceleration may be employed to significantly improve the user's traveling experience, particularly where the user is wearing virtual reality goggles or the like, because the user will be less aware of the confined nature of the vehicle, and significantly more aware of the virtual reality environment. In a specific example, for travelers with a “fear of flying” or motion sickness, such systems and methods could lead to a more pleasurable flying experience. The acceleration data can be sourced from sensors associated with the vehicle or from a local source. | 02-18-2016 |
20160048031 | DISPLAY DEVICE AND DISPLAYING METHOD THEREOF - A display device includes a display panel displaying an image and having a first radius of curvature, a distance measuring unit measuring a user distance, the user distance being a distance between the display panel and a user, and a curvature radius changing unit receiving the measured user distance from the distance measuring unit, to change the first radius of curvature of the display panel to a second radius of curvature different from the first radius of curvature. | 02-18-2016 |
20160048177 | KEY INPUT USING AN ACTIVE PIXEL CAMERA - In an example embodiment, an active pixel sensor on a user device is utilized to capture graphical user interface navigation related movements by a user. Areas of low luminance can be identified and movements or alterations in the areas of low luminance can be translated into navigation commands fed to an application running on the user device. | 02-18-2016 |
20160048190 | METHOD AND APPARATUS FOR CONTROLLING PERFORMANCE OF ELECTRONIC DEVICE - A method for controlling performance of an electronic device is provided. The method includes sensing user input, predicting user input speed, and controlling at least one processing unit of the electronic device based on a predicted user input speed and performance assignment information. Here, the performance assignment information includes control information mapped respectively with user input speeds for controlling the at least one processing unit of the electronic device. | 02-18-2016 |
20160048201 | INFORMATION PROCESSING METHOD AND ELECTRONIC DEVICE - The present disclosure provides an information processing method and an electronic device. The electronic device has a display unit, and is capable of generating a deformation in response to a stress. The display unit of the electronic device is capable of presenting M window interfaces, each of the M window interfaces being used for displaying a separate display content. The method comprises: determining a first display sub-region of the display unit which is in a presentation state, when it is judged that the electronic device has generated a predetermined deformation; obtaining a first attribute parameter of a first window interface among the M window interfaces; and displaying the first window interface in the first display sub-region if the first attribute parameter satisfies a preset condition. | 02-18-2016 |
20160048202 | DEVICE PARAMETER ADJUSTMENT USING DISTANCE-BASED OBJECT RECOGNITION - Methods, systems, and devices are described that provide for parameter adjustment for devices. A device may register a user profile associated with a user. During registration an image of the user may be captured and analyzed, such as to determine a size of facial features for the user. Also, during registration, metrics may be input or determined such as the age of the user or any sensory sensitivities. The device may capture a second image of the user. The second image may be compared to the first in order to determine the current distance between the user's face and the device. Comparing the first and second image may include comparing the size of corresponding facial features in each image, such as measured in pixels. Multiple images may be used for the first or second images. Based on contextual conditions and the distance, the device may adjust a sensory-related parameter. | 02-18-2016 |
20160048204 | GAZE SWIPE SELECTION - Methods for enabling hands-free selection of virtual objects are described. In some embodiments, a gaze swipe gesture may be used to select a virtual object. The gaze swipe gesture may involve an end user of a head-mounted display device (HMD) performing head movements that are tracked by the HMD to detect whether a virtual pointer controlled by the end user has swiped across two or more edges of the virtual object. In some cases, the gaze swipe gesture may comprise the end user using their head movements to move the virtual pointer through two edges of the virtual object while the end user gazes at the virtual object. In response to detecting the gaze swipe gesture, the HMD may determine a second virtual object to be displayed on the HMD based on a speed of the gaze swipe gesture and a size of the virtual object. | 02-18-2016 |
20160048205 | Sensor Proximity Glove for Control of Electronic Devices - The present invention relates to a device for interacting with computerized devices. In particular, the device comprises a wearable glove having a plurality of proximity sensors that can send information to, and receive information from, a computing device for the purpose of executing tasks thereon. | 02-18-2016 |
20160048206 | Display System and Diagnostic Method - The present invention provides a display system and a diagnostic method. The display system comprises an information preprocessing module, a display panel and a control circuit electrically connected to the display panel. The information preprocessing module is used for starting the control circuit and the display panel, and the control circuit is used for controlling the display panel to display an image. The information preprocessing module comprises a sensor used for collecting physiological information of the human body; the information preprocessing module further comprises a preprocessing circuit used for preprocessing the physiological information collected by the sensor and transmitting the preprocessed physiological information to the control circuit; the control circuit is used for receiving the preprocessed physiological information and performing a diagnosis based on it to obtain a diagnosis result; and the display panel is used for displaying the diagnosis result. | 02-18-2016 |
20160048212 | Display Device and Control Method Thereof, and Gesture Recognition Method - The present invention provides a display device and a control method thereof, and a gesture recognition method. The control method comprises steps of: displaying a control picture by a display unit, converting the control picture into a virtual 3D control picture and providing the virtual 3D control picture to a user, wherein a distance between the virtual 3D control picture and eyes of the user is a first distance, and the first distance is less than a distance between the display unit and the eyes of the user; acquiring, by the image acquisition unit, an image of action of touching the virtual 3D control picture by the user; judging a touch position of the user in the virtual 3D control picture according to the image acquired by the image acquisition unit, and sending a control instruction corresponding to the touch position to a corresponding execution unit. | 02-18-2016 |
20160048214 | USING DISTANCE BETWEEN OBJECTS IN TOUCHLESS GESTURAL INTERFACES - A function of a device, such as volume, may be controlled using a combination of gesture recognition and an interpolation scheme. Distance between two objects such as a user's hands may be determined at a first time point and a second time point. The difference between the distances calculated at two time points may be mapped onto a plot of determined difference versus a value of the function to set the function of a device to the mapped value. | 02-18-2016 |
20160048215 | METHOD AND APPARATUS FOR PROCESSING USER INPUT - A user input processing method is provided. The user input processing method determines, based on a recognition reliability of a user input for a function, a delay time and whether the function is to be performed, the function being determined in advance, and controls the function based on the delay time and whether the function is to be performed. | 02-18-2016 |
20160048216 | METHODS FOR CAMERA MOVEMENT COMPENSATION FOR GESTURE DETECTION AND OBJECT RECOGNITION - Methods and systems for camera movement compensation for gesture detection and object recognition. In some examples, the methods and systems analyze motion data associated with the sequential frames of a video stream, and reject those frames where the detected camera movement exceeds a predetermined threshold. In other examples, the methods and systems use motion data and portions of immediately previous frames to adjust a frame where the detected camera movement exceeds the predetermined threshold, to create an adjusted frame that compensates for the detected camera movement. In still other examples, frames are adjusted if the detected motion exceeds a first threshold, and rejected if the detected motion exceeds a second, higher threshold. | 02-18-2016 |
20160048222 | IMAGE CAPTURE MODES FOR DUAL SCREEN MODE - Methods and devices for selectively presenting a user interface during an image capture. More particularly, the method includes a change in the display mode of a multiple screen device for a first screen and a second screen when capturing the image. The change in the display mode may be made in response to one or more actions conducted by a user. The image capture function includes different modes that display different information on the first and/or second screens. | 02-18-2016 |
20160048364 | CONTENT VISIBILITY MANAGEMENT - One embodiment provides a method, including: outputting, to a display device, first content; receiving, using a processor, an instruction to output second content to the display device; positioning one or more of the first content and the second content within the display device according to positioning data based on the first content; and displaying both of the first content and the second content on the display device. Other embodiments are described and claimed herein. | 02-18-2016 |
20160048725 | AUTOMOTIVE AND INDUSTRIAL MOTION SENSORY DEVICE - The technology disclosed relates to highly functional/highly accurate motion sensory control devices for use in automotive and industrial control systems capable of capturing and providing images to motion capture systems that detect gestures in a three dimensional (3D) sensory space. | 02-18-2016 |
20160050306 | PORTABLE TELEPHONE - In a portable telephone according to the present invention, a display displays a block indicative of an operator, predetermined information and a pointer; the operator can be operated in directions opposite to each other; and the controller controls the display so as to shift the pointer to a desirable position within the predetermined information on a screen of the display in accordance with an operation of the operator, and also display a mark, indicative of a direction in which the pointer can be shifted and in which the predetermined information exists, adjacent to the block along the shift direction of the operator. | 02-18-2016 |
20160054792 | Radar-Based Biometric Recognition - This document describes techniques and devices for radar-based biometric recognition. The techniques enable biometric recognition through a radar-based biometric-recognition system thereby permitting control of devices and applications with little or no active engagement from users. | 02-25-2016 |
20160054794 | EYE-CONTROL REMINDING METHOD, EYE-CONTROL IMAGE DISPLAY METHOD AND DISPLAY SYSTEM - An eye-control reminding method, an eye-control image display method and a display system are provided. The eye-control reminding method includes: detecting information related to an eye of a user in real time; comparing the information related to the eye of the user with a preset value; and determining the eye-in-use status of the user according to the comparison result. When the comparison result is a set status, the user is reminded. Embodiments of the present invention may remind the user to protect the eyes or focus attention, or may realize automatic adjustment of display contents and settings according to the viewing status of the user. | 02-25-2016 |
20160054796 | DISPLAY APPARATUS AND CONTROL METHOD THEREOF - A display apparatus including: a display configured to include a curved display surface; a processor configured to process an image signal to be displayed as an image on the display; a detector configured to detect a position of a user; and a controller configured to determine a blind region, which cannot be seen by the user at a current position of the user, within the curved display surface based on the current position of the user detected by the detector, and to control the processor to adjust a display state of the image displayed on the curved display surface in accordance with the determined blind region. | 02-25-2016 |
20160054798 | Glove Interface Object - A glove interface object is provided, comprising: at least one flex sensor configured to generate flex sensor data identifying a flex of at least one finger portion of the glove interface object; at least one contact sensor configured to generate contact sensor data identifying a contact between a first portion of the glove interface object and a second portion of the glove interface object; a communications module configured to transmit the flex sensor data and the contact sensor data to a computing device for processing to determine a finger position pose of the glove interface object, the finger position pose being applied for rendering a virtual hand in a view of a virtual environment on a head-mounted display (HMD), the virtual hand being rendered based on the identified finger position pose. | 02-25-2016 |
20160054800 | INTEGRATED HAPTIC FEEDBACK SIMULATING DEVICE USING KINESTHESIA PROVIDING MODULE INCLUDING MAGNETORHEOLOGICAL FLUID AND THIN-FILM-TYPE TACTILE SENSATION PROVIDING MODULE - An integrated haptic feedback simulating device using a kinesthesia providing module including a magnetorheological fluid and a thin-film-type tactile sensation providing module. The integrated haptic feedback simulating device includes a motion controlling section providing kinesthetic feedback and tactile feedback to a hand of a user, a system controlling section detecting motions of the hand and providing an integrated haptic feedback control signal to the motion controlling section, and a display section visually rendering a graphic object according to the detected motions of the hand. The integrated haptic feedback simulating device can provide synesthetic haptic feedback to the user in cooperation with a graphic interface displayed on the display section, thereby increasing the virtuality of the graphic object simulated on the display section. | 02-25-2016 |
20160054801 | PORTABLE ELECTRONIC DEVICE AND MESSAGE PROCESSING METHOD THEREFOR - A message processing method for a portable electronic device includes the steps of: detecting, by a motion sensor, a plurality of first knocks of an object on the portable electronic device and generating a plurality of first detecting signals according to the plurality of first knocks; and encoding, by an encoding system, the plurality of first detecting signals to generate a first message code, wherein a first message is decodable from the first message code by a decoding system. | 02-25-2016 |
20160054803 | Occluded Gesture Recognition - This document describes techniques and devices for occluded gesture recognition. Through use of the techniques and devices described herein, users may control their devices even when a user's gesture is occluded by some material between the user's hands and the device itself. Thus, the techniques enable users to control their mobile devices in many situations in which control is desired but conventional techniques do not permit effective control, such as when a user's mobile computing device is occluded by being in a purse, bag, pocket, or even in another room. | 02-25-2016 |
20160054804 | DEVICES, SYSTEMS, AND METHODS FOR DETECTING GESTURES USING WIRELESS COMMUNICATION SIGNALS - Examples of systems, devices, and methods are described herein that may provide for gesture recognition. Wireless communication signals may be received from sources in an environment (e.g. cellular telephones, computers, etc.). Features of the wireless communication signals (e.g. Doppler shifts) may be extracted and utilized to identify gestures. The use of wireless communication signals may accordingly make possible gesture recognition in a whole-home environment that may identify gestures performed through walls or other obstacles. | 02-25-2016 |
20160054806 | DATA PROCESSING APPARATUS, DATA PROCESSING SYSTEM, CONTROL METHOD FOR DATA PROCESSING APPARATUS, AND STORAGE MEDIUM - A data processing apparatus acquires a range image and identifies an operator who has performed a gesture operation on the operation screen based on the acquired range image, performs control so as to validate, when the identified operator is the first operator, a gesture operation on a first operation item in the operation screen, and to invalidate, when the identified operator is the second operator, a gesture operation on the first operation item, and so as to invalidate, when the identified operator is the first operator, a gesture operation on a second operation item in the operation screen, and to validate, when the identified operator is the second operator, a gesture operation on the second operation item. | 02-25-2016 |
20160054808 | METHOD AND DEVICE FOR EXECUTING COMMAND ON BASIS OF CONTEXT AWARENESS - A terminal device includes: a sensing unit to generate a motion change information by detecting motions of the terminal device and sensing a motion change of the terminal device; a gesture determiner to compare the motion change information with pre-specified gestures and generate a particular gesture information corresponding to the sensed motion change of the terminal device; an application controller to identify at least one activated application in an active state; and an execution unit to execute preset commands corresponding to the particular gesture information for respective activated applications. | 02-25-2016 |
20160054812 | APPARATUS AND METHOD OF RECOGNIZING MOVEMENT OF SUBJECT - Provided is an apparatus and method of recognizing a movement of a subject. The apparatus includes a light source configured to emit light to the subject and an image sensor configured to receive light reflected from the subject. The apparatus includes a processor configured to detect a pixel that is receiving the reflected light, the pixel being included in a pixel array of the image sensor. The processor is configured to track the movement of the subject based on a change in a position of the detected pixel. | 02-25-2016 |
20160054977 | SYSTEMS AND METHODS WHICH JOINTLY PROCESS MOTION AND AUDIO DATA - Motion and audio data associated with an area or a user are sensed and processed jointly to achieve improved results as compared to utilizing only the motion or the audio data by themselves. Synergies between motion and audio are identified and exploited in devices ranging from cell phones to activity trackers to home entertainment and alarm systems. | 02-25-2016 |
20160062116 | WEARABLE DISPLAY DEVICE AND METHOD OF CONTROLLING THEREFOR - A method of controlling a wearable display device according to one embodiment of the present specification can provide an optimized display by evaluating a front visual field of a wearer and the wearer's visual field for a reference display area. Further, in evaluating the front visual field of the wearer, the wearable display device can provide more information, as well as safer information, in consideration of the reference display area. | 03-03-2016 |
20160062453 | MOTION TRACKING USER INTERFACE - A method to transition focus of a display corresponding to an object's motion tracked by a video camera or like device is disclosed. In one implementation, the display shows one or more windows or user interfaces on the display. The object's motion can be used to select one of the windows or user interfaces on the display and manipulate content presented in the window or user interface. In another implementation, the object's motion can manipulate a three-dimensional graphical icon in a three-dimensional display environment, for example, by rotating it. In another implementation, the method further tracks motion of a second object and shifts focus of the display corresponding to the motion of the second object. In another implementation, a second display may be added to mirror the focus transition corresponding to the object's motion. | 03-03-2016 |
20160062457 | DISPLAY DEVICE, METHOD OF CONTROLLING THE SAME, AND COMPUTER PROGRAM - A display device includes a display unit which is able to take a first position which is a relative position with respect to the user's eyes and a second position which is a relative position with respect to the eyes and is different from the first position. The display device further includes a display mode change unit which, when the display unit displaying an image at the first position is moved to the second position, changes the mode of the display continued on the display unit. | 03-03-2016 |
20160062458 | GAZE BASED TEXT INPUT SYSTEMS AND METHODS - According to the invention, a method for entering text into a computing device using gaze input from a user is disclosed. The method may include causing a display device to display a visual representation of a plurality of letters. The method may also include receiving gaze information identifying a movement of the user's gaze on the visual representation. The method may further include recording an observation sequence of one or more observation events that occur during the movement of the user's gaze on the visual representation. The method may additionally include providing the observation sequence to a decoder module. The decoder module may determine at least one word from the observation sequence representing an estimate of an intended text of the user. | 03-03-2016 |
20160062465 | Semantic Framework for Variable Haptic Output - Methods and apparatus organize a plurality of haptic output variations into a cohesive semantic framework that uses various information about the alert condition and trigger, application context, and other conditions to provide a system of haptic outputs that share characteristics between related events. In some embodiments, an event class or application class provides the basis for a corresponding haptic output. In some embodiments, whether an alert-salience setting is on provides the basis for adding an increased salience haptic output to the standard haptic output for the alert. In some embodiments, consistent haptics provide for branding of the associated application class, application, and/or context. | 03-03-2016 |
20160062468 | Gesture Processing Using a Domain-Specific Gesture Language - The claimed subject matter includes techniques for processing gestures. An example method includes receiving a gesture from an application. The gesture includes one or more primitives from a language that is domain-specific to gestures. The method also further includes receiving skeletal data from a motion detection system. The method also includes comparing the skeletal data with the gesture from the application in a runtime module. The method also further includes sending a gesture event to the application. | 03-03-2016 |
20160062469 | SYSTEM AND METHOD FOR SELECTIVE GESTURE INTERACTION - A system and method for selective gesture interaction using spatial volumes is disclosed. The method includes processing data frames that each includes one or more body point locations of a collaborating user that is interfacing with an application at each time intervals, defining a spatial volume for each collaborating user based on the processed data frames, detecting a gesture performed by a first collaborating user based on the processed data frames, determining the gesture to be an input gesture performed by the first collaborating user in a first spatial volume, interpreting the input gesture based on a context of the first spatial volume that includes a role of the first collaborating user, a phase of the application, and an intersection volume between the first spatial volume and a second spatial volume for a second collaborating user, and providing an input command to the application based on the interpreted input gesture. | 03-03-2016 |
20160062470 | INSTRUMENT INTERFACE FOR REDUCING EFFECTS OF ERRATIC MOTION - Apparatus and methods to reduce effects of erratic motion input during operation of an instrument via a graphical user interface are described. Spatial motion input received from a user may be filtered using filter parameters obtained during a calibration procedure. The filtered motion input may be used to predict a trajectory of a cursor or object used to select an icon or text. The icon or text may be latched to the approaching cursor or object. The combination of motion smoothing and latching may improve ease-of-use of the graphical user interface for individuals having neuromuscular disorder, or users operating instruments in high-vibration environments. | 03-03-2016 |
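The motion-smoothing and latching combination in entry 20160062470 could be illustrated as follows; the exponential filter, the capture-radius rule, and all parameter values are assumptions for the sketch, not the patented filter:

```python
# Illustrative sketch: exponential smoothing of motion input, plus
# "latching" the cursor to a nearby icon once the smoothed position
# enters a capture radius. Parameter values are invented.

def smooth(samples, alpha=0.3):
    # Exponential moving average over (x, y) samples.
    out = [samples[0]]
    for x, y in samples[1:]:
        px, py = out[-1]
        out.append((px + alpha * (x - px), py + alpha * (y - py)))
    return out

def latch(pos, icons, radius=20.0):
    # Snap the cursor to the closest icon inside the capture radius, if any.
    def dist2(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    near = [i for i in icons if dist2(pos, i) <= radius ** 2]
    return min(near, key=lambda i: dist2(pos, i)) if near else pos
```

For example, `latch((98, 102), [(100, 100), (300, 300)])` snaps to `(100, 100)`, while a cursor far from every icon is left unchanged.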
20160062471 | GESTURE CONTROL SYSTEM CAPABLE OF INTERACTING WITH 3D IMAGES - A gesture control system capable of interacting with three-dimensional (3D) images comprises a 3D image display device, a gesture recognition device, and a data processing unit. The 3D image display device displays a 3D image. The gesture recognition device captures a hand(s) and finger(s) image of a user and then calculates a hand(s) and finger(s) coordinate. The hand(s) and finger(s) coordinate is transmitted to the data processing unit and is matched with an image 3-dimensional spatial coordinate for being output via the 3D image display device in order to achieve advantages of interacting with 3D images. | 03-03-2016 |
20160062473 | GESTURE-CONTROLLED COMPUTER SYSTEM - A gesture-controlled computer system includes a mobile computing device, a range camera, and a gesture-control subsystem. The gesture-control subsystem is in communication with the mobile computing device and the range camera. The gesture-control subsystem includes a gesture-capturing module that is configured for acquiring a first-taken range image of an articulating-gesture-controller. The gesture-control subsystem also includes a range-image-analysis module configured for generating a gesture-data-point profile derived at least in part from the first-taken range image. The gesture-data-point profile includes information regarding a plurality of gesture data points, including information about the relative position of the gesture data points. The gesture-control subsystem also includes a gesture-recognition module configured for (i) receiving the gesture-data-point profile from the range-image-analysis module, (ii) mapping the gesture-data-point profile to a corresponding device command, and (iii) transmitting the corresponding device command to the mobile device for execution of the device command by the mobile device. | 03-03-2016 |
20160062480 | INPUT DEVICE AND TOUCH PANEL DISPLAY SYSTEM - An input device with a plurality of terminals for inputting information in a touch panel includes three direction detecting contacts that are projecting portions to touch three positions on the touch panel and form a triangle having vertexes on the three positions, and at least one information identifying contact that touches the touch panel together with the three direction detecting contacts so that specific information to be displayed on a display panel is identified according to a placement position of the information identifying contact with respect to the triangle. | 03-03-2016 |
20160062481 | ELECTRONIC EQUIPMENT DISPLAYING VARIOUS KINDS OF INFORMATION AVAILABLE BY WEARING ON BODY - Provided is an electronic equipment that displays information in response to a variety of display requests of a user. The electronic equipment of the present disclosure is worn on the body. A communication circuit acquires various kinds of information from a various-information database in which the various kinds of information are stored via a network. A display circuit displays the various kinds of information. An imaging circuit images a real space. An object recognition circuit recognizes an object from an image imaged by the imaging circuit. A display request determination circuit determines a display request based on a combination of recognized different objects. Further, the display request determination circuit makes a transmission request for information to the various-information database in response to the display request through the communication circuit. | 03-03-2016 |
20160062483 | SYSTEMS AND METHODS FOR PROVIDING FUNCTIONALITY BASED ON DEVICE ORIENTATION - Systems, methods, and non-transitory computer-readable media can determine a first orientation in which a computing system is positioned. A first functionality can be provided when the computing system is positioned in the first orientation. It can be determined that the computing system becomes positioned in a second orientation. A second functionality can be selected, out of a set of functionalities, based on a current state associated with the computing system. The second functionality can be provided when the computing system becomes positioned in the second orientation. | 03-03-2016 |
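The state-dependent selection rule of entry 20160062483 (the second functionality is chosen from a set based on the device's current state) can be reduced to a small dispatch sketch; the orientations, states, and functionality names here are invented examples, not from the application:

```python
# Minimal sketch of orientation-based functionality selection.
# All state and functionality names are illustrative assumptions.

def select_functionality(orientation, state):
    if orientation == "portrait":
        return "feed"                       # first functionality
    # Second orientation: pick from a set based on the current state.
    by_state = {
        "viewing_photo": "photo_editor",
        "on_call": "speakerphone",
    }
    return by_state.get(state, "camera")    # default functionality

print(select_functionality("landscape", "viewing_photo"))  # → photo_editor
```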
20160062484 | INFORMATION PROCESSOR - An information processor includes a first housing including a display, a second housing including an input device, a connection terminal combined with the first housing so as to be detachable, and a hinge which couples the connection terminal to the second housing, and transforms a mode into an open state, a closed state, a standing state, or a tablet state. The display and the input device are operable in the open state. The display and the input device are inoperable in the closed state. The display and the input device are operable in the standing state. The display is operable, and the input device is inoperable in the tablet state. | 03-03-2016 |
20160062485 | ELECTRONIC DEVICE - An electronic device capable of performing display control based on bending is provided. The electronic device has a display, a detecting unit, and a controller. The controller has flexibility. The detecting unit is able to detect bending of the display. In a case where bending is detected by the detecting unit, the controller performs display control based on the bending. | 03-03-2016 |
20160062486 | MOBILE DEVICE AND METHOD OF PROJECTING IMAGE BY USING THE MOBILE DEVICE - A mobile device and a method of projecting an image by using the mobile device are provided. The method includes determining, as a projection image, an image area displayed on the mobile device, from among an entire image expressing content being replayed or accessed on the mobile device, projecting the projection image and a pointer disposed on the projection image, and changing the projection image in the entire image by detecting movement of the mobile device. | 03-03-2016 |
20160063919 | WEARABLE ELECTRONIC DEVICE - A display method for a display device is provided. The display method comprises activating a Head Mounted Theater (HMT) mode, the HMT mode displaying two images, which are substantially the same as each other, on first and second areas, respectively, separated from each other in a display area of the display device; and adjusting a display time, for displaying a black screen in the first and second areas, of a unit frame time when the HMT mode is activated. | 03-03-2016 |
20160067596 | SMART ARTICLES INCLUDING A VISUAL CODE AND A TOUCH CODE - Articles including a visual code and a touch code are described. In some embodiments, the articles are smart lottery tickets. | 03-10-2016 |
20160070337 | DISPLAY DEVICE ADJUSTMENT BY CONTROL DEVICE - A method, performed in a control device, for adjusting a display device in communication with the control device is disclosed. The method may include receiving an image of a scene including the display device from an image sensor configured with a field of view overlapping at least a portion of view of a user. A display condition of the display device may be detected from the image. The control device may determine context information associated with the user from the image. Based on the display condition and the context information associated with the user, an output display condition of the display device may be determined. A control signal for adjusting the display device may then be generated based on the output display condition. | 03-10-2016 |
20160070339 | Device-based activity classification using predictive feature analysis - Device-based activity classification using predictive feature analysis is described, including evaluating an indicator associated with a predictive feature, identifying an application, using the name, to be performed, and invoking the application, the application being configured to interpret the indicator to determine an operation to perform at one or more levels of a protocol stack using data generated from evaluating a signal detected by a sensor, the sensor being coupled to a wearable device, and the application being configured to perform the operation using other data generated from evaluating another signal detected by another sensor, the another sensor being substantially different than the sensor. | 03-10-2016 |
20160070340 | ELECTRONIC DEVICE AND METHOD FOR AUTOMATICALLY ADJUSTING DISPLAY RATIO OF USER INTERFACE - An electronic device and a method for automatically adjusting the display ratio of a user interface are provided. The electronic device includes a storage device, a proximity sensor, and a touch screen. The method includes the following blocks. The proximity sensor is controlled to measure a distance between the display screen and a user. The distance measured by the proximity sensor is obtained. A display ratio of the user interface is calculated according to the obtained distance and an algorithm which is pre-stored in the storage device. The display ratio of the user interface is adjusted using the calculated display ratio. | 03-10-2016 |
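Entry 20160070340 does not disclose the stored algorithm, so the distance-to-ratio step below is a minimal sketch under an assumed linear rule; the near/far limits and ratio range are illustrative parameters:

```python
# Assumed linear mapping from viewing distance to UI display ratio,
# clamped between near and far limits. Constants are illustrative.

def display_ratio(distance_cm, near=20.0, far=80.0,
                  min_ratio=1.0, max_ratio=2.0):
    # Clamp distance to the valid range, then scale the ratio linearly.
    d = max(near, min(far, distance_cm))
    t = (d - near) / (far - near)
    return min_ratio + t * (max_ratio - min_ratio)

print(display_ratio(50.0))  # midpoint distance → 1.5
```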
20160070341 | HUMAN-COMPUTER INTERACTION USING WEARABLE DEVICE - Embodiments of apparatus and methods for human-computer interaction are described. An apparatus for human-computer interaction may have one or more processors, multiple sensors to measure motion of a body part of a user, a communication module to communicate with a remote computing device, and an interpretation module to interpret the motion of the body part of the user to be associated with an indication of a user input to the remote computing device. The components may be encased in a body configured to be worn by the user. Other embodiments may be described and/or claimed. | 03-10-2016 |
20160070344 | GAZE-BASED SECURITY - Systems and methods for presenting actual data on a display device based on eye-tracking data. An eye-tracking engine receives sensed data from an eye-tracking device, determines a movement status of an eye based on the sensed data, and determines a display configuration based on the determined movement status. The display configuration is output on the display device and includes masking data when the determined movement status indicates the eye is in motion. | 03-10-2016 |
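The movement-status decision in entry 20160070344 (mask actual data while the eye is in motion) might be approximated with a simple velocity threshold on consecutive gaze samples; the threshold and function names are assumptions, not the patented method:

```python
# Hypothetical illustration: estimate eye velocity from two gaze samples
# and mask the display while the eye is saccading. Threshold is assumed.

def eye_in_motion(p0, p1, dt, threshold=100.0):
    # Gaze velocity in pixels/second between two samples dt seconds apart.
    vx = (p1[0] - p0[0]) / dt
    vy = (p1[1] - p0[1]) / dt
    return (vx * vx + vy * vy) ** 0.5 > threshold

def display_config(p0, p1, dt):
    # Mask actual data during motion; show it on fixation.
    return "masked" if eye_in_motion(p0, p1, dt) else "actual"

print(display_config((0, 0), (50, 0), 0.01))  # 5000 px/s → masked
```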
20160070348 | THREE DIMENSIONAL CONTEXTUAL FEEDBACK - A device to output two or more coordinated haptic effects, comprising, a first haptic effect generator to output a first haptic effect, a second haptic effect generator to output a second haptic effect and a processor to coordinate operation of the second haptic effect generator with operation of the first haptic effect generator based on an input provided to the processor. | 03-10-2016 |
20160070350 | SYSTEM AND METHOD FOR CONTROLLING AUDIO OUTPUT ASSOCIATED WITH HAPTIC EFFECTS - An apparatus, a processor-readable medium, and a method are provided that are configured to cause a haptic effect and an audio effect to be output substantially concurrently. The haptic effect has a frequency and the audio effect has a frequency different from the frequency of the haptic effect. At least one of the frequency of the haptic effect and the frequency of the audio effect is varied while maintaining substantially constant an average energy of the haptic effect. Varying the frequency of the audio effect can cause a perceived frequency of the haptic effect to change. | 03-10-2016 |
20160070354 | Context-Aware Activation of Camera and Flashlight Modules - A system and method for gesture control of a mobile communications device link a single user gesture to a plurality of device functions or modules. One or more condition sensors associated with the device are then employed by the mobile communications device to disambiguate the gesture such that the device may then activate one of the plurality of device functions or modules. In this way, the user controls the activation of multiple functions or modules via use of a single gesture. | 03-10-2016 |
20160070355 | OBTAINING METRICS FOR A POSITION USING FRAMES CLASSIFIED BY AN ASSOCIATIVE MEMORY - A method for identifying a motion of interest of an individual. The method includes collecting, at a computer, motion sensor input data of motions of the individual from a motion sensor for an interval of time. The method further includes analyzing, using the computer, the motion sensor input data using an analysis application having a set of classified predetermined motions of interest. The analysis application classifies a movement captured during the interval of time as a motion corresponding to one of a plurality of pre-determined motions of interest based on shared relative attributes. The method further includes generating an output providing notice of an identified predetermined motion of interest to a monitoring system. | 03-10-2016 |
20160070356 | PHYSICALLY INTERACTIVE MANIFESTATION OF A VOLUMETRIC SPACE - A “PiMovs System” provides a “physically interactive manifestation of a volumetric space” (i.e., PiMovs). The perimeter of a geometric framework is wrapped with contiguous display surfaces to cover each section of the perimeter with adjacent display surfaces. Additional contiguous display surfaces may cover top and/or bottom surfaces of the framework, with some edges of those display surfaces also adjacent edges of display surfaces on the perimeter. Sensors track positions and natural user interface (NUI) inputs of users within a predetermined zone around the framework. A contiguous volumetric projection is generated and displayed over the framework via the display surfaces as a seamless wrapping across each edge of each adjacent display surface. This volumetric projection is then automatically adapted to tracked user positions and NUI inputs. | 03-10-2016 |
20160070357 | Parametric Inertia and APIs - Parametric inertia and API techniques are described. In one or more implementations, functionality is exposed via an application programming interface by an operating system of a computing device to one or more applications that is configured to calculate an effect of inertia for movement in a user interface. The calculated effect of inertia for the movement on the user interface is managed by the operating system based on one or more rest points specified using one or more parametric curves by the one or more applications via interaction with the application programming interface. | 03-10-2016 |
20160070359 | METHOD AND APPARATUS FOR DISTINGUISHING FEATURES IN DATA - To distinguish a region (e.g. hand) within a data set (e.g. digital image), data elements (e.g. pixels) representing a transition (e.g. hand outline) are identified. The direction toward the region is determined, for example using weighted direction matrices yielding a numerical maximum when aligned inward. A test element displaced one or more steps inward from the boundary element is tested against a standard for identifying the region. If the tested element meets the standard, that element is identified as part of the region. By examining data elements away from the transition, noise in the transition itself is avoided without altering the transition (e.g. by smoothing) while still only examining a linear data set (i.e. a contour or trace of the feature rather than a flooded interior thereof). The direction to the exterior of the region, an exterior contour, other features, and/or the transition also may be identified/followed. | 03-10-2016 |
20160070360 | METHOD AND APPARATUS FOR DISTINGUISHING FEATURES IN DATA - To distinguish a region (e.g. hand) within a data set (e.g. digital image), data elements (e.g. pixels) representing a transition (e.g. hand outline) are identified. The direction toward the region is determined, for example using weighted direction matrices yielding a numerical maximum when aligned inward. A test element displaced one or more steps inward from the boundary element is tested against a standard for identifying the region. If the tested element meets the standard, that element is identified as part of the region. By examining data elements away from the transition, noise in the transition itself is avoided without altering the transition (e.g. by smoothing) while still only examining a linear data set (i.e. a contour or trace of the feature rather than a flooded interior thereof). The direction to the exterior of the region, an exterior contour, other features, and/or the transition also may be identified/followed. | 03-10-2016 |
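The weighted direction matrices of entries 20160070359/20160070360 (yielding a numerical maximum when aligned inward) can be sketched with small 3x3 kernels scored against the region mask around a boundary pixel; the particular kernels below are illustrative assumptions:

```python
# Simplified sketch of the weighted-direction idea: score four candidate
# directions with weight matrices over the 3x3 neighborhood of a boundary
# pixel; the kernel aligned with the region scores highest. Kernels are
# illustrative, not from the applications.

KERNELS = {
    "up":    [[1, 2, 1], [0, 0, 0], [-1, -2, -1]],
    "down":  [[-1, -2, -1], [0, 0, 0], [1, 2, 1]],
    "left":  [[1, 0, -1], [2, 0, -2], [1, 0, -1]],
    "right": [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]],
}

def inward_direction(neighborhood):
    # neighborhood: 3x3 of 1 (region) / 0 (background) around a boundary pixel.
    def score(kernel):
        return sum(kernel[r][c] * neighborhood[r][c]
                   for r in range(3) for c in range(3))
    return max(KERNELS, key=lambda d: score(KERNELS[d]))

# Region fills the bottom rows, so "inward" points down.
print(inward_direction([[0, 0, 0], [1, 1, 1], [1, 1, 1]]))  # → down
```

A test element would then be sampled one or more steps along the returned direction, as the abstracts describe.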
20160070365 | VIRTUAL INPUT DEVICE SYSTEM - The described technology is directed towards virtual input devices that take application program-directed input from automation and/or remote devices, such as over a network, instead of via actual user input via a physical device, for example. This allows an automation framework to insert input into an application program, such as for automated testing without modifying any of the application's other components. The virtual input devices may be object instances or the like that receive their input from function calls based upon the type of input and output events, e.g., to simulate keyboard input/output (I/O), mouse or other pointer I/O, voice, gesture, and other command I/O, and so forth. | 03-10-2016 |
20160070366 | INTERACTIVE BOOK ELECTRONIC SYSTEM AND OPERATION METHOD THEREOF - An interactive electronic system has a screen and a three-axis compass sensor for detecting the flipping of the pages of a book, wherein the book comprises a plurality of pages, each having one or more magnets, and wherein the system is arranged to detect the flipping of pages from the reading of a single axis (X, Y, or Z) or from the magnitude of the vector calculated from the X, Y, and Z values. A method of interaction comprises: downloading the software of the book to the electronic device; putting the book, which contains the magnets, in a pre-determined area near the electronic device in a manner to identify the pages; calibrating the system; obtaining the values of the magnitude of the magnetic field from the compass sensor to detect the page that is open; and displaying the corresponding digital content in synchronization. | 03-10-2016 |
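The field-magnitude check in entry 20160070366 can be illustrated with a nearest-match against calibrated per-page magnitudes; the calibration table and thresholds are invented for the sketch:

```python
# Sketch under assumptions: each open page produces a characteristic
# magnetic field magnitude at the compass sensor, and the open page is
# the one with the nearest calibrated magnitude. Values are invented.

def field_magnitude(x, y, z):
    # Magnitude of the three-axis compass reading.
    return (x * x + y * y + z * z) ** 0.5

def detect_page(reading, calibration):
    # calibration: {page_number: expected magnitude from calibration}
    mag = field_magnitude(*reading)
    return min(calibration, key=lambda p: abs(calibration[p] - mag))

pages = {1: 120.0, 2: 90.0, 3: 60.0}
print(detect_page((50.0, 50.0, 50.0), pages))  # |B| ≈ 86.6 → page 2
```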
20160071235 | IMAGE PROCESSING DEVICE FOR DISPLAYING MOVING IMAGE AND IMAGE PROCESSING METHOD THEREOF - Methods and apparatus provide for obtaining a data sequence representative of a three-dimensional parameter space; forming a plurality of coding units by dividing the data sequence in three dimensions; and generating, for each of the plurality of coding units: (i) a palette defined by two representative values, and (ii) a plurality of indices, each index representing a respective original data point as a value, determined by linear interpolation, to be one of, or an intermediate value between, the representative values, and setting the palette and the plurality of indices for each of the coding units as compressed data. | 03-10-2016 |
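The two-representative-value palette of entry 20160071235 resembles block-palette texture compression; a minimal one-dimensional sketch follows, where the four interpolation levels and function names are assumptions for illustration:

```python
# Illustrative sketch: each coding unit stores its min/max representative
# values plus, per sample, the index of the nearest interpolated level.
# The 4-level choice is an assumption, not from the application.

def compress_unit(values, levels=4):
    lo, hi = min(values), max(values)
    palette = [lo + (hi - lo) * i / (levels - 1) for i in range(levels)]
    indices = [min(range(levels), key=lambda i: abs(palette[i] - v))
               for v in values]
    return (lo, hi), indices

def decompress_unit(reps, indices, levels=4):
    lo, hi = reps
    palette = [lo + (hi - lo) * i / (levels - 1) for i in range(levels)]
    return [palette[i] for i in indices]

reps, idx = compress_unit([0.0, 1.0, 2.0, 3.0])
print(decompress_unit(reps, idx))  # → [0.0, 1.0, 2.0, 3.0]
```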
20160071241 | Landscape Springboard - An electronic device with a display showing a user interface (UI) in different orientations relative to the display. In landscape orientation, the user interface includes a dock region displayed along the right or left edge of the user interface. Application icons or other UI objects that are arranged in a row on the dock region in portrait orientation are arranged in a column on the dock region in landscape orientation. UI objects and folders from other pages move from underneath the dock region onto the user interface. Furthermore, notification and control windows are overlaid on top of portions of the dock region in the landscape orientation of the user interface. | 03-10-2016 |
20160073073 | PORTABLE TERMINAL AND METHOD OF CONTROLLING THE SAME - A portable terminal including a projector which projects a user interface (UI) onto an object, and a method of controlling the same. The portable terminal includes a display which displays a first UI and at least one projector which projects a second UI onto an object, and the at least one projector includes a light source which displays the second UI through a plurality of organic light-emitting diodes (OLEDs) and a lens which focuses light generated in the plurality of OLEDs and projects the light onto the object. | 03-10-2016 |
20160073699 | STYLISH ARTICLES OF CLOTHING - Embodiments provide fashionable articles of women's clothing incorporating electronic components. | 03-17-2016 |
20160077337 | Managing Information Display - An example method includes receiving, by a head-mountable device (HMD), data corresponding to an information event, and providing an indication corresponding to the information event in response to receiving the data. The method further includes determining a gaze direction of an eye and determining that the gaze direction of the eye is an upward direction that corresponds to a location of a display of the HMD. The display is located in an upper periphery of a forward-looking field of view of the eye when the HMD is worn. The method further includes, in response to determining that the gaze direction of the eye is the upward direction, displaying graphical content related to the information event in the display. | 03-17-2016 |
20160077583 | SWITCH OPERATING DEVICE, MOBILE DEVICE AND METHOD FOR OPERATING A SWITCH BY A PRESENCE OF A PART EMITTING HEAT - Switch operating device ( | 03-17-2016 |
20160077584 | DISPLAY METHOD AND ELECTRONIC DEVICE - The disclosure provides a display method and an electronic device. The display method includes: detecting whether display contents are to be displayed via a display unit of the electronic device, determining a first region corresponding to a gazing position of a user on the display unit, determining a second region on the display unit based on the first region, and displaying the display contents in the second region. | 03-17-2016 |
20160077586 | INFORMATION PROCESSING DEVICE THAT HAS FUNCTION TO DETECT LINE OF SIGHT OF USER - An information processing device includes: a display screen; and a processor that executes a process. The process includes: detecting a line of sight of a user, determining a first region on the display screen in which first information indicating an input method using a movement of the user is displayed, in accordance with a line of sight position on the display screen of the detected line of sight, determining a second region in which second information indicating a trace that corresponds to the movement of the user for an input from the user is displayed, the second region being a region other than the first region and including a point located farther from the line of sight position than the first region, displaying the first information in the first region, detecting the input from the user, and displaying the second information in the second region. | 03-17-2016 |
20160077587 | SMART RING - The description relates to a smart ring. In one example, the smart ring can be configured to be worn on a first segment of a finger of a user. The example smart ring can include at least one flexion sensor secured to the smart ring in a manner that can detect a distance between the at least one flexion sensor and a second segment of the finger. The example smart ring can also include an input component configured to analyze signals from the at least one flexion sensor to detect a pose of the finger. | 03-17-2016 |
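The pose-detection step of entry 20160077587 (inferring finger pose from the flexion sensor's distance to the finger's second segment) can be reduced to a threshold sketch; the pose names and distance thresholds are invented for illustration:

```python
# Illustrative sketch: classify finger pose from the measured distance
# between the ring's flexion sensor and the finger's second segment.
# Thresholds and pose labels are assumptions.

def finger_pose(distance_mm):
    if distance_mm < 8.0:
        return "closed"      # finger curled toward the ring
    if distance_mm < 20.0:
        return "half_bent"
    return "extended"
```

In practice the analysis would likely be calibrated per user rather than using fixed thresholds.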
20160077595 | PORTABLE DEVICE PAIRING WITH A TRACKING SYSTEM - In embodiments of portable device pairing with a tracking system, a pairing system includes a portable device that generates device acceleration gesture data responsive to a series of motion gestures of the portable device. The pairing system also includes a tracking system that is configured for pairing with the portable device. The tracking system recognizes the series of motion gestures of the portable device and generates tracked object position gesture data. A pairing service can then determine that the series of motion gestures of the portable device corresponds to the series of motion gestures recognized by the tracking system, and communicate a pairing match notification to both the tracking system and the portable device to establish the pairing. | 03-17-2016 |
20160077596 | METHODS, SYSTEMS, AND APPARATUSES TO DISPLAY VISIBILITY CHANGES RESPONSIVE TO USER GESTURES - In one embodiment, an electronic device to be worn on a user's forearm includes a display and a set of one or more sensors that provide sensor data. In one aspect, a device may detect, using sensor data obtained from a set of sensors, that a first activity state of a user is active. The device may determine, while the first activity state is active, that the sensor data matches a watch check rule associated with the first activity state. Responsive to the detected match, the device may cause a change in visibility of the display. | 03-17-2016 |
20160077597 | INPUT DEVICE AND METHOD FOR INPUTTING OPERATIONAL REQUEST - The present application discloses an input device including a sensor configured to track movement of a body part of an operator and generate movement data about the movement of the body part, a processor including an operation command generator configured to generate an operation command from the movement data, a speed data generator configured to generate speed data representing a speed of the movement from the movement data, and a feedback determination portion configured to determine whether a feedback operation to allow the operator to confirm the operation command is required based on the speed data, and an operation portion including a feedback operation device configured to execute the feedback operation if the feedback determination portion determines that the feedback operation is required. | 03-17-2016 |
20160077598 | CLIENT DEVICE MOTION CONTROL VIA A VIDEO FEED - An approach is described for enabling motion control of a client device, such as a mobile device, via a video feed transmitted from one or more video capture devices. An associated method may include establishing, via a communications network, a communication session between a client device and one or more video capture devices. The method further may include identifying a user of the client device via the one or more video capture devices and negotiating parameters of a video feed for transmission from the one or more video capture devices to the client device via the communication session. The method further may include, upon transmission of the video feed from the one or more video capture devices to the client device, facilitating control of the client device in response to any device control gesture received from the user based upon the video feed. | 03-17-2016 |
20160077599 | USER INTERFACE WITH POSITION AWARENESS - A remote control device for controlling lighting systems includes a sensor configured to determine a location of the remote control device in relation to the lighting systems. A controller is configured to determine a nearest light source of the lighting systems relative to the location of the remote control device and to control this nearest light source. The controller is configured to change a configuration of the remote control device in response to changing its location. A transceiver transmits a signal to multiple light sources, which measure the strength and/or time of flight of this signal for use in determining the location of the remote control device. The light sources provide the remote control device with identifying information unique to each one of them, including their locations. | 03-17-2016 |
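The nearest-light-source selection in entry 20160077599 could, under a signal-strength reading, be sketched with a log-distance path-loss inversion; the model, its constants, and the light identifiers are assumptions for illustration:

```python
# Hedged sketch: each light source reports the received signal strength
# of the remote's beacon; a log-distance path-loss inversion picks the
# nearest light. Model constants and names are illustrative assumptions.

def rssi_to_distance(rssi_dbm, tx_power=-40.0, n=2.0):
    # Log-distance path-loss model: rssi = tx_power - 10*n*log10(d).
    return 10 ** ((tx_power - rssi_dbm) / (10 * n))

def nearest_light(reports):
    # reports: {light_id: measured RSSI in dBm}
    return min(reports, key=lambda lid: rssi_to_distance(reports[lid]))

print(nearest_light({"kitchen": -70.0, "hall": -55.0}))  # → hall
```

Time-of-flight measurements, also mentioned in the abstract, would replace the path-loss model with a direct distance estimate.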
20160077603 | METHOD AND SYSTEM FOR UNIFIED INPUT IN CROSS-PLATFORM STREAMING APPLICATIONS IN CLOUD COMPUTING ENVIRONMENTS - A computer program, method, and system for cross-platform input data unification. According to some embodiments, a first input data is received from a first input device connected to a first platform. The input data is converted to a standard action. The standard action is transmitted to a second platform capable of determining a relationship between the standard action and a second input device associated with the second platform. The second platform can use the input data to trigger an action associated with the standard action in the application or gaming environment. | 03-17-2016 |
20160077606 | METHOD AND APPARATUS FOR PROVIDING LOCK-SCREEN - A method and an apparatus for providing a lock-screen are provided. The method includes turning off a display unit, configuring content for the lock-screen in response to the turning off of the display unit, receiving a user interaction, based on at least one button, turning on the display unit in response to the user interaction, and displaying the lock-screen that is changed based on the content configured to correspond to the user interaction when the display unit is turned on. | 03-17-2016 |
20160077607 | ELECTRONIC DEVICE AND METHOD OF CONTROLLING DISPLAY OF SCREEN THEREOF - A method of controlling a display of a screen is provided. The method includes selecting a first image and a second image, selecting a masking pattern including at least one first area and at least one second area, measuring an inclination of the electronic device, and determining a part of the first image displayed in a screen corresponding to the at least one first area and a part of the second image displayed in a screen corresponding to the at least one second area, based on the measured inclination of the electronic device, and outputting the determined parts of the first image and the second image on a display unit. | 03-17-2016 |
20160077608 | RECOGNITION DEVICE, RECOGNITION METHOD, AND NON-TRANSITORY RECORDING MEDIUM - A recognition device includes an acquisition unit and a processor. The acquisition unit acquires first information and second information. The first information relates to a first signal corresponding to a state of a first portion of a body performing an action. The first signal is generated by a first element mounted to the first portion. The second information relates to a second signal corresponding to a state of a second portion. The second signal is generated by a second element mounted to the second portion. A relative positional relationship between the first portion and the second portion changes according to the action. The processor calculates a first feature based on the first information and a second feature based on the second information. The processor recognizes a type of the action based on a change of the first feature and a change of the second feature. | 03-17-2016 |
20160077623 | FACILITATING USER INPUT VIA HEAD-MOUNTED DISPLAY DEVICE AND ARM-MOUNTED PERIPHERAL DEVICE - An apparatus for performing data entry by a user includes a first electronic device configured to be attached to a person's head and including a display for viewing by the person; and a second electronic device configured to be attached to a person's forearm and used in combination with the first electronic device. The first and second electronic devices are configured to communicate wirelessly with each other, at least some of the wireless communications representing user input by the person for interfacing with a user interface displayed to the person on the display of the first electronic device, whereby data entry by the person is accomplished. The first electronic device is configured to wirelessly transmit data entered by the person to a computer system for electronic storage in a non-transitory computer readable medium. | 03-17-2016 |
20160077781 | TERMINAL APPARATUS AND OPERATION DESIGNATION METHOD - A terminal apparatus includes a position determination unit for determining a position of the terminal apparatus; an imaging unit for capturing an image; a direction determination unit for determining an imaging direction of the imaging unit; an identification unit for identifying an input/output apparatus captured by the imaging unit based on position information of the input/output apparatus, the position of the terminal apparatus, and the imaging direction of the imaging unit; an operation unit for accepting an operation of a user with respect to the image captured by the imaging unit including an image of the input/output apparatus displayed by a display unit; and a request processing unit for making a request to the input/output apparatus for an operation to be performed by the input/output apparatus based on the operation of the user with respect to the image captured by the imaging unit including the input/output apparatus. | 03-17-2016 |
20160078289 | Gesture Recognition Apparatuses, Methods and Systems for Human-Machine Interaction - The GESTURE RECOGNITION APPARATUSES, METHODS AND SYSTEMS FOR HUMAN-MACHINE INTERACTION (“GRA”) discloses vision-based gesture recognition. GRA can be implemented in any application involving tracking, detection and/or recognition of gestures or motion in general. Disclosed methods and systems consider a gestural vocabulary of a predefined number of user-specified static and/or dynamic hand gestures that are mapped to a database to convey messages. In one implementation, the disclosed systems and methods support gesture recognition by detecting and tracking body parts, such as arms, hands and fingers, and by performing spatio-temporal segmentation and recognition of the set of predefined gestures, based on data acquired by an RGBD sensor. In one implementation, a model of the hand is employed to detect hand and finger candidates. At a higher level, hand posture models are defined and serve as building blocks to recognize gestures. | 03-17-2016 |
20160078819 | LIQUID CRYSTAL DISPLAY DEVICE AND DRIVING METHOD THEREOF - A liquid crystal display device comprises a backlight unit including an optical assembly and configured to divide a liquid crystal display panel into a plurality of blocks and to illuminate each of the plurality of blocks; a location sensor configured to sense a location of a user watching the liquid crystal display panel; a backlight controller configured to output a dimming value corresponding to a brightness of each of the plurality of blocks according to a result of sensing by the location sensor; and a backlight driver configured to generate a driving current corresponding to the dimming value of each block and to provide the generated driving current to the backlight unit corresponding to each of the plurality of blocks. | 03-17-2016 |
20160078844 | ELECTRONIC DEVICE AND METHOD FOR CONTROLLING DISPLAY DEVICE OF THE ELECTRONIC DEVICE - A method of controlling a display device of an electronic device includes activating a camera lens of the electronic device. The camera lens is controlled to capture images at predetermined time intervals. The display device is controlled according to the captured images. | 03-17-2016 |
20160080720 | DISPLAY WITH EYE-DISCOMFORT REDUCTION - A display system includes a display arranged in sight of a viewer, a sensory subsystem configured to sense an ocular condition of the viewer, and a controller. Operatively coupled to the display and to the sensory subsystem, the controller is configured to adjust an operating parameter of the display system in response to the ocular condition, in order to prevent or relieve eye discomfort experienced by the viewer. | 03-17-2016 |
20160085268 | Protective Cover and Display Position Detection for a Flexible Display Screen - An information handling system includes a flexible display screen, first and second display platforms, and a coating. The flexible display screen includes first and second surfaces, and has a first hardness level. The flexible display screen is movably mounted to the first and second display platforms along the first surface. The coating is in physical communication with the second surface of the flexible display screen and includes a first region having a second hardness level greater than the first hardness level of the flexible display screen; and a second region having a third hardness level, the third hardness level being greater than the second hardness level. | 03-24-2016 |
20160085286 | Electronic Apparatus And Display Control Method - An electronic apparatus and a display control method are described. The electronic apparatus includes a first display unit having a first visible part for displaying a first image; a first detecting unit for detecting a first parameter for indicating a relative distance between a target object and the first visible part; and a processing unit for generating an image to be displayed and for controlling the display of the first display unit according to at least the first parameter. When the first display unit is in a first state, if the relative distance is less than or equal to a threshold distance according to the first parameter, the processing unit controls the first display unit to switch from the first state to a second state, and the power consumption of the first display unit in the first state is lower than a power consumption in the second state. | 03-24-2016 |
20160085295 | METHODS AND SYSTEMS FOR CALIBRATING USER DEVICES - Methods and systems are described herein for a media guidance application that improves the customization and calibration of user devices to a particular user. For example, in response to erroneously detecting (or failing to detect) a user input of a first type, the media guidance application may re-calibrate the user device based on subsequent corrective inputs issued using a second input type such that future user inputs of the first type will not be erroneously detected (or fail to be detected). | 03-24-2016 |
20160085296 | WEARABLE INPUT DEVICE - Various systems and methods for a wearable input device are described herein. A textile-based wearable system for providing user input to a device comprises a first sensor integrated into the textile-based wearable system, the first sensor to produce a first distortion value representing a distortion of the first sensor. The system also includes an interface module to detect the first distortion value, the distortion value measured with respect to an initial position, and transmit the first distortion value to the device, the device having a user interface, the user interface to be modified, responsive to receiving the first distortion value. | 03-24-2016 |
20160085299 | FACILITATING DYNAMIC EYE TORSION-BASED EYE TRACKING ON COMPUTING DEVICES - A mechanism is described for facilitating eye torsion-based accurate eye tracking on computing devices according to one embodiment. A method of embodiments, as described herein, includes determining a head pose representing a tilt of a head of a person in an image captured by a capturing device of one or more capturing/sensing devices. The image may illustrate one or more eyes of the person. The method may further include estimating a gravity vector relating to an eye of the one or more eyes based on data relating to the image and sensed by a sensing device of the one or more capturing/sensing devices, computing a total torsion angle based on one or more of the head pose and the gravity vector, and estimating a gaze vector associated with the eye to facilitate tracking of the eye. The tracking may include positions or movements of the eye based on the gaze vector. | 03-24-2016 |
20160085301 | DISPLAY VISIBILITY BASED ON EYE CONVERGENCE - Gaze information of a user can be determined by a computing device that analyzes images of the user. Gaze information of a user includes information such as the user's line of sight, point of regard information, the direction of the user's gaze, the depth of convergence of the user's gaze, and the like. The computing device is able to estimate the distance from the user at which the user is focusing (for example, at a screen near the user or at an object farther away). The visibility and display characteristics of objects displayed on a heads-up display (HUD) may be based on the gaze information. For example, content on a HUD on a windshield may be more transparent while the user is looking through the windshield and more opaque (or otherwise enhanced) while the user is focusing on the HUD. | 03-24-2016 |
20160085308 | ORIENTATION ADJUSTABLE MULTI-CHANNEL HAPTIC DEVICE - A system that generates haptic effects on a haptically-enabled device determines an orientation of the haptically-enabled device and obtains one or more haptic effect channels. The system then assigns each of the haptic effect channels to a haptic output device on the haptically-enabled device based on the orientation. | 03-24-2016 |
20160085313 | ELECTRONIC DEVICE, CONTROL SYSTEM AND CONTROL METHOD FOR SMART GLASS - A control method for smart glass is provided. The control method includes: recognizing a gesture in response to an input signal; comparing the gesture with a number of predefined gestures stored in the electronic device; determining whether the gesture is similar to one of the number of predefined gestures; obtaining the one of the number of predefined gestures when the gesture is similar to the one of the number of predefined gestures; submitting an instruction containing the one of the number of predefined gestures to a smart glass; analyzing the instruction to obtain the one of the number of predefined gestures; obtaining a control order matched with the one of the number of predefined gestures stored in the smart glass; and controlling a display unit of the smart glass to refresh contents according to the control order. | 03-24-2016 |
20160085317 | METHODS AND SYSTEMS FOR RECALIBRATING A USER DEVICE - Methods and systems are described herein for a media guidance application that enhances the precision of various types of user input interfaces. For example, the media guidance application may recalibrate a user input interface such that the user inputs are correctly received and executed. Furthermore, to further enhance precision, the media guidance application may base the recalibrations on the age of a user. | 03-24-2016 |
20160085319 | MOBILE TERMINAL AND METHOD FOR CONTROLLING THE SAME - A mobile terminal is provided that is capable of effectively displaying information to users by utilizing characteristics of a flexible display unit. The mobile terminal includes a flexible display unit configured to display image information on an entire screen; a sensing unit configured to sense a folded state of the flexible display unit; and a control unit configured to divide an entire screen of the flexible display unit into a plurality of screens based on a folded position, rearrange image information according to the plurality of divided screens, and display the rearranged image information on at least one of the plurality of divided screens. | 03-24-2016 |
20160085320 | Motion-Based View Scrolling System with Proportional and Dynamic Modes - The present invention provides a system and methods for motion-based scrolling of a relatively large contents view on an electronic device with a relatively small screen display. The user controls the scrolling by changing the device's tilt relative to a baseline tilt. The scrolling control can follow a Proportional Scroll mode, where the relative tilt directly controls the screen position over the contents view, or a Dynamic Scroll mode, where the relative tilt controls the scrolling speed. The present invention provides a criterion for automatically selecting the best scrolling mode when the dimensions of the contents view change. The baseline tilt is updated when the screen reaches an edge of the contents view to eliminate the creation of a non-responsive range of tilt changes when the user changes tilt direction. | 03-24-2016 |
20160085504 | Electronic Apparatus and Method for Controlling Response - An electronic apparatus is described and includes a first body; a second body; and at least two connecting devices, wherein the first body is connected with the second body in a slidable manner through the at least two connecting devices. The first body slides to a first state relative to the second body through the at least two connecting devices, and the first body slides to a second state relative to the second body through the at least two connecting devices. The first state is a first limit position of the sliding distance provided by the at least two connecting devices that the first body slides relative to the second body, and the second state is a second limit position of the sliding distance provided by the at least two connecting devices that the first body slides relative to the second body. | 03-24-2016 |
20160086570 | SINGLE USER INPUT MECHANISM FOR CONTROLLING ELECTRONIC DEVICE OPERATIONS - A unique input mechanism for controlling several operations of an electronic device is provided. Using the unique input mechanism, which may be the single input mechanism for providing user inputs to the electronic device, a user may provide different inputs or combinations of inputs to control different operations based on the current mode or capacity of the electronic device. For example, a single, short click of a button may control a media operation (e.g., play/pause) in a media mode, and the same input may control a telephony operation (e.g., initiate/terminate call) in a telephony mode. In some embodiments, different inputs may be associated with different types of operations. The unique input mechanism may include, for example, a button, a switch, a key, or an actuator. | 03-24-2016 |
20160089980 | DISPLAY CONTROL APPARATUS - A display control apparatus configured to control a display state of display objects on a display screen, is provided with: a detecting device configured to detect a trigger operation of a user and a gaze area of the user on the display screen, on the basis of an imaging result of an imaging device that images the user; a determining device configured to determine specified display objects corresponding to the detected trigger operation or the detected gaze area from among a plurality of display objects displayed on the display screen in a first display state; and a controlling device configured to control the display state to be a second display state in which the display state of the determined specified display objects is different from that in the first display state if the trigger operation is detected in the first display state. | 03-31-2016 |
20160090103 | VEHICLE INTERFACE INPUT RECEIVING METHOD - A vehicle interface input receiving method comprises operating a controller to selectively enter or refrain from entering a user input device controlling mode. While in the user input device controlling mode, the method determines a presence of first and second contact conditions at first and second predetermined locations, respectively, with the first contact condition being on a vehicle steering wheel, and controls a user input device to change from a deactivated state to an activated state in response to the simultaneous existence of the first and second contact conditions. Also, an indicator provides an indication of the activated state, and the user input device is maintained in the activated state while the first contact condition continues to exist. The user input device provides input to the controller in the activated state and refrains from providing such input in the deactivated state, and afterward returns to the deactivated state. | 03-31-2016 |
20160091965 | NATURAL MOTION-BASED CONTROL VIA WEARABLE AND MOBILE DEVICES - A “Natural Motion Controller” identifies various motions of one or more parts of a user's body to interact with electronic devices, thereby enabling various natural user interface (NUI) scenarios. The Natural Motion Controller constructs composite motion recognition windows by concatenating an adjustable number of sequential periods of inertial sensor data received from a plurality of separate sets of inertial sensors. Each of these separate sets of inertial sensors are coupled to, or otherwise provide sensor data relating to, a separate user worn, carried, or held mobile computing device. Each composite motion recognition window is then passed to a motion recognition model trained by one or more machine-based deep learning processes. This motion recognition model is then applied to the composite motion recognition windows to identify a sequence of one or more predefined motions. Identified motions are then used as the basis for triggering execution of one or more application commands. | 03-31-2016 |
20160091967 | Eye Gaze for Spoken Language Understanding in Multi-Modal Conversational Interactions - Improving accuracy in understanding and/or resolving references to visual elements in a visual context associated with a computerized conversational system is described. Techniques described herein leverage gaze input with gestures and/or speech input to improve spoken language understanding in computerized conversational systems. Leveraging gaze input and speech input improves spoken language understanding in conversational systems by improving the accuracy by which the system can resolve references—or interpret a user's intent—with respect to visual elements in a visual context. In at least one example, the techniques herein describe tracking gaze to generate gaze input, recognizing speech input, and extracting gaze features and lexical features from the user input. Based at least in part on the gaze features and lexical features, user utterances directed to visual elements in a visual context can be resolved. | 03-31-2016 |
20160091968 | INTERACTING WITH A DISPLAY POSITIONING SYSTEM - Display repositioning is provided. A viewer set is detected based, at least in part, on sensory data received from one or more sensors, wherein the viewer set includes one or more viewers of a display device. A location of the viewer set is determined relative to a screen of the display device based, at least in part, on the sensory data. A secondary interface element is presented via the screen, wherein the secondary interface element has a location based, at least in part, on the location of the viewer set and wherein the secondary interface element has a size based, at least in part, on the location of the viewer set. | 03-31-2016 |
20160091969 | Electronic Apparatus And Display Method - An electronic apparatus includes a main body, a fixing body, and a first display unit provided on the main body and/or the fixing body, wherein the first display unit has a first visible area and when the eyes of the viewer are a first distance away from the first visible area, a first image area of the image is perceived by the viewer, and when the eyes of the viewer are a second distance away from the first visible area, a second image area of the image is perceived by the viewer. The second image area includes the first image area and the first distance is larger than the second distance. A first content in the first image is displayed in the first image area, wherein the first content is perceived by the viewer when eyes of the viewer are the first distance away from the first visible area. | 03-31-2016 |
20160091970 | HEAD-MOUNTED DISPLAY APPARATUSES - A head-mounted display apparatus is disclosed. The apparatus includes a reflective microdisplay, a visible light source, an illumination optics unit, an imaging optics unit and an eye tracker module which includes an invisible light source and a sensor. The invisible light source emanates an invisible light beam which is subsequently received by the imaging optics unit and directed thereby into an eye of a user. The sensor receives the invisible light beam reflected back from the eye of the user and thereby captures an image of the eye, on the basis of which a position of the eye can be determined by calculation. The apparatus has the advantage of improved tracking accuracy and has no influence on the user at all. | 03-31-2016 |
20160091974 | Active Element Display Reorientation Method and Device - A method of repositioning screen elements from a first orientation to a second orientation, in which one or more active screen elements are rendered on the display area in the first orientation. In one embodiment, upon detection of a second orientation, the active screen elements are re-rendered on the display area to be displayed in the new orientation while the direction of movement of animation remains in the same direction relative to the display area. In another embodiment, the method may consist of maintaining a list of screen elements and a list of graphical features including a directionality indicator, where the directionality indicator points in a first direction in a first orientation and the same first direction in a second orientation. Re-rendering may be accomplished using a subset of HTML5 and a subset of CSS3 or its successors without having to use code native to the device containing the display area. | 03-31-2016 |
20160091976 | DYNAMIC HAND-GESTURE-BASED REGION OF INTEREST LOCALIZATION - A method, non-transitory computer-readable medium, and apparatus for localizing a region of interest using a dynamic hand gesture are disclosed. For example, the method captures the ego-centric video containing the dynamic hand gesture, analyzes a frame of the ego-centric video to detect pixels that correspond to a fingertip using a hand segmentation algorithm, analyzes temporally one or more frames of the ego-centric video to compute a path of the fingertip in the dynamic hand gesture, localizes the region of interest based on the path of the fingertip in the dynamic hand gesture and performs an action based on an object in the region of interest. | 03-31-2016 |
20160091978 | GESTURE RECOGNITION APPARATUS, VEHICLE HAVING THE SAME AND METHOD FOR CONTROLLING THE SAME - A vehicle includes a collector for collecting information about a subject, comprising a first detector for detecting movement of the collector; and a gesture recognition apparatus for recognizing a gesture based on the information on the subject. The gesture recognition apparatus includes a storage unit for storing an operating instruction for an electronic device which corresponds to the gesture information; and a controller for correcting the recognized gesture information based on information on the movement of the collector, and determining an operating instruction for an electronic device which corresponds to the corrected gesture information. | 03-31-2016 |
20160091979 | INTERACTIVE DISPLAYING METHOD, CONTROL METHOD AND SYSTEM FOR ACHIEVING DISPLAYING OF A HOLOGRAPHIC IMAGE - An interactive displaying method, a control method and an apparatus for achieving displaying of a holographic image are provided. The interactive displaying method comprises: scanning a 3D space by controlling signal transmitters arranged in a matrix to transmit signals; determining a target object in the 3D space; and determining a first position of the target object and displaying a viewable image corresponding to the target object at a second position of a display region. The control method comprises: scanning a 3D space by controlling signal transmitters arranged in a matrix to transmit signals; determining a gesture operation of a target object; determining the gesture; and executing a control operation corresponding to the gesture. The apparatus comprises a display unit, a signal detecting unit, a position determining unit and a display control unit. | 03-31-2016 |
20160091980 | MOTION AND GESTURE INPUT FROM A WEARABLE DEVICE - This relates to a device that detects a user's motion and gesture input through the movement of one or more of the user's hand, arm, wrist, and fingers, for example, to provide commands to the device or to other devices. The device can be attached to, resting on, or touching the user's wrist, ankle or other body part. One or more optical sensors, inertial sensors, mechanical contact sensors, and myoelectric sensors can detect movements of the user's body. Based on the detected movements, a user gesture can be determined. The device can interpret the gesture as an input command, and the device can perform an operation based on the input command. By detecting movements of the user's body and associating the movements with input commands, the device can receive user input commands through another means in addition to, or instead of, voice and touch input, for example. | 03-31-2016 |
20160091981 | CONFIGURABLE HUMAN-MACHINE INTERFACE - A configurable human-machine interface for controlling an electrical apparatus includes at least one permanent magnet rigidly connected to each utensil and a magnetometer array including N triaxial magnetometers, mechanically linked to each other without any degree of freedom to retain a known distance between each of the magnetometers, wherein N is a whole number greater than or equal to five, and a processing unit configured to: define, for each permanent magnet of a utensil, a value of at least one variable encoding a position or orientation of same in a three-dimensional reference system fixed without any degree of freedom to the array or the amplitude of the magnetic moment of same, from measurements of the magnetometers of the array, and automatically select a control law based on the value defined for the variable. | 03-31-2016 |
20160091983 | INPUT METHOD AND ELECTRONIC DEVICE - An input method and an electronic device are provided. The input method includes: detecting a presence of a first track set which corresponds to a first predetermined identifier, wherein the first track set comprises at least one track and is inputted into the electronic device; and if the first track set corresponds to the first predetermined identifier, associating a second track set with at least one second predetermined character that is related to the first predetermined identifier, wherein the second track set comprises at least one second track and is inputted into the electronic device after the first track set has been inputted. Character-recognition ability of the electronic device is improved, and input efficiency is increased. | 03-31-2016 |
20160091990 | USER TERMINAL DEVICE AND METHOD FOR CONTROLLING THE USER TERMINAL DEVICE THEREOF - A user terminal device and a control method are provided. The user terminal device includes a display, a sensor configured to sense a user interaction on the display, and a controller configured to, in response to the sensor sensing a touch made by an input device of a polyhedral shape that includes different touch patterns on each of a plurality of surfaces, control a function of the user terminal device according to a touch pattern on a touched surface among the plurality of surfaces. | 03-31-2016 |
20160092071 | GENERATE PREVIEW OF CONTENT - In an implementation, a display component can display a user interface. An input component can detect an input to the user interface. A controller can render an icon to launch an application and to generate a preview of the content related to the application. If a first input is detected by the input component, the application is launched. If a second input is detected by the input component, additional content related to the application is rendered. | 03-31-2016 |
20160092726 | USING GESTURES TO TRAIN HAND DETECTION IN EGO-CENTRIC VIDEO - A method, non-transitory computer readable medium, and apparatus for training hand detection in an ego-centric video are disclosed. For example, the method prompts a user to provide a hand gesture, captures the ego-centric video containing the hand gesture, analyzes the hand gesture in a frame of the ego-centric video to identify a set of pixels in the image corresponding to a hand region, generates a training set of features from the set of pixels that correspond to the hand region and trains a head-mounted video device to detect the hand in subsequently captured ego-centric video images based on the training set of features. | 03-31-2016 |
20160093081 | IMAGE DISPLAY METHOD PERFORMED BY DEVICE INCLUDING SWITCHABLE MIRROR AND THE DEVICE - A device including a display configured to display an object at an object display location on the display, the object being associated with information to be provided to a user, and to provide a reflected user image at a reflected user image location on the display; and a processor configured to detect the reflected user image location, and to determine the display location of the object based on the reflected user image location. | 03-31-2016 |
20160093113 | 3D HOLOGRAPHIC VIRTUAL OBJECT DISPLAY CONTROLLING METHOD BASED ON HUMAN-EYE TRACKING - A 3D holographic virtual object display controlling method and apparatus based on human-eye tracking are provided. The display controlling method comprises the following steps of: activating tracking of the human eyes of a user; tracking motions of the eyeballs of the user; controlling a 3D holographic virtual object presented in a display interface to rotate in response to the motions of the eyeballs of the user; and ending the tracking of the human eyes of the user. Thereby, the present disclosure can control rotation of a 3D holographic virtual object presented in a display interface by tracking the eyeballs and responding to their motions, which makes operation convenient and easy. | 03-31-2016 |
20160098083 | USE OF LIGHT TRANSMISSION THROUGH TISSUE TO DETECT FORCE - Various embodiments relate to apparatuses and methods of using light transmission through compressed living tissue to detect force. Transmission of light through living tissue such as a finger is affected by how much the tissue is compressed, for example by the finger pressing on a surface. Light is introduced into the tissue, passes through the tissue, and a sensor receives the light exiting the tissue. The compression of the tissue can be determined using various characteristics of the received light, such as the light intensity, as determined based at least partly on sensor readings. | 04-07-2016 |
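The entry above infers force from how much the received light intensity changes as tissue is compressed. A minimal sketch of that mapping in Python, using an interpolated calibration table; the function name, table layout, and linear-interpolation choice are illustrative assumptions, not the patent's specification:

```python
def estimate_force(intensity, calibration):
    """Map a received light intensity to an estimated compression force
    via a per-device calibration table of (intensity, force) pairs.
    Linear interpolation between points, clamped at the ends. The table
    is a hypothetical stand-in for real calibration data."""
    pts = sorted(calibration)
    if intensity <= pts[0][0]:
        return pts[0][1]
    if intensity >= pts[-1][0]:
        return pts[-1][1]
    for (i0, f0), (i1, f1) in zip(pts, pts[1:]):
        if i0 <= intensity <= i1:
            t = (intensity - i0) / (i1 - i0)
            return f0 + t * (f1 - f0)
```

A real device would populate the table from a per-user or factory calibration pass rather than the two-point example used here.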
20160098085 | Systems and Methods For Haptic Remote Control Gaming - Systems and methods for haptic remote control gaming are disclosed. In one embodiment a portable multifunction device receives information from a remotely controllable device. The portable multifunction device can be operable as a remote control for the remotely controllable device. The portable multifunction device may be a smartphone, a tablet computer, or another suitable electronic device. The portable multifunction device can determine a haptic effect based at least in part on the information received from the remotely controllable device. The portable multifunction device may generate a signal configured to cause an actuator to output the determined haptic effect. The portable multifunction device can output the signal. | 04-07-2016 |
20160098088 | HUMAN MACHINE INTERFACE APPARATUS FOR VEHICLE AND METHODS OF CONTROLLING THE SAME - A human machine interface (HMI) minimizing the number of gestures for operation control in which user-intended operation commands are accurately recognized by dividing a vehicle interior into a plurality of regions. The HMI receives an input of a gesture according to each region, and controls any one device according to the gesture. Convenience of a user is improved because the gesture may be performed in a state in which region restriction is minimized by identifying an operation state and an operation pattern of an electronic device designated according to each region to recognize the user's intention when the user performs the gesture in a boundary portion between two regions or even in multiple regions. | 04-07-2016 |
20160098089 | Non-Line-of-Sight Radar-Based Gesture Recognition - This document describes techniques and devices for non-line-of-sight radar-based gesture recognition. Through use of the techniques and devices described herein, users may control their devices through in-the-air gestures, even when those gestures are not within line-of-sight of their device's sensors. Thus, the techniques enable users to control their devices in many situations in which control is desired but conventional techniques do not permit effective control, such as to turn the temperature down in a room when the user is obscured from a thermostat's gesture sensor, turn up the volume on a media player when the user is in a different room than the media player, or pause a television program when the user's gesture is obscured by a chair, couch, or other obstruction. | 04-07-2016 |
20160098090 | KINETIC USER INTERFACE - A computerized kinetic control system comprising: a kinetic sensor; a display device; and a hardware processor configured to: (a) display, using said display device, a GUI (Graphic User Interface) menu comprising at least two options being disposed away from a center of said display device and at different polar angles relative to the center of said display device, (b) detect, using said kinetic sensor, motion of a limb of a user, and (c) select a first option of the at least two options, wherein the selecting is based on a correspondence between a direction of the motion detected and a polar angle of the first option relative to the center of the display device. | 04-07-2016 |
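The selection rule in the entry above matches the direction of a detected limb motion against the polar angles of the menu options. A minimal sketch of that correspondence test in Python; it assumes math-convention axes and a nearest-angle rule, neither of which the entry specifies:

```python
import math

def select_option(motion_vec, option_angles_deg):
    """Select the menu option whose polar angle (relative to the display
    center) is closest to the direction of the detected limb motion.
    Assumes x right, y up, angles measured counterclockwise from +x;
    the function name and tie-breaking are hypothetical."""
    dx, dy = motion_vec
    motion_angle = math.degrees(math.atan2(dy, dx)) % 360.0

    def angular_diff(a, b):
        # Smallest absolute difference between two angles on the circle.
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)

    # Index of the option with the smallest angular distance to the motion.
    return min(range(len(option_angles_deg)),
               key=lambda i: angular_diff(motion_angle, option_angles_deg[i]))
```

For options at 0, 90, and 180 degrees, a rightward motion selects the first option and an upward motion the second.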
20160098093 | DISPLAY APPARATUS AND METHOD FOR CONTROLLING THE SAME - A display apparatus is provided. The display apparatus includes a communication interface unit configured to receive eye gaze information of a user and user gesture information from an external apparatus; a display unit configured to display a preset screen; and a controlling unit configured to control the display unit so that a preset object is displayed on the preset screen if it is determined that an eye gaze of the user is directed toward the preset screen using the received eye gaze information and a user gesture corresponding to the received gesture information is sensed. | 04-07-2016 |
20160098094 | USER INTERFACE ENABLED BY 3D REVERSALS - A method and system are provided where a user can imitate a tap gesture with a fingertip in space without a physical surface to tap on and, if the trajectory of the fingertip is tracked in the course of movement, the tap gesture is performed by a 3D reversal. In a 3D reversal, the trajectory becomes substantially reversed. | 04-07-2016 |
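The "3D reversal" in the entry above can be detected by checking whether the direction of fingertip motion substantially flips between consecutive trajectory segments. A minimal sketch in Python; the cosine threshold and sampling assumptions are illustrative, not taken from the patent:

```python
import math

def is_tap_reversal(trajectory, cos_threshold=-0.8):
    """Return True if a tracked 3D fingertip trajectory (a list of
    (x, y, z) points) contains a substantial reversal of direction.
    cos_threshold is an assumed tuning parameter: -1.0 would demand a
    perfect 180-degree reversal."""
    def sub(p, q):
        return tuple(a - b for a, b in zip(p, q))

    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))

    # Displacement vectors between consecutive samples.
    segs = [sub(q, p) for p, q in zip(trajectory, trajectory[1:])]
    for u, v in zip(segs, segs[1:]):
        nu, nv = math.sqrt(dot(u, u)), math.sqrt(dot(v, v))
        if nu > 0 and nv > 0 and dot(u, v) / (nu * nv) <= cos_threshold:
            return True
    return False
```

A fingertip moving forward along an axis and then back retraces its path, which this test reports as a tap; a monotonic push does not.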
20160098095 | Deriving Input from Six Degrees of Freedom Interfaces - The present invention relates to interfaces and methods for producing input for software applications based on the absolute pose of an item manipulated or worn by a user in a three-dimensional environment. Absolute pose in the sense of the present invention means both the position and the orientation of the item as described in a stable frame defined in that three-dimensional environment. The invention describes how to recover the absolute pose with optical hardware and methods, and how to map at least one of the recovered absolute pose parameters to the three translational and three rotational degrees of freedom available to the item to generate useful input. The applications that can most benefit from the interfaces and methods of the invention involve 3D virtual spaces including augmented reality and mixed reality environments. | 04-07-2016 |
20160098160 | SENSOR-BASED INPUT SYSTEM FOR MOBILE DEVICES - A graphical user interface is displayed on a device display of a mobile device, by at least one data processor executing a display engine. The graphical user interface includes a first set of user-input elements capable of receiving user input defining a command to be performed by the mobile device. The display engine, executed by a data processor, receives sensor data from a sensor operatively connected to the mobile device. The sensor data corresponds to a user motion that is detected by the at least one sensor. The display engine, executed by a data processor, determines, based on the received sensor data, a second set of user-input elements to display on the graphical user interface. The second set of user-input elements is displayed on the graphical user interface by the display engine. | 04-07-2016 |
20160098969 | METHOD AND APPARATUS FOR GESTURE INTERACTION WITH A PHOTO-ACTIVE PAINTED SURFACE - A method and apparatus for gesture interaction with an image displayed on a painted wall is described. The method may include capturing image data of the image displayed on the painted wall and a user motion performed relative to the image. The method may also include analyzing the captured image data to determine a sequence of one or more physical movements of the user relative to the image displayed on the painted wall. The method may also include determining, based on the analysis, that the user motion is indicative of a gesture associated with the image displayed on the painted wall, and controlling a connected system in response to the gesture. | 04-07-2016 |
20160100158 | Display Device with Curved Display Function - A display device includes a flat-panel screen, an image capturing module, an eye tracking module, a screen rotating and tilting module, a flat-to-curve emulation module, and a flat-to-curve mechanical module. The display device employs the flat-to-curve emulation module to transform a rectangular flat image displayed by the flat-panel screen to a pincushion-like flat image according to a user's eye position; therefore, it may automatically provide the “emulated” curved image according to the user's eye position so that a user may perceive a visual effect like seeing a curved image. The display device further employs the flat-to-curve mechanical module to bend the flat-panel screen into a curved screen according to the user's eye position to bend the flat image displayed by the flat-panel screen into a curved image; therefore, it may automatically provide the “real” curved image according to the user's eye position to enhance the visual effect. | 04-07-2016 |
20160103483 | Methods to Pan, Zoom, Crop, and Proportionally Move on a Head Mountable Display - Methods, apparatus, and computer-readable media are described herein related to displaying and cropping viewable objects. A viewable object can be displayed on a display of a head-mountable device (HMD) configured with a hand-movement input device. The HMD can receive both head-movement data corresponding to head movements and hand-movement data from the hand-movement input device. The viewable object can be panned on the display based on the head-movement data. The viewable object can be zoomed on the display based on the hand-movement data. The HMD can receive an indication that navigation of the viewable object is complete. The HMD can determine whether a cropping mode is activated. After determining that the cropping mode is activated, the HMD can generate a cropped image of the viewable object on the display when navigation is complete. | 04-14-2016 |
20160103484 | GAZE TRACKING THROUGH EYEWEAR - A method to furnish input representing gaze direction in a computer system operatively coupled to a vision system. In this method, a first image of an eye at a first level of illumination is acquired by a camera of the vision system. The first image is obtained from the camera, and a second image of the eye corresponding to a second, different level of illumination is also obtained. Brightness of corresponding pixels of the first and second images is compared in order to distinguish a reflection of the illumination by the eye from a reflection of the illumination by eyewear. The input is then furnished based on the reflection of the illumination by the eye. | 04-14-2016 |
20160103486 | Method and Apparatus for Communication Between Humans and Devices - This invention relates to methods and apparatus for improving communications between humans and devices. The invention provides a method of modulating operation of a device, comprising: providing an attentive user interface for obtaining information about an attentive state of a user; and modulating operation of a device on the basis of the obtained information, wherein the operation that is modulated is initiated by the device. Preferably, the information about the user's attentive state is eye contact of the user with the device that is sensed by the attentive user interface. | 04-14-2016 |
20160103488 | HAPTICALLY-ENABLED DEFORMABLE DEVICE WITH RIGID COMPONENT - A device includes a flexible component, such as a display and a rigid component coupled to the flexible component. An input circuit coupled to the flexible component can detect deformation of the flexible component. A response module can provide haptic feedback or another response based on the deformation. In some devices, the rigid component can be made of rigid members attached by a flexible connection and coupled to a motor. The flexible connection can be altered by the motor to cause the flexible component to deform from one state to another. | 04-14-2016 |
20160103495 | TERMINAL DEVICE AND METHOD FOR CONTROLLING OPERATIONS - A device includes circuitry configured to acquire detection data from at least one sensor device corresponding to movement of the device. The circuitry is also configured to control at least one sensitivity detection mode of the at least one sensor device and determine a sampling rate for the at least one sensor device based on the at least one sensitivity detection mode. The circuitry is also configured to determine gestures performed by a user based on the detection data from the at least one sensor device and control operation of the device based on the gestures performed by the user. | 04-14-2016 |
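The entry above derives a sensor's sampling rate from its active sensitivity detection modes. One plausible policy, sketched in Python, is to take the highest rate any active mode requires; the mode names and rates below are assumed values, as the entry does not specify them:

```python
# Assumed mode names and rates -- the entry does not specify either.
SAMPLING_RATES_HZ = {"low": 10, "normal": 50, "high": 200}

def sampling_rate_for(active_modes):
    """Pick the sampling rate for a sensor given its active sensitivity
    detection modes: the highest rate any active mode demands, so that
    no mode is undersampled."""
    return max(SAMPLING_RATES_HZ[mode] for mode in active_modes)
```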
20160103497 | INFORMATION PROCESSING APPARATUS - An information processing apparatus is provided with a projection unit, a mirror unit that includes a mirror, and a detection unit. In an in-plane direction of the mirror, let a first direction be a projection line direction when an optical axis of the projection unit is projected onto the plane of the mirror, and in the in-plane direction of the mirror, let a second direction be a direction that is perpendicular to the first direction. The apparatus further includes a first supporting unit that supports the mirror; and a second supporting unit that supports the mirror. The first supporting unit and the second supporting unit are provided such that a primary natural frequency in the second direction is lower than a primary natural frequency in the first direction. | 04-14-2016 |
20160103498 | INFORMATION PROCESSING APPARATUS - An information processing apparatus includes a projection unit that projects an image, a projection mirror that reflects an image projected by the projection unit towards a projection surface, an image capturing unit that captures an image of a subject placed on the projection surface, and an image capturing mirror that is arranged in an image capturing optical path from the subject to the image capturing unit in order to capture an image of the subject placed on the projection surface. The projection unit and the image capturing unit are arranged below the projection mirror and the image capturing mirror. | 04-14-2016 |
20160103499 | SYSTEMS AND METHODS FOR DISTINGUISHING GESTURES - A gesture control system includes a processor, the processor in communication with a plurality of sensors. The processor is configured to perform the steps of detecting, using the plurality of sensors, a gesture in a volume occupied by a plurality of occupants, analyzing a prior knowledge to associate the gesture with one of the plurality of occupants, and generating an output, the output being determined by the gesture and the one of the plurality of occupants. | 04-14-2016 |
20160103501 | User-Directed Motion Gesture Control - Apparatuses and methods for user-directed motion gesture control are disclosed. According to aspects of the present disclosure, direct user inputs can be used to provide access control of a device. In some embodiments, a wearable mobile device may be configured to accept user commands, and be configured to sense a multitude of uses, use environments, and use contexts. The wearable mobile device may include a memory configured to store a set of reference access control motion gesture sequences, one or more sensors configured to sense a motion gesture sequence, and a controller configured to determine a valid access control motion gesture sequence using the motion gesture sequence and the set of reference access control motion gesture sequences. | 04-14-2016 |
20160103506 | INPUT DEVICE, METHOD FOR CONTROLLING INPUT DEVICE, AND NON-TRANSITORY COMPUTER-READABLE RECORDING MEDIUM - An input device includes a movement detector that detects a tilt of the input device during an input of information corresponding to a gesture operation, and a controller that executes a control to continue the input of information after the gesture operation, until the tilt detected by the movement detector becomes a predetermined status. | 04-14-2016 |
20160104051 | Smartlight Interaction System - The conference room automation apparatus employs a processor-based integrated movement sensor, lights, cameras, and a display device, such as a projector, that senses and interprets human movement within the room to control the projector in response to that movement and that captures events occurring in the room. Preferably packaged in a common integrated package, the apparatus employs a layered software/hardware architecture that may be readily extended as a platform to support additional third-party functionality. | 04-14-2016 |
20160104454 | ELECTRONIC DEVICE AND METHOD FOR ADJUSTING BRIGHTNESS OF DISPLAY DEVICE OF THE ELECTRONIC DEVICE - A method of adjusting a brightness of a display device of an electronic device includes obtaining data of a facial feature of a user of the electronic device. The brightness of the display device is adjusted when it is not suitable for the user, as determined by comparing the obtained data with preset data of the facial feature. | 04-14-2016 |
20160109781 | AUTO FOCUS DEVICE AND METHOD FOR LIQUID CRYSTAL DISPLAY - An auto focus device comprises a focus panel and a focus controller. The focus panel comprises liquid crystal between a first light-transmissive conductive film and a second light-transmissive conductive film, and the focus controller is configured to apply a voltage between the two light-transmissive conductive films at the position of at least one pixel, so that the liquid crystal at the position will have an expected focal length. An auto focus method comprises: acquiring information on eyesight status of a user; acquiring information on posture of the user; calculating expected focal length of the liquid crystal between the two light-transmissive conductive films at the position of at least one pixel according to the acquired user information; selecting a voltage to be applied between the two light-transmissive conductive films at the position according to the expected focal length; and applying the voltage between the two light-transmissive conductive films at the position. | 04-21-2016 |
20160109936 | DISPLAY CONTROL METHOD AND PROTECTIVE COVER IN ELECTRONIC DEVICE - An input control method and an electronic device are provided. The electronic device includes a display, a memory for storing data, and a processor functionally connected to the display and the memory. The processor detects a first input, determines whether a cover for the electronic device is covered on the display when the first input is detected, wherein the cover includes a first visual region and a second visual region, and displays first data corresponding to the first input through at least one of a first region and a second region of the display when the cover covers the display, wherein the first region is arranged in a location corresponding to the first visual region and the second region is arranged in a location corresponding to the second visual region. | 04-21-2016 |
20160109937 | METHOD AND APPARATUS FOR PROCESSING SCREEN USING DEVICE - A method and an apparatus for processing a screen by using a device are provided. The method includes obtaining, at the second device, a display screen displayed on the first device and information related to the display screen according to a screen display request regarding the first device, determining, at the second device, an additional screen based on the display screen on the first device and the information related to the display screen, and displaying the additional screen near the display screen on the first device. | 04-21-2016 |
20160109941 | SYSTEM AND METHOD FOR RECOMMENDING CONTENT TO A USER BASED ON USER'S INTEREST - Systems and methods for recommending content to a user based on the user's interests are described herein. In one example, the method comprises receiving at least one image of the user, and analyzing the at least one image to determine one or more facial attributes of the user. The method further comprises processing the at least one image to determine the gaze parameters of the user, determining based on the gaze parameters, an object of interest of the user and retrieving the characteristics of the object of interest. The method further comprises ascertaining, based on the facial attributes, an emotional index associated with the user, and generating recommendations of the content for the user based in part on the emotional index and characteristics of the object of interest. | 04-21-2016 |
20160109942 | SYSTEM, APPARATUS AND METHOD FOR DYNAMICALLY ADJUSTING A VIDEO PRESENTATION BASED UPON AGE - A system, method and apparatus are set forth which adjusts one or more of the brightness, vibrancy and color shift of displayed content based upon the at least approximate age of the viewer. | 04-21-2016 |
20160109946 | SYSTEMS AND METHODS FOR GAZE INPUT BASED DISMISSAL OF INFORMATION ON A DISPLAY - According to the invention, a method for dismissing information from a display device based on a gaze input is disclosed. The method may include determining that an object has been displayed on a display device. The method may also include determining an area on the display device associated with the object. The method may further include determining a gaze location of a user on the display device. The method may additionally include causing the object to not be displayed on the display device, based at least in part on the gaze location being located within the area for at least a first predetermined length of time. | 04-21-2016 |
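The dismissal rule in the entry above is a dwell-time test: hide the object once the gaze location has remained inside the object's area for a predetermined time. A minimal sketch in Python; the sample format and reset-on-exit behavior are illustrative assumptions:

```python
def should_dismiss(gaze_samples, area, dwell_s):
    """Return True once the gaze has stayed inside the object's area for
    at least dwell_s seconds. gaze_samples is a time-ordered list of
    (t, x, y); area is (x0, y0, x1, y1). Leaving the area resets the
    dwell timer. All names are illustrative."""
    start = None
    for t, x, y in gaze_samples:
        inside = area[0] <= x <= area[2] and area[1] <= y <= area[3]
        if inside:
            if start is None:
                start = t          # dwell starts at first in-area sample
            if t - start >= dwell_s:
                return True
        else:
            start = None           # gaze left the area: reset the timer
    return False
```

A live system would evaluate this incrementally per gaze sample rather than over a recorded list, but the timing logic is the same.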
20160109947 | SYSTEM FOR GAZE INTERACTION - A method and system for assisting a user when interacting with a graphical user interface combines gaze based input with gesture based user commands. A user of a computer system without a traditional touch-screen can interact with graphical user interfaces in a touch-screen like manner using a combination of gaze based input and gesture based user commands. A solution for touch-screen like interaction uses gaze input and gesture based input as a complement or an alternative to touch-screen interactions with a computer device having a touch-screen. Combined gaze and gesture based interaction with graphical user interfaces can be used to achieve a touchscreen like environment in computer systems without a traditional touchscreen, in computer systems having a touchscreen arranged in an ergonomically unfavorable position for the user, or in computer systems having a touchscreen arranged such that it is more comfortable for the user to use gesture and gaze for the interaction than the touchscreen. | 04-21-2016 |
20160109954 | Gesture Recognition Cloud Command Platform, System, Method, and Apparatus - Systems and methods described herein are for transmitting a command to a remote system. A processing system determines the identity of the user based on the unique identifier and the biometric information. Thereafter, a sensor detects a gesture performed by the user. The sensor is configured to detect the gesture performed by the user when the user is located within the detectable range of the wireless antenna. The processing system determines an action associated with the detected gesture based on the identity of the user and sends a command to a remote computer system to cause it to perform the action associated with the detected gesture. | 04-21-2016 |
20160109956 | Sensor and Tag to Determine a Relative Position - A computing device can include a first enclosure and a display enclosure. A tag can be on the first enclosure. A sensor on the display enclosure can detect the tag location and generate sensor data. A controller can determine from the sensor data the relative position between the display enclosure and the first enclosure. | 04-21-2016 |
20160109958 | WEARABLE DEVICE AND METHOD OF TRANSMITTING CONTENT - A wearable device includes a communication unit configured to communicate with a host device, a sensing unit configured to recognize at least one motion, and a control unit configured to control the host device to transmit to an external device first content mapped to a first motion recognized by the sensing unit. | 04-21-2016 |
20160109961 | SYSTEMS, METHODS, APPARATUSES, COMPUTER READABLE MEDIUM FOR CONTROLLING ELECTRONIC DEVICES - This application includes disclosure of methods, systems, apparatuses as well as principles/algorithms that can be implemented on computer readable medium, for defining user gestures, interpreting user actions, communicating and confirming user intent when communicating with electronic devices. A system for controlling an electronic device by a user is disclosed that includes a microprocessor and a communication link. The microprocessor runs control software for receiving a first signal indicative of a first action of the user, receives a second signal indicative of motion or position of a part of the user's body, and generates a command signal for the electronic device based on a user gesture performed by the user. The communication link communicates the command signal to the electronic device. | 04-21-2016 |
20160110150 | DISPLAYING THE DESKTOP UPON DEVICE OPEN - Systems and methods are provided for displaying a desktop for a multi-screen device in response to opening the device. The window stack can change based on the change in the orientation of the device. The system can receive an orientation change that transitions the device from a closed state to an open state. A window previously created in the stack can expand over the area of the two or more displays comprising the device when opened. A desktop expands to fill the display area and is displayed on the second of the displays after the device is opened. | 04-21-2016 |
20160110881 | MOTION TRACKING DEVICE CONTROL SYSTEMS AND METHODS - Provided herein are systems and methods for controlling displacement of a location indicator on the display of a controlled computing device in response to movements of a remote controlling object. Various embodiments may further cause the controlled computing device to perform other operations, broadly referred to as command operations, in response to movements of the controlling object. The command operations are distinguishable from the displacement operations. Various embodiments may further distinguish between movement of the controlling object intended to correspond to a displacement operation and movement of the controlling object intended to correspond to a command operation. | 04-21-2016 |
20160111066 | SEGMENT DISPLAY DEVICE - A segment display device includes first and second operation input units; a segment display unit; a storage unit that stores information displayable on the segment display unit in a display information table including a plurality of cells specified by a plurality of rows and columns; and a display control unit that displays information read from the storage unit on the segment display unit. Every time an operation is input to the first operation input unit, the display control unit selects and displays the information in each cell of the display information table in order in a first direction along the columns of the table; every time an operation is input to the second operation input unit, the information in each cell of the display information table is selected in order in a second direction along the rows of the table. | 04-21-2016 |
20160112688 | BOUNDLESS PROJECTED INTERACTIVE VIRTUAL DESKTOP - A method for creating a boundless projected interactive virtual desktop, wherein the interactive virtual desktop comprises an adjustable image of a projected portion of an area associated with at least one desktop of a computing device is provided. The method may include integrating a projector and a motion sensor into a device. The method may also include capturing at least one of a location, a change in location, a change in direction, or a change in orientation associated with the device. The method may include computing a projected image. The method may also include coordinating the computed projected image across at least one application running in the device. The method may further include projecting a view of a portion of an area associated with the coordinated projected image, wherein the projected view comprises an interactive virtual desktop. The method may additionally include adjusting the projected view based on a criteria. | 04-21-2016 |
20160116592 | OPEN LOOP CORRECTION FOR OPTICAL PROXIMITY DETECTORS - An optical proximity detector includes a driver, light detector, analog front-end, sensor(s) that sense correction factor(s) (e.g., temperature, supply voltage and/or forward voltage drop), and a digital back end. The driver drives the light source to emit light. The light detector produces a light detection signal indicative of a magnitude and a phase of a portion of the emitted light that reflects off an object and is incident on the light detector. The analog front-end receives the light detection signal and outputs a digital light detection signal, or digital in-phase and quadrature-phase signals, which are provided to the digital back-end. The digital back-end performs closed loop correction(s) for dynamic variation(s) in gain and/or phase caused by a portion of the analog front-end, uses polynomial equation(s) and sensed correction factor(s) to perform open loop correction(s) for dynamic variations in temperature, supply voltage and/or forward voltage drop, and outputs a distance value. | 04-28-2016 |
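The open-loop portion of the entry above amounts to evaluating a calibration polynomial in a sensed correction factor (e.g., temperature) and removing its contribution from the measurement. A minimal sketch in Python; the function name, coefficient layout, and subtract-from-phase convention are assumptions for illustration, not the patent's specification:

```python
def open_loop_phase_correction(raw_phase, temp_c, coeffs):
    """Apply an open-loop correction: evaluate a calibration polynomial
    in the sensed temperature and subtract its value from the raw phase
    measurement. coeffs[i] is the coefficient of temp_c**i."""
    correction = sum(c * temp_c ** i for i, c in enumerate(coeffs))
    return raw_phase - correction
```

The same form would apply per correction factor (supply voltage, forward voltage drop), each with its own fitted coefficients.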
20160116742 | HEAD WORN DISPLAYING DEVICE EMPLOYING MOBILE PHONE - A head worn displaying device of the disclosure includes an optical reflective lens, a display screen and a cover. The optical lens and the display screen are mounted under the cover, and the cover may be similar to a hat tongue located ahead of the forehead of a user. The display screen can be integrated in a mobile phone, and a holder such as a clamper set is used to clamp the mobile phone. The angle of the optical lens can be adjusted, for example, by using a sliding groove and a magnetic hinge which are located at a front end of the hat tongue. The magnetic hinge slides in the sliding groove to adjust a distance between the display screen and the optical lens. The light rays of the display screen will be reflected by the optical lens to form virtual magnified images with reduced distortions for both eyes at the same time, and the optical lens optically combines the reflected virtual images with external environmental light rays transmitted through the optical lens to form the augmented reality for the eyes. | 04-28-2016 |
20160116977 | SYSTEMS AND METHODS FOR USE AT A VEHICLE INCLUDING AN EYE TRACKING DEVICE - Systems and methods for a vehicle including an eye tracking device. The systems and methods use input from the eye tracking device. The systems and methods are configured to communicate with a driver based on input from the eye tracking device. | 04-28-2016 |
20160116978 | Systems And Methods For Presenting Location Related Information - Systems and methods for presenting location related information after a user arrives at a place. In an aspect, when a user gazes at a display of a standby device, information presentation begins. In another aspect, when a user shakes a standby device and then gazes at it, a presentation starts. Location related info may be sorted and presented by the pointing direction of the device. | 04-28-2016 |
20160116979 | EYE GLINT IMAGING IN SEE-THROUGH COMPUTER DISPLAY SYSTEMS - Disclosure herein concerns a method that includes illuminating a user's eye with an illumination source in a head-worn display, capturing an image of the user's eye with an eye camera in the head-worn display, wherein the image includes an eye glint produced by light from the illumination source that is reflected from a surface of the user's eye, determining a size of an eye glint in the captured image, and identifying a change in focus distance for the user's eye in correspondence with a change in the size of the eye glint. | 04-28-2016 |
20160116982 | TERMINAL AND OPERATING METHOD THEREOF - Provided is an operating method of a terminal. The method includes: capturing an image for an object; obtaining movement information on a movement of the object; and storing the obtained movement information corresponding to the captured image. | 04-28-2016 |
20160116983 | USER INPUT METHOD FOR USE IN PORTABLE DEVICE USING VIRTUAL INPUT AREA - A portable device and user input method of the portable device using a virtual input area is provided. The portable device may include a sensor configured to sense a user input to an input area, the input area being at least a portion of an area adjacent to the portable device, a determiner configured to determine a target object corresponding to the user input among at least one input object displayed on the portable device, based on an arrangement of the at least one input object, and a controller configured to generate a control command to control the target object. | 04-28-2016 |
20160116984 | GESTURE BASED CONTROL USING THREE-DIMENSIONAL INFORMATION EXTRACTED OVER AN EXTENDED DEPTH OF FIELD - Systems and methods are described for gesture-based control using three-dimensional information extracted over an extended depth of field. The system comprises a plurality of optical detectors coupled to at least one processor. The optical detectors image a body. At least two optical detectors of the plurality of optical detectors comprise wavefront coding cameras. The processor automatically detects a gesture of the body, wherein the gesture comprises an instantaneous state of the body. The detecting comprises aggregating gesture data of the gesture at an instant in time. The gesture data includes focus-resolved data of the body within a depth of field of the imaging system. The processor translates the gesture to a gesture signal, and uses the gesture signal to control a component coupled to the processor. | 04-28-2016 |
20160116985 | UNIVERSAL TRANSLATOR FOR RECOGNIZING NONSTANDARD GESTURES - A system and method to project gesture patterns of gestural behavior designed for existing gesture systems to those exhibited by persons with limited upper limb mobility, such as quadriplegics due to spinal cord injury (SCI), hemiplegics due to stroke, and persons with other types of disabilities. The system acquires a plurality of gesture instances from a gesture sensor, maps the plurality of gesture instances, determines a union amongst the plurality of gesture instances to thereby acquire a plurality of trajectory points, encodes the plurality of trajectory points into a feature vector, extracts a plurality of features from the feature vector, normalizes the plurality of features, determines at least one transform function from the plurality of features, and generates constrained gestures from the at least one transform function to form at least one gesture set. | 04-28-2016 |
20160116987 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD AND PROGRAM TO RECOGNIZE AN OBJECT FROM A CAPTURED IMAGE - There is provided an information processing apparatus, which includes an input unit and a control unit. The input unit is configured to serially input an image captured by a capturing device. The control unit is configured to detect a user's hand from the input image and to recognize a position, a posture and a size of the hand while it moves. A predetermined area of a surface on which the hand is moved is defined as an operation area based on the recognized position, posture and size of the hand. A virtual three-dimensional object for operation by the user is produced and disposed on the operation area. Also, an information processing method and a program are provided for the information processing apparatus. | 04-28-2016 |
20160116988 | HAND-WORN DEVICE FOR SURFACE GESTURE INPUT - Embodiments that relate to energy efficient gesture input on a surface are disclosed. One disclosed embodiment provides a hand-worn device that may include a microphone configured to capture an audio input and generate an audio signal, an accelerometer configured to capture a motion input and generate an accelerometer signal, and a controller comprising a processor and memory. The controller may be configured to detect a wake-up motion input based on the accelerometer signal. The controller may wake from a low-power sleep mode in which the accelerometer is turned on and the microphone is turned off and enter a user interaction interpretation mode in which the microphone is turned on. Then, the controller may contemporaneously receive the audio signal and the accelerometer signal and decode strokes. Finally, the controller may detect a period of inactivity based on the audio signal and return to the low-power sleep mode. | 04-28-2016 |
20160116989 | Portable Device Pairing with a Tracking System - In embodiments of portable device pairing with a tracking system, a pairing system includes a portable device that generates device acceleration gesture data responsive to a series of motion gestures of the portable device. The pairing system also includes a tracking system that is configured for pairing with the portable device. The tracking system recognizes the series of motion gestures of the portable device and generates tracked object position gesture data. A pairing service can then determine that the series of motion gestures of the portable device corresponds to the series of motion gestures recognized by the tracking system, and communicate a pairing match notification to both the tracking system and the portable device to establish the pairing. | 04-28-2016 |
20160116990 | DEPTH DETERMINING METHOD AND DEPTH DETERMINING DEVICE OF OPERATING BODY - A locating method, a locating device, a depth determining method, and a depth determining device of an operating body are provided. The locating method includes following steps: deriving an image that includes an operating body; scanning the image transversely according to each of scan lines of the image, and deriving each of width values of the operating body in each of the scan lines according to the bright dot information in each of the scan lines of the image; calculating a relative variation level between each of the width values in the adjacent scan line sequentially; when the relative variation level is less than a threshold, determining a locating point of the operating body according to one of the scan lines and one of the width values corresponding to the relative variation level. | 04-28-2016 |
20160117962 | DISPLAY APPARATUS AND DISPLAY METHOD THEREOF - A display apparatus and method thereof are provided. The display apparatus includes a bendable display, and a controller configured to control the display to be bent to a preset angle based on a number of users of the display apparatus. | 04-28-2016 |
20160121006 | HANDHELD ELECTRONIC DEVICE AND ANTIBACTERIAL METHOD OF THE SAME - A handheld electronic device having an antibacterial mechanism is provided. The handheld electronic device includes a main body, a display panel and a processing module. The display panel is disposed at a first surface of the main body and includes a plurality of display light sources and at least one antibacterial light source. The display light sources generate a display light. The antibacterial light source generates an antibacterial light. The processing module is coupled to the display light sources and the antibacterial light source. The processing module is operated to activate the display light sources to generate the display light during a display period and to activate the antibacterial light source to generate the antibacterial light during an antibacterial period. | 05-05-2016 |
20160123817 | TEMPERATURE SENSOR USING ON-GLASS DIODES - This disclosure provides systems, methods and apparatus for measuring a temperature of a display. In one aspect, a circuit may use one or more stages of diodes or diode-connected transistors providing the functionality of diodes. Each stage may include the functionality of diodes in opposite directions. A direct current (DC) current source or an alternating current (AC) voltage source may be applied to the diodes or diode-connected transistors to measure the temperature of the display. | 05-05-2016 |
20160124498 | METHOD FOR PERFORMING FUNCTION USING SENSOR DATA AND ELECTRONIC DEVICE FOR PROVIDING SAME - An electronic device and method utilize an external sensor group to facilitate miniaturization of the device and repair/replacement of external sensors. An interface is connected to an external sensor package including at least one sensor. When the external sensor package is connected through the interface, a processor determines which of a plurality of pre-configured groups the external sensor package belongs to and controls the performance of a function corresponding to the determined group. | 05-05-2016 |
20160124500 | WATCH TYPE CONTROL DEVICE - A watch type control device comprises: a band configured to surround at least a portion of a user's wrist; an IR module disposed on a side of the band facing the user's hand when the watch type control device is worn on the user's wrist, the user's hand and wrist being on the same side of the user's body; and a beam projector disposed on the band and configured to outwardly project a screen, wherein the IR module is configured to recognize a user's hand gesture and to initiate the execution of a program causing the beam projector to output on the screen a display corresponding to the program. | 05-05-2016 |
20160124501 | SECURED MOBILE MAINTENANCE AND OPERATOR SYSTEM INCLUDING WEARABLE AUGMENTED REALITY INTERFACE, VOICE COMMAND INTERFACE, AND VISUAL RECOGNITION SYSTEMS AND RELATED METHODS - Secured remote maintenance, configuration management, and systems engineering apparatuses and methods including wearable augmented reality (AR) interface systems are provided. Embodiments can support secure and remote configuration setting changes (CSC) to a system or subsystem of interest (SSoI) using a head mounted device (HMD), a camera, a visual interface section, and a processing section including a plurality of processing instructions operable to operate a command input (CI) interface using one or more user CIs (e.g. voice or motion/gesture input recognition) and a secure user authentication system. HMD machine vision and pattern recognition systems visually identify an SSoI, display a 3D model(s) of the SSoI on the HMD's visual interface using the AR interface system, obtain a plurality of SSoI data, and display one or more of the SSoI data in relation to the 3D model on the HMD visual interface to support various tasks including CI-directed CSCs. | 05-05-2016 |
20160124503 | DISPLAY DEVICE AND DISPLAY METHOD THAT DETERMINES INTENTION OR STATUS OF A USER - The present invention provides a display apparatus and a display method for realizing control for display operations by a user precisely reflecting the user's status, i.e., the user's intentions, visual state and physical conditions. Worn as an eyeglass-like or head-mount wearable unit for example, the display apparatus of the present invention enables the user to recognize visibly various images on the display unit positioned in front of the user's eyes thereby providing the picked up images, reproduced images, and received images. As control for various display operations such as switching between the display state and the see-through state, display operation mode and selecting sources, the display apparatus of the present invention acquires information about either behavior or physical status of the user, and determines either intention or status of the user in accordance with the acquired information, thereby controlling the display operation appropriately on the basis of the determination result. | 05-05-2016 |
20160124504 | WEARABLE DEVICE AND METHOD OF CONTROLLING THEREFOR - A method of controlling a wearable device according to one embodiment of the present specification can include the steps of displaying content on a display unit of the wearable device, sensing a tilt angle of the wearable device, and providing a control interface for controlling the content. The step of providing the control interface can include the steps of mapping the control interface to the ground based on the sensed tilt angle and a state of the wearable device and displaying the mapped control interface on the display unit. | 05-05-2016 |
20160124505 | OPERATING AN ELECTRONIC PERSONAL DISPLAY USING EYE MOVEMENT TRACKING - An electronic personal display is operated using a camera of an electronic personal display to track a user's eye movement. Based on the tracking, the user's gaze is correlated with a selectable region of the electronic personal display. Responsive to the gaze being correlated with the selectable region for at least a predetermined time, an operation of the electronic personal display is implemented wherein the operation is associated with the selectable region. Various embodiments do not require any external device, such as eye wear, as a part of tracking the user's eye movement. | 05-05-2016 |
20160124506 | ELECTRONIC DEVICE AND METHOD FOR CONTROLLING EXTERNAL OBJECT - Methods and apparatuses are provided for controlling an external object by an electronic device. A line of sight of a user is determined using an image sensor of the electronic device. An object located outside of the electronic device is determined based on the line of sight of the user. Object information regarding the object is determined. A user input with respect to the object is received from the user. The object or another electronic device associated with the object is controlled based on the user input and the object information. | 05-05-2016 |
20160124522 | ELECTRONIC DEVICE, METHOD, STORAGE MEDIUM - According to one embodiment, an electronic device includes a circuitry. The circuitry is configured to detect an inclination of the electronic device. The circuitry is configured to detect, as a basic position, a position of the electronic device assumed when a user performs a first operation. The circuitry is configured to release a lock on the electronic device when the inclination of the electronic device is a first inclination with respect to the basic position. | 05-05-2016 |
20160124523 | ELECTRONIC DEVICE, DISPLAY DEVICE, AND METHOD FOR CONTROLLING THE SAME - An electronic device, a display device, and a method for controlling the same are provided. The method for controlling an electronic device includes communicating with a display device that displays content; sensing a motion of the electronic device, and determining whether the electronic device is in an alignment state for providing a content service from the display device based on the result of motion sensing; transmitting an event signal that includes alignment state information to the display device if the electronic device is in the alignment state; and displaying content information about the content that is received from the display device. Accordingly, the electronic device can receive the content information related to the content that is being displayed on the display device more intuitively and quickly. | 05-05-2016 |
20160124579 | CONTROLLING MULTIPLE DEVICES WITH A WEARABLE INPUT DEVICE - Embodiments include an electronic input system, which has a wearable input device having an accelerometer and circuitry configured for an operation detection mechanism and an input communication mechanism. The operation detection mechanism acquires acceleration data from the accelerometer to detect an input event. The electronic input system also has a first electronic device having circuitry for a first communication mechanism, a first event analyzer, a screen display, and a first data communication mechanism. The electronic input system also has a second electronic device having circuitry configured for a second communication mechanism, a second event analyzer, a second data communication mechanism, a screen combining mechanism, and a screen display. Embodiments also include a method of communicating an event between an input device and one or more electronic portable devices, such as the first electronic device and the second electronic device. | 05-05-2016 |
20160124707 | Facilitating Interaction between Users and their Environments Using a Headset having Input Mechanisms - A headset is described herein for presenting audio information to the user as the user interacts with a space, e.g., as when a user navigates over a route within a space. The headset may include a set of input mechanisms for receiving commands from the user. The commands, in turn, invoke respective space-interaction-related functions to be performed by a space interaction (SI) module. The headset may operate with or without a separate user computing device. | 05-05-2016 |
20160131902 | SYSTEM FOR AUTOMATIC EYE TRACKING CALIBRATION OF HEAD MOUNTED DISPLAY DEVICE - A method of automatically calibrating a head mounted display for a user is disclosed. The method includes automatically calculating an inter-pupillary distance value for the user, comparing the automatically calculated inter-pupillary distance value to a previously determined inter-pupillary distance value, determining if the automatically calculated inter-pupillary distance value matches the previously determined inter-pupillary distance value, and automatically calibrating the head mounted display using calibration data associated with the matching previously determined inter-pupillary distance value. | 05-12-2016 |
20160132102 | ELECTRONIC APPARATUS AND METHOD OF DETECTING TAP OPERATION - To provide an electronic apparatus, a method of detecting a tap operation, etc. for performing appropriate detection processing of a tap operation. The electronic apparatus includes a setting unit | 05-12-2016 |
20160132108 | ADAPTIVE MEDIA FILE REWIND - An addressable device receives a user-characterized rewind description. The addressable device stores the user-characterized rewind description. The addressable device renders at least one media file to include a resume point of the at least one media file. The addressable device receives a command to preferentially rewind. The addressable device, responsive to receiving the command to preferentially rewind, re-renders the at least one media file according to the user-characterized rewind description such that the display shows the at least one media file at a replay point that is at least the user-characterized rewind description prior to the resume point. | 05-12-2016 |
20160132109 | Command glove - A glove embedded with sensors for interfacing with functions of an electronic device is disclosed. The invention allows for manipulation of the basic controls of an electronic device with one hand and without looking at the device or the control apparatus. | 05-12-2016 |
20160132111 | METHOD OF DETECTING USER INPUT IN A 3D SPACE AND A 3D INPUT SYSTEM EMPLOYING SAME - A 3D input system and an angle encoder are disclosed. The 3D input system comprises a computing device and one or more position sensing gloves. The position sensing glove comprises a plurality of angle encoders each installed thereon at a location about a finger joint. An inertial measurement unit (IMU) is installed on the glove. A firmware uses data from the angle encoders and IMU to calculate fingertip positions in a 3D space. The firmware generates keystrokes on a virtual keyboard based on the fingertip positions. The angle encoder comprises first and second components rotatable with respect to each other, and an encoder pattern comprising codewords for indicating the angle between the first and second components. The encoder pattern comprises a set of base encoder channels coded with a conventional Gray code, and a set of Booster channels for improving the resolution of angle measurement. | 05-12-2016 |
20160132114 | HAPTIC TRIGGER MODIFICATION SYSTEM - A system is provided that modifies a haptic effect experienced at a user input element. The system sends a haptic instruction and a haptic effect definition to a peripheral device. The system further receives user input data including a position of the user input element, or a force applied to the user input element. The system further modifies the haptic effect definition based on the received user input data. The system further sends a new haptic instruction and the modified haptic effect definition to the peripheral device. The system further causes a haptic output device to modify a haptic effect based on the modified haptic effect definition at the user input element of the peripheral device in response to the new haptic instruction. | 05-12-2016 |
20160132116 | HAPTIC CONTROLLER - An advanced haptic gamepad is provided. A controller having a plurality of surfaces, and a haptic output device located within its housing and coupled to an isolated deformable region disposed at one of the plurality of surfaces is provided. The isolated deformable region expands and contracts in response to the haptic output device. In addition, a controller having a plurality of isolated surface regions, and a plurality of haptic output devices located within its housing and coupled to respective isolated surface regions is provided. Each of the isolated surface regions is configured to provide localized haptic effects. | 05-12-2016 |
20160132121 | INPUT DEVICE AND DETECTION METHOD - An input device includes: a memory configured to store a length of a finger; and a processor configured to obtain a depth image that includes a hand and has pixel values corresponding to a distance from a camera to subjects including the hand, detect a hand area that corresponds to the hand from among the subjects, from the depth image, identify a tip end of the hand area and a base of the finger, identify a first three-dimensional position that corresponds to the tip end and a second three-dimensional position that corresponds to the base, based on the pixel values of the depth image, identify a direction from the second three-dimensional position to the first three-dimensional position, calculate a third three-dimensional position that corresponds to a fingertip of the finger based on the direction and the length, and generate an input signal according to the third three-dimensional position. | 05-12-2016 |
20160132124 | GESTURE DETERMINATION APPARATUS AND METHOD, GESTURE OPERATION APPARATUS, PROGRAM, AND RECORDING MEDIUM - A hand region (Rh) of an operator is detected from a captured image, and positions of a palm center (Po) and a wrist center (Wo) are determined, and origin coordinates (Cho) and a direction of a coordinate axis (Chu) of a hand coordinate system are calculated. A shape of the hand in the hand region (Rh) is detected using the hand coordinate system, and a shape feature quantity (D | 05-12-2016 |
20160132125 | SYSTEM AND METHOD FOR GENERATING GESTURES - Precise analysis and description of movement and gestures by a human or other object are carried out by a computing device assisted by camera and distance sensors. The same gestures and movements of an object in different locations and in different orientations may be identified according to coordinates and offsets applied to coordinates, and gesture-making areas projected so as to surround the object. The techniques can accurately identify actions of movements of the human or other object. | 05-12-2016 |
20160132126 | SYSTEM FOR INFORMATION TRANSMISSION IN A MOTOR VEHICLE - A system for information transmission in a motor vehicle and methods of operation are disclosed. A steering wheel, a dashboard with a cover, a dashboard display area with a dashboard display, and a display device arranged in the area of a windshield are provided in a passenger compartment of the motor vehicle. The system is designed for gesture recognition and comprises a gesture recognition unit with at least one gesture recognition sensor. The gesture recognition sensor is configured to detect movements in a perceivable gesture area. The gesture recognition sensor is arranged in the viewing direction of a vehicle driver, behind the steering wheel, under the cover in the dashboard display area. The dashboard display and the display device are designed for the representation of interactive menus. | 05-12-2016 |
20160132127 | Method and Device for Determining User Input on Basis of Visual Information on User's Fingernails or Toenails - According to one aspect of the present invention, there is provided a method comprising the steps of: acquiring first information on an action performed or a form made by a user and second information on the user's fingernails or toenails, wherein the second information is visual information; and determining the user's input by determining a first element of the user's input on the basis of the first information and determining a second element of the input on the basis of the second information. | 05-12-2016 |
20160133221 | Gaze Driven Display Front of Screen Performance - An information handling system includes a display, a gaze detector that determines a location on the display that corresponds to where a user is looking at the display, and a processor. The processor receives the location from the gaze detector, determines a non-power-reduced portion of the display that includes the location, determines a power-reduced portion of the display that is exclusive of the non-power-reduced portion, receives data associated with a pixel of an image included in the power-reduced portion, changes the data such that a first power level consumed by the display when displaying the pixel with the changed data is less than a second power level consumed by the display when displaying the pixel with the unchanged data, and sends the changed data to the display. | 05-12-2016 |
20160133226 | SYSTEM AND METHOD FOR MULTI-DISPLAY - Disclosed herein are a multi-display system and method. The system may include a plurality of displays for displaying an image and a wireless image transmission device which is connected to the plurality of displays through wireless communication, and which searches for a connectable display among the plurality of displays, receives a signal transmitted from the searched display, and detects a display for displaying the image among the plurality of displays based on the strength of the received signal. | 05-12-2016 |
20160137064 | TOUCH INPUT DEVICE AND VEHICLE INCLUDING THE SAME - A touch input device includes a touch unit to which a user is able to input a touch gesture, wherein the touch unit includes a concave shape, and gradually deepens from an edge portion toward a central portion. | 05-19-2016 |
20160139268 | VISUAL DISPLAY WITH ILLUMINATORS FOR GAZE TRACKING - A visual display includes hidden reference illuminators adapted to emit invisible light for generating corneo-scleral reflections on an eye watching a screen surface of the display. The tracking of such reflections and the pupil center provides input to gaze tracking. A method for equipping an LCD with a reference illuminator is also provided. Also provided are a system and method for determining a gaze point of an eye watching a visual display that includes reference illuminators. The determination of the gaze point may be based on an ellipsoidal cornea model. | 05-19-2016 |
20160139659 | DISPLAY APPARATUS AND CONTROL METHOD THEREOF - A display apparatus installed in a predetermined installation surface, includes: a display configured to display an image; a sensing module configured to include a circuit portion generating a wireless transmission signal, a transmitter being in electrical contact with the circuit portion and transmitting the wireless transmission signal from the circuit portion to an external object to be sensed, and a receiver being in contact with the circuit portion and receiving a wireless reception signal reflected from the object to be sensed; and at least one processor configured to determine that the object to be sensed is moving if a change in amplitude of the wireless transmission signal and the wireless reception signal in the sensing module is higher than a preset first threshold and a phase difference between the wireless transmission signal and the wireless reception signal is higher than a preset second threshold, and perform a preset corresponding signal process in accordance with the determination results. | 05-19-2016 |
20160139660 | MODULAR APPARATUS AND SYSTEM FOR RECONFIGURABLE USER INPUTS - The present disclosure relates to a modular apparatus and system for providing customized, reconfigurable user inputs. In an aspect, there is provided a modular apparatus comprising a plurality of reconfigurable Input Modules with different types of user inputs, such as buttons, sliders, knobs, joysticks, trackballs, touch pads, touch screens, and other types of user interfaces. The Input Modules may be physically interconnected to a Master Module which is adapted to communicate with each Input Module and to a System Controller Application running on a connected computing device. The Input Modules are reconfigurable into any number of different physical layouts. The Master Module determines the physical layout of the connected Input Modules, and communicates the layout to the System Controller Application. The function of each Input Module is then programmed via the System Controller Application and the Master Module for performing specific functions in a compatible computer application. | 05-19-2016 |
20160139662 | CONTROLLING A VISUAL DEVICE BASED ON A PROXIMITY BETWEEN A USER AND THE VISUAL DEVICE - A visual device may be configured to control a display of the visual device based on a proximity to a user of the visual device. Accordingly, the visual device receives an input associated with a user of the visual device. The visual device determines an identity of the user of the visual device based on the input associated with the user. The visual device configures the visual device based on the identity of the user. The visual device determines that the visual device configured based on the identity of the user is located at an impermissible distance from a portion of the face of the user. The visual device causes a display of the visual device to interrupt presentation of a user interface based on the determining that the visual device is located at the impermissible distance from the portion of the face of the user. | 05-19-2016 |
20160139663 | SYSTEM AND METHOD FOR CONTROL BASED ON FACE OR HAND GESTURE DETECTION - System and method for control using face detection or hand gesture detection algorithms in a captured image. Based on the existence of a detected human face or a hand gesture in an image captured by a digital camera (still or video), a control signal is generated, and provided to a device. The control may provide power or disconnect power supply to the device (or part of the device circuits). Further, the location of the detected face in the image may be used to rotate a display screen horizontally, vertically, or both, to achieve a better line of sight with a viewing person. If two or more faces are detected, the average location is calculated and used for line of sight correction. A linear feedback control loop is implemented wherein detected face deviation from the optimum is the error to be corrected by rotating the display to the required angular position. A hand gesture detection can be used as a replacement to a remote control wherein the various hand gestures controls the various function of the controlled unit, such as a television set. | 05-19-2016 |
20160139664 | LINE-OF-SIGHT PROCESSING METHOD, LINE-OF-SIGHT PROCESSING SYSTEM AND WEARABLE DEVICE - The present disclosure provides a line-of-sight (LOS) processing method, an LOS processing system and a wearable device. The LOS processing method includes steps of: collecting a head deflection angle and an eyeball deflection angle of a user; determining a position of the user's LOS at a display interface in accordance with the head deflection angle and the eyeball deflection angle of the user; acquiring action information of the user's eyeball; identifying an action command corresponding to the action information of the user's eyeball; and performing a corresponding operation on a display image at the position of the user's LOS in accordance with the identified action command. | 05-19-2016 |
20160139665 | DYNAMIC EYE TRACKING CALIBRATION - A user of a computing device may interact with and control objects and applications displayed on the computing device through the user's eye movement. Detected gaze locations are correlated with actions performed by the user and compared with typical gaze locations for those actions. Based on differences between the detected and expected gaze locations, the eye tracking system can be recalibrated. An area around a gaze location encompassing a set of likely active locations can be enlarged, effectively prompting the user to interact with the desired active location again. The enlarging of the area serves to separate the active locations on the screen, reducing the probability of interpreting the user's gaze incorrectly. | 05-19-2016 |
20160139667 | Input device, information processing device, and input method - To provide a device that efficiently selects desired data out of a large number of selectable data. | 05-19-2016 |
20160139673 | ROTATING DISPLAY CONTENT RESPONSIVE TO A ROTATIONAL GESTURE OF A BODY PART - A method and a system for rotating content of an electronic display responsive to rotational human gestures are provided herein. The system that implements the method may include: a display; at least one capturing device configured to capture a body part in front of said display; a processor; a rotational gesture recognition module executed by the processor and configured to detect a predefined rotational gesture and to generate an instruction to rotate the content of the display by approximately 90° only when the rotational gesture displacement goes beyond a predefined threshold. | 05-19-2016 |
20160139674 | INFORMATION PROCESSING DEVICE, IMAGE PROJECTION DEVICE, AND INFORMATION PROCESSING METHOD - An information processing device includes a storage and a circuit. The storage stores first distance information indicating a distance between a reference point and a first surface and first shape information indicating a surface shape of the first surface. The circuit acquires second distance information indicating a distance between the reference point and an object. The circuit determines if an operation inputted by a user is present or absent by using the second distance information. The circuit determines whether a first condition is satisfied or not by comparing the first shape information with second shape information. The circuit updates the first distance information based on the second distance information when the operation is determined to be absent and the first condition is determined to be satisfied. | 05-19-2016 |
20160139675 | RECOGNITION DEVICE, METHOD, AND STORAGE MEDIUM - A recognition device according to an embodiment described herein includes a hardware processor. The hardware processor generates a trajectory of a pointer based at least in part on information relating to a movement of the pointer, detects that the pointer contacts an operation surface, determines a detection time when the pointer contacts the operation surface, and determines a detection position of the pointer at the detection time, sets the detection position as a starting point, determines an ending point corresponding to the starting point based on at least the detection time or the detection position, and sets the trajectory from the starting point to the ending point as a target trajectory, and determines a command input by a user with a gesture of the pointer based on the target trajectory. | 05-19-2016 |
20160139676 | SYSTEM AND/OR METHOD FOR PROCESSING THREE DIMENSIONAL IMAGES - The subject matter disclosed herein relates to a method and/or system for projection of images to appear to an observer as one or more three-dimensional images. | 05-19-2016 |
20160139698 | DYNAMIC TACTILE INTERFACE AND METHODS - A dynamic tactile interface includes: a substrate including a first transparent material and defining an attachment surface, an open channel opposite the attachment surface, and a fluid conduit intersecting the open channel and passing through the attachment surface; a tactile layer including a second transparent material and defining a tactile surface, a peripheral region bonded to the attachment surface opposite the tactile surface, and a deformable region adjacent the fluid conduit and disconnected from the attachment surface; a closing panel bonded to the substrate opposite the attachment surface and enclosing the open channel to define a fluid channel; a working fluid; and a displacement device configured to displace the working fluid into the fluid channel and through the fluid conduit to transition the deformable region from a retracted setting to an expanded setting. | 05-19-2016 |
20160139757 | DATA CORRECTION APPARATUS, DISPLAY DEVICE HAVING THE DATA CORRECTION APPARATUS, AND DATA CORRECTION METHOD - A data correction apparatus includes a pattern detector, a scroll detector, and a data processor. The pattern detector detects a predetermined pattern area in raw frame data and outputs pattern data corresponding to the predetermined pattern area. The scroll detector detects a scroll operation based on user manipulation data and outputs scroll data corresponding to the scroll operation. The data processor outputs corrected frame data or non-corrected frame data based on the raw frame data, the pattern data, and the scroll data. The corrected frame data includes first area data and second area data. The first area data is identical to a counterpart of the raw frame data, and the second area data is obtained by correcting a counterpart of the raw frame data. The non-corrected frame data is identical to the raw frame data. | 05-19-2016 |
20160139762 | ALIGNING GAZE AND POINTING DIRECTIONS - A system is provided herein, comprising a gaze tracking device arranged to detect a direction of a user's gaze; a three dimensional (3D) imaging device arranged to identify a user's pointing finger and a corresponding finger pointing direction; and a processor arranged to compare the detected gaze direction and the identified pointing direction and indicate an alignment therebetween. The system provides a natural user interface which may be used to interact with virtual or actual objects as well as facilitate interaction between communicating users. | 05-19-2016 |
20160140384 | GESTURE RECOGNITION METHOD AND GESTURE RECOGNITION APPARATUS USING THE SAME - A gesture recognition method and a gesture recognition apparatus using the same method are proposed. The method includes the following steps: transmitting an infrared signal, and detecting whether an object appears in a predetermined range based on a first infrared signal retrieved from the infrared signal being reflected; when the object appears in the predetermined range, capturing a plurality of images including the object and a plurality of second infrared signals retrieved from the infrared signal being reflected; and recognizing the images to retrieve a horizontal movement direction of a gesture of the object within the predetermined range, and recognizing the second infrared signals to retrieve a depth movement direction of the gesture within the predetermined range. | 05-19-2016 |
20160140934 | AUTOMATED PERSONALIZED PICTURE FRAME - A digital picture frame including a camera integrated with the frame, and a network connection module allowing the frame to be used as a Mobile Positional Social Media (MPSM) device for displaying social media pictures from a user's social media account or her or his community members' social media accounts. The integrated camera is used to automatically determine an identity of a frame viewer, and a viewer profile is automatically determined from the identity of the viewer. The displayed photos are automatically shown and/or changed according to the detected viewers. | 05-19-2016 |
20160147294 | Apparatus and Method for Recognizing Motion in Spatial Interaction - Disclosed is a method of recognizing, by a terminal, a motion of a user. The method includes receiving, by a terminal, a plurality of motion inputs, determining an intended motion input from among the plurality of motion inputs received, and performing terminal control corresponding to the intended motion input determined from among the plurality of motion inputs received. Determining the intended motion input includes determining, as the intended motion input, a motion input which is input for a first time after a motion input which is in a stationary state for at least a predetermined time is received. | 05-26-2016 |
20160147296 | METHOD FOR CONTROLLING IMAGE DISPLAY AND APPARATUS SUPPORTING SAME - The disclosure relates to a method of controlling an image display by an electronic device and an apparatus thereof. A method for controlling an image display by an electronic device according to various examples includes: displaying an image; checking a display state of the image; detecting a movement of the electronic device; when the movement of the electronic device is detected, determining a movement variance based on the movement; adjusting a display portion variance based on the display state and the movement variance; and displaying an image having the changed display portion based on the display portion variance. | 05-26-2016 |
20160147297 | PRESENTATION OF DATA ON AN AT LEAST PARTIALLY TRANSPARENT DISPLAY BASED ON USER FOCUS - In one aspect, a device includes a processor, at least one at least partially transparent display accessible to the processor, at least one eye sensor accessible to the processor, and a memory accessible to the processor. The memory bears instructions executable by the processor to present content at a first location on the at least partially transparent display and, based at least in part on data from the eye sensor, determine whether a user is looking one of at least substantially at the first location and past the first location. The instructions are also executable to, in response to a determination that the user is looking past the first location, remove the content from presentation at the first location. | 05-26-2016 |
20160147300 | Supporting Activation of Function of Device - A first apparatus determines whether a user can be assumed to be looking at a device by evaluating data captured by at least one sensor, wherein the apparatus is physically unconnected to the device. In case it is determined that a user can be assumed to be looking at the device, the apparatus causes a transmission of a notification to a second apparatus via a wireless link, as a criterion for the second apparatus to activate a function of the device. The second apparatus monitors whether such a notification is received via a wireless link. The second apparatus activates a predetermined function of the device in case it is determined that such a notification has been received. | 05-26-2016 |
20160147302 | DISPLAY DEVICE AND METHOD OF CONTROLLING THE SAME - The present invention relates to a display device that is capable of being head-mounted and a method of controlling the display device. Provided is a display device including a main body formed in such a manner that the main body can be head-mounted, a display unit arranged in a position that corresponds to left and right eyes configured to display an indicator indicating an occurrence of an event in an external device, and a controller configured to display video information that corresponds to the event that occurs in the external device to the display unit when it is determined that a wearer stares at the indicator for a predetermined time or more. | 05-26-2016 |
20160147303 | Nanotechnology Clothing For Human-Computer Interaction - The present invention discloses nanotechnology clothing in the form of a glove, shirt, pants or suit that can be worn to track the motion of different parts of a user's body. This tracking is utilized to provide the computer system with an immediate input representing an interaction with a computer application or a 3D simulation of the user's body motion. The present invention is used with computers, mobile phones, and head-mounted computer displays serving a variety of gaming, entertainment, sports and medical applications. | 05-26-2016 |
20160147308 | THREE DIMENSIONAL USER INTERFACE - A method of providing a three dimensional (3D) user interface including receiving a user input at least partly from within an input space of the 3D user interface, the input space being associated with a display space of a 3D scene, evaluating the user input relative to the 3D scene, and altering the 3D scene based on the user input. A system for providing a three dimensional (3D) user interface including a unit for displaying a 3D scene in a 3D display space, a unit for tracking 3D coordinates of an input object in a 3D input space, a computer for receiving the coordinates of the input object in the 3D input space, and translating the coordinates of the input object in the 3D input space to a user input, and altering the display of the 3D scene based on the user input. Related apparatus and methods are also described. | 05-26-2016 |
20160147309 | Systems and Methods for Performing Multi-Touch Operations on a Head-Mountable Device - Embodiments described herein may provide a configuration of input interfaces used to perform multi-touch operations. An example device may involve: (a) a housing arranged on a head-mountable device, (b) a first input interface arranged on either a superior or an inferior surface of the housing, (c) a second input interface arranged on a surface of the housing that is opposite to the first input interface, and (d) a control system configured to: (1) receive first input data from the first input interface, where the first input data corresponds to a first input action, and in response, cause a camera to perform a first operation in accordance with the first input action, and (2) receive second input data from the second input interface, where the second input data corresponds to a second input action(s) on the second input interface, and in response, cause the camera to perform a second operation. | 05-26-2016 |
20160148558 | PERSONAL DISPLAY SYSTEMS - A personal display system may include an LED panel having a controller, a plurality of displayable patterns being stored in a memory of the controller. The controller may be in communication with an information device, such as a smart phone, such that a communication from the information device causes the controller to display a selected one of the displayable patterns on the LED panel. Communication from the information device may be carried out using gestures. | 05-26-2016 |
20160154459 | Transparent Display Field of View Region Determination | 06-02-2016 |
20160154460 | Gaze Initiated Interaction Technique | 06-02-2016 |
20160154467 | SWITCH OPERATING DEVICE, MOBILE DEVICE AND METHOD FOR OPERATING A SWITCH BY A NON-TACTILE PUSH-GESTURE | 06-02-2016 |
20160154468 | Intraoral User Interface | 06-02-2016 |
20160154473 | ELECTRONIC APPARATUS AND METHOD | 06-02-2016 |
20160154475 | OPTICAL PROXIMITY SENSOR AND ASSOCIATED USER INTERFACE | 06-02-2016 |
20160154476 | APPARATUSES, METHODS AND COMPUTER PROGRAMS FOR REMOTE CONTROL | 06-02-2016 |
20160154519 | METHOD AND SYSTEM FOR CONTROLLING DEVICE | 06-02-2016 |
20160155412 | ELECTRONIC APPARATUS AND METHOD OF CONTROLLING ELECTRONIC APPARATUS | 06-02-2016 |
20160155418 | Terminal and Input Method | 06-02-2016 |
20160155420 | ELECTRONIC APPARATUS AND CONTROLLING METHOD THEREOF | 06-02-2016 |
20160156896 | APPARATUS FOR RECOGNIZING PUPILLARY DISTANCE FOR 3D DISPLAY | 06-02-2016 |
20160162019 | PORTABLE HEALTHCARE DEVICE AND METHOD OF OPERATING THE SAME - A portable healthcare device and a method of operating the same are provided. The portable healthcare device detects biometric information of a user; obtains health state information of the user from the biometric information; and projects an image of the health state information, on a projection surface, in parallel with a reference axis, regardless of an orientation angle of the portable healthcare device. | 06-09-2016 |
20160162022 | WEARABLE WIRELESS HMI DEVICE - A wearable gesture control interface apparatus is used to control a controllable device based on gestures provided by a user. The wearable gesture control interface apparatus includes (i) sensors configured to detect user orientation and movement and generate corresponding sensor data and (ii) a microcontroller configured to: sample the sensor data from the sensors; determine whether the sensor data from one of the sensors meets transmission criteria; and if the sensor data meets the transmission criteria, transmit control data corresponding to all of the sensors to the controllable device. | 06-09-2016 |
20160162024 | VISUALLY ENHANCED TACTILE FEEDBACK - In an approach for visually enhancing tactile metadata, a computer receives an image on a first computing device. The computer selects an object from one or more objects depicted within the received image. The computer determines boundaries of the selected object. The computer assigns an object tag to the selected object within the determined boundaries, wherein the assigned object tag includes one or more keywords and terms describing the selected object. The computer assigns tactile metadata to the selected object within the determined boundaries based on one or more physical properties associated with the assigned object tag. The computer creates a visually enhanced image based on the assigned tactile metadata, wherein the assigned tactile metadata includes one or more physical properties associated with the assigned object tag capable of being represented visually. | 06-09-2016 |
20160162025 | SYSTEMS AND METHODS FOR CONTROLLING HAPTIC SIGNALS - Systems and methods for controlling a haptic output device include a processor, a haptic peripheral including a haptic output device, and a sensor coupled to the haptic output device. The haptic output device is configured to receive a control signal from the processor and output a haptic effect having a profile to the haptic peripheral in response to the control signal from the processor. The sensor is configured to sense a current operational status of the haptic output device. The processor is configured to generate the control signal for the haptic output device depending on a plurality of inputs including a desired haptic effect waveform and a signal received from the sensor. The inputs may also include at least one parameter of the haptic output device. As such, the control signal causes the profile of the haptic effect to substantially match the desired haptic effect waveform. | 06-09-2016 |
20160162034 | INFORMATION PROCESSING METHOD AND ELECTRONIC DEVICE - The disclosure provides an information processing method and an electronic device using the same. The method comprises: when the electronic device is in a first operation mode in which there is a first angle between a first body and a second body of the electronic device, acquiring first operation information of an operating object, the first operation information including at least a first angle; acquiring a predefined correspondence relationship set; matching the acquired first angle with a first relationship in the relationship set to obtain a first matching result; and outputting the first matching result. The disclosure also provides an electronic device. | 06-09-2016 |
20160162037 | APPARATUS FOR GESTURE RECOGNITION, VEHICLE INCLUDING THE SAME, AND METHOD FOR GESTURE RECOGNITION - A gesture recognition apparatus may execute a command by recognizing a user's gesture. The gesture recognition apparatus includes a gesture sensor that detects a position and movement of an object in space, and a cover that includes a contact surface which is positioned away from the gesture sensor by a predetermined distance and is brought into contact with the object. | 06-09-2016 |
20160162038 | Method For Performing Operation On Intelligent Wearing Device By Using Gesture, And Intelligent Wearing Device - A method for performing an operation on an intelligent wearing device by using a gesture, and an intelligent wearing device, are presented. First, an operation gesture is identified according to an operation gesture signal. Then, a control instruction is determined according to the identified operation gesture. Finally, the intelligent wearing device is controlled, according to the control instruction, to perform a corresponding operation. The intelligent wearing device is controlled, according to an operation gesture of a user, to perform a corresponding operation, which ensures accuracy of performing, by the user, the operation on the intelligent wearing device. | 06-09-2016 |
20160162043 | REQUEST FOR CONFIRMATION OF ACTION TO BE TAKEN AFTER DETECTION OF ACTIVITY USING SENSOR DATA OF WIRELESS COMMUNICATION DEVICE (WCD) - Various embodiments of a wireless communication device (WCD) that can be transported by a user are provided. One embodiment comprises a user interface that enables the WCD to receive inputs from the user and to provide outputs to the user. A sensor associated with the WCD produces sensor data. The sensor data is indicative of physical movement of the WCD in three dimensional space. Computer program code is executed on the WCD and enables detection of a mobile thing motion activity (MTMA) associated with the user that is transporting the WCD based at least in part upon the sensor data. The code identifies an action to initiate based at least in part upon the detected MTMA, communicates a message to the user that requests confirmation that the action should be initiated, and initiates the action when the user provides the confirmation that the action should be initiated. | 06-09-2016 |
20160163173 | Notification System For Providing Awareness Of An Interactive Surface - A system for providing awareness of an interactive surface is disclosed. The system may include a processor that is communicatively linked to an interactive surface. The processor may determine a position and a velocity of an object that is within range of the interactive surface based on one or more of media content, vibrations, air movement, sounds, and global positioning data associated with the object. Additionally, the processor may determine if the object has a trajectory that would cause the object to collide with the interactive surface based on the information associated with the object. If the processor determines that the object has a trajectory that would cause the object to collide with the interactive surface, the processor can generate a notification. | 06-09-2016 |
20160170211 | WEARABLE ELECTRONIC DEVICE AND METHOD OF CONTROLLING THE SAME | 06-16-2016 |
20160170479 | IMAGE PROCESSING DEVICE, OBJECT SELECTION METHOD AND PROGRAM | 06-16-2016 |
20160170483 | METHOD AND SYSTEM FOR TACTILE-BIASED SENSORY-ENHANCED E-READING | 06-16-2016 |
20160170484 | METHOD AND SYSTEM FOR CUSTOMIZABLE MULTI-LAYERED SENSORY-ENHANCED E-READING INTERFACE | 06-16-2016 |
20160170487 | INFORMATION PROVISION DEVICE AND INFORMATION PROVISION METHOD | 06-16-2016 |
20160170488 | IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND PROGRAM | 06-16-2016 |
20160170489 | ELECTRONIC DEVICE AND DISPLAY METHOD THEREOF | 06-16-2016 |
20160170490 | GESTURE BASED POWER MANAGEMENT SYSTEM AND METHOD | 06-16-2016 |
20160170492 | TECHNOLOGIES FOR ROBUST TWO-DIMENSIONAL GESTURE RECOGNITION | 06-16-2016 |
20160170493 | GESTURE RECOGNITION METHOD IN VEHICLE USING WEARABLE DEVICE AND VEHICLE FOR CARRYING OUT THE SAME | 06-16-2016 |
20160170495 | GESTURE RECOGNITION APPARATUS, VEHICLE HAVING THE SAME, AND METHOD FOR CONTROLLING THE VEHICLE | 06-16-2016 |
20160170496 | GESTURE INFERRED VOCABULARY BINDINGS | 06-16-2016 |
20160170502 | Device for adjusting and self-testing inertial sensors, and method | 06-16-2016 |
20160171277 | METHOD AND SYSTEM FOR VISUALLY-BIASED SENSORY-ENHANCED E-READING | 06-16-2016 |
20160179189 | VEHICLE-MOUNTED EQUIPMENT OPERATING DEVICE | 06-23-2016 |
20160179190 | METHOD AND APPARATUS FOR ESTIMATING POSITION OF PEDESTRIAN WALKING ON LOCOMOTION INTERFACE DEVICE | 06-23-2016 |
20160179192 | PROGRESSIVE PAGE TRANSITION FEATURE FOR RENDERING E-BOOKS ON COMPUTING DEVICES | 06-23-2016 |
20160179194 | DISPLAY DEVICE AND DRIVING METHOD THEREOF | 06-23-2016 |
20160179195 | Method for operating a head-up display, presentation apparatus, vehicle | 06-23-2016 |
20160179197 | METHOD AND SYSTEM FOR INTEGRATING SMART TV PROGRAM CHANNELS WITH APPLICATIONS | 06-23-2016 |
20160179201 | TECHNOLOGIES FOR INTERACTING WITH COMPUTING DEVICES USING HAPTIC MANIPULATION | 06-23-2016 |
20160179205 | SYSTEMS AND METHODS OF DIRECT POINTING DETECTION FOR INTERACTION WITH A DIGITAL DEVICE | 06-23-2016 |
20160179206 | WEARABLE INTERACTIVE DISPLAY SYSTEM | 06-23-2016 |
20160179210 | INPUT SUPPORTING METHOD AND INPUT SUPPORTING DEVICE | 06-23-2016 |
20160179218 | SYSTEMS AND METHODS FOR IMPROVING THE QUALITY OF MOTION SENSOR GENERATED USER INPUT TO MOBILE DEVICES | 06-23-2016 |
20160179220 | CONTROLLING POWER DISTRIBUTION TO HAPTIC OUTPUT DEVICES | 06-23-2016 |
20160179224 | UNDO OPERATION FOR INK STROKE CONVERSION | 06-23-2016 |
20160179279 | POSITION DETECTOR AND POSITION INDICATOR | 06-23-2016 |
20160179461 | TACTILE INPUT PRODUCED SOUND BASED USER INTERFACE | 06-23-2016 |
20160180799 | MULTI-USER NOTIFICATION SYSTEM | 06-23-2016 |
20160180801 | METHOD AND APPARATUS FOR CONTROLLING AN ELECTRONIC DEVICE | 06-23-2016 |
20160180802 | SYSTEMS AND METHODS FOR DETECTION AND MANAGEMENT OF VIEWING CONDITIONS USING MULTI-DEVICE NETWORK | 06-23-2016 |
20160182872 | PROJECTION DISPLAY COMPONENT AND ELECTRONIC DEVICE | 06-23-2016 |
20160187965 | PORTABLE DEVICE AND METHOD OF CONTROLLING THEREFOR - A method of controlling a portable device according to one embodiment of the present specification may include the steps of displaying a first application including a first object and a second object, receiving, from a foot wearable device, a first triggering signal indicating that a front direction of the foot wearable device corresponds to a first direction, detecting a front direction of the portable device corresponding to a second direction, displaying the first object based on the first direction and displaying the second object based on the second direction. | 06-30-2016 |
20160187966 | HAPTIC SYSTEM FOR ESTABLISHING A CONTACT FREE INTERACTION BETWEEN AT LEAST ONE PART OF A USER'S BODY AND A VIRTUAL ENVIRONMENT - A haptic system for establishing an interaction between at least one part of the body of a user, called a stimulation zone, and a virtual environment comprises means for estimating the position of the stimulation zone; means for emitting, toward the stimulation zone, at least one flow rate-controlled air jet, the air jet being emitted at a point of emission and its flow rate being determined as a function of the estimated position of the stimulation zone in space and of the characteristics of the virtual environment; a robotized structure controlled so as to move the point of emission of the air jet so that said point of emission remains at a constant distance from the stimulation zone. | 06-30-2016 |
20160187968 | ELECTRONIC DEVICE AND FUNCTION CONTROL METHOD THEREOF - An electronic device having a processor, a contact sensor, a blow sensor, and a storage device is disclosed. The processor senses at least one contact position via the contact sensor. The processor senses a blow action via the blow sensor. Then, the processor determines, based on the at least one contact position and/or the blow action, a predefined function and performs the predefined function. | 06-30-2016 |
20160187969 | Methods and Systems for User Interaction within Virtual Reality Scene using Head Mounted Display - Inertial sensors within a head mounted display are used to track movement of the head mounted display. The tracked movement of the head mounted display is correlated to an action within a virtual reality scene that is currently displayed to a user wearing the head mounted display. The action within the virtual reality scene is based on a context of the virtual reality scene that is currently displayed. The detected movement of the head mounted display can be combined with other sensor data, such as gaze detection data, to determine the action within the virtual reality scene. In this manner, movements of the user as detected using the inertial sensors within the head mounted display are used as inputs to cause actions within the current context of the virtual reality scene as displayed to the user within the head mounted display. | 06-30-2016 |
20160187971 | HEAD-BASED TARGETING WITH PITCH AMPLIFICATION - A gaze vector of a human subject is translated to a targeting vector that defines focus within a graphical user interface. Sensor data is received from a sensor system indicating pitch angle of a head of the human subject defining the gaze vector. The pitch angle is translated to a scaled pitch angle according to a pitch scaling function that increases amplification of the pitch angle in one or more directions as the pitch angle exceeds a start angle threshold in each of the one or more directions. The scaled pitch angle is output as a component of the targeting vector. | 06-30-2016 |
20160187972 | APPARATUS, METHOD AND COMPUTER PROGRAM FOR USING GAZE TRACKING INFORMATION - An apparatus, method and computer program, the apparatus comprising: processing circuitry; and memory circuitry including computer program code; the memory circuitry and the computer program code configured to, with the processing circuitry, cause the apparatus at least to perform: obtaining gaze tracking information from a plurality of users wherein the gaze tracking information indicates whether or not a user looked at an object; and analysing the obtained gaze tracking information to categorize objects as different types of objects. | 06-30-2016 |
20160187973 | FLEXIBLE SMART GLOVE - A flexible smart glove detects fine hand and finger motions while permitting the wearer to make hand gestures with dexterity. The flexible smart glove has a thickness of less than about 100 μm and incorporates capacitive micro-sensors positioned at finger joint locations. The micro-sensors are thin film devices built on substrates made of a pliable material such as polyimide. Interdigitated serpentine capacitors monitor strain in the back of the hand, while parallel plate capacitors monitor contact pressure on the palm. Thus the smart glove responds electrically to various types of hand motions. Thin film resistors responsive to changes in body temperature are also formed on the flexible substrate. Motion and temperature data is transmitted from the glove to a microprocessor via a passive RFID tag or an active wireless transmitter. An ASIC is embedded in the smart glove to relay real time sensor data to a remote processor. | 06-30-2016 |
20160187977 | DEFORMABLE HAPTIC WEARABLES WITH VARIABLE PHYSICAL PROPERTIES - A haptically-enabled device has an interface for receiving an instruction to provide haptic information. The device includes a tangible element with a physical property including length, stiffness, or texture. A haptic output device is attached to the tangible element and a haptic response module provides haptic information to the haptic output device. The haptic output device causes the tangible element to be altered from a first state to a second state of the physical property. | 06-30-2016 |
20160187979 | TECHNIQUES FOR DYNAMICALLY CHANGING TACTILE SURFACES OF A HAPTIC CONTROLLER TO CONVEY INTERACTIVE SYSTEM INFORMATION - In one embodiment of the present invention, a haptic engine dynamically configures a haptic controller to provide information regarding the state of an interactive system via sensations of texture. In operation, as the state of the interactive system changes, the haptic engine dynamically updates a haptic state that includes texture characteristics, such as texture patterns, that are designed to reflect the state of the interactive system in an intuitive fashion. The haptic engine then generates signals that configure touch surfaces included in the haptic controller to convey these texture characteristics. Advantageously, providing information regarding the correct state and/or operation of the interactive system based on sensations of texture reduces distractions attributable to many conventional audio and/or visual interactive systems. Notably, in-vehicle infotainment systems that provide dynamically updated texture data increase driving safety compared to conventional in-vehicle infotainment systems that often induce drivers to take their eyes off the road. | 06-30-2016 |
20160187981 | MANUAL FLUID ACTUATOR - A dynamic tactile interface includes a dynamic tactile layer and a manual fluid actuator. The manual fluid actuator includes a displacement device including a bladder, a platen adjacent the bladder, an elongated member coupled to the platen, and a rotary actuator, the elongated member and the rotary actuator translating rotation of the rotary actuator into translation of the platen, the platen compressing the bladder in response to rotation of the rotary actuator in a first direction and expanding the bladder in response to rotation of the rotary actuator in a second direction opposite the first direction. | 06-30-2016 |
20160187987 | AUDIO ENHANCED SIMULATION OF HIGH BANDWIDTH HAPTIC EFFECTS - A system generates haptic effects using at least one actuator and at least one speaker. The system receives a high definition (“HD”) haptic effect signal and a corresponding audio signal if audio is to be played. The system generates a standard definition (“SD”) haptic effect signal based at least on the HD haptic effect signal, and generates an audio based haptic effect signal based at least on the HD haptic effect signal. The system mixes the audio signal and the audio based haptic effect signal, and then substantially simultaneously plays the SD haptic effect signal on the actuator and plays the mixed signal on the speaker. | 06-30-2016 |
20160187988 | Systems and Methods for Haptically-Enabled Holders - One illustrative system disclosed herein includes a processor configured to: receive a signal; determine a haptic effect based at least in part on the signal; and transmit a haptic signal associated with the haptic effect. The system further includes a haptic output device in communication with the processor and coupled to a holder, wherein the holder is configured to mechanically couple with an electronic device. The haptic output device is configured to receive the haptic signal and output the haptic effect. | 06-30-2016 |
20160187990 | METHOD AND APPARATUS FOR PROCESSING GESTURE INPUT - A method and apparatus for processing a gesture input are provided. The method includes determining a type of an elliptical arc corresponding to a gesture input; determining a rotation direction corresponding to the gesture input; and processing the gesture input based on the type of the elliptical arc and the rotation direction. | 06-30-2016 |
20160187991 | RE-ANCHORABLE VIRTUAL PANEL IN THREE-DIMENSIONAL SPACE - The present invention provides a re-anchorable virtual panel in a three-dimensional space, including a detection module, a recognition module and a display module. The detection module includes a finger position detection submodule and a hand feature extraction submodule. The finger position detection submodule extracts the position and direction of a user's fingers, and transmits the estimated information of the finger's position to the recognition module. The hand feature extraction submodule extracts the user's hand features from the input image sequence. After extracting the hand features, the hand feature extraction submodule transmits the information to the recognition module. The recognition module recognizes the position and direction of the user's fingers and hands, and determines the virtual panel's location and direction through the information. The recognition module recognizes hand gestures for controlling the virtual panel. The display module provides virtual feedback when the virtual panel is operated. | 06-30-2016 |
20160187992 | SMART TUTORIAL FOR GESTURE CONTROL SYSTEM - A method includes monitoring a plurality of system inputs and detecting a behavioral pattern performed by a user and associated with the plurality of system inputs. When the behavioral pattern is detected, the method includes associating, in a memory, a gesture with at least one action, the at least one action being determined by the plurality of system inputs, and, upon detecting the gesture, executing the action associated with the gesture. | 06-30-2016 |
20160187995 | Contextual Based Gesture Recognition And Control - Described are mobile devices and methods that receive location data by a user device that is configured to perform a plurality of different control actions on one or more different remote systems/devices, receive gesture data from a sensor, process the location data and the gesture data to determine a command that performs a control action and to determine a particular one of the remote systems/devices on which the command is to be performed, and cause a message that includes the determined command to be sent to the determined particular one of the remote systems/devices to perform the determined control action by the particular one of the systems/devices. | 06-30-2016 |
20160188000 | SIGNAL PROCESSING DEVICE AND CONTROL METHOD - A signal processing device and a control method are disclosed. The signal processing device comprises: an input unit which receives a signal from an external input device or senses a connection state; a storage unit for storing information on the external input device; and a control unit for recognizing and activating the external input device on the basis of the signal inputted from the external input device and the stored information. | 06-30-2016 |
20160188002 | Information Processing Method And Electronic Device - An information processing method and an electronic device are provided. The method is applied to an electronic device having a flexible display capable of forming N display surfaces, and the method includes: obtaining N posture parameters of the N display surfaces; obtaining an object attribute of an i-th display object among M display objects to be displayed, where M≧1 and i is an integer inclusively ranging from 1 to M; determining a display surface for displaying the i-th display object from the N display surfaces, based on the N posture parameters and the object attribute; and displaying the i-th display object on the determined display surface. | 06-30-2016 |
20160188123 | PROJECTION DEVICE - A projection device includes: a detection unit configured to detect a specific object; a projection unit configured to project a first projection image; a drive unit configured to change a direction of the projection unit so as to change the projection position of the first projection image; a controller configured to control the drive unit such that the first projection image is projected while following the motion of the detected specific object; and a communication unit configured to receive information of a second projection image projected by another projection device. The controller acquires information relating to a position of the second projection image through the communication unit, and controls a projection method of the first projection image based on the positions of the first projection image and the second projection image such that the first projection image and the second projection image are not overlapped with each other when they are projected. | 06-30-2016 |
20160188124 | PROJECTION DEVICE AND PROJECTION METHOD - A projection device according to the present disclosure includes a sensor unit that detects an object; a detection unit configured to detect a moving object, a first object, and a second object based on a signal output from the sensor unit; a controller configured to generate a projection image such that a first image corresponding to the first object is projected on a first projection region and a second image corresponding to the second object is projected on a second projection region; and a projection unit configured to project the projection image. The controller determines the first image based on a position of the moving object, and determines the second image without depending on a position of the moving object. | 06-30-2016 |
20160188197 | USER TERMINAL DEVICE AND CONTROL METHOD THEREOF - A portable device having a display, the display including a main surface area, a first curved surface area extending from a first side of the main surface area, and a second curved surface area extending from a second side of the main surface area that is opposite the first side, a sensor configured to detect a state of the portable device, and a controller configured to control the display to display a user interface (UI) on one of the first curved surface and the second curved surface based on the state detected by the sensor. | 06-30-2016 |
20160188276 | ELECTRONIC DEVICE, PROCESSING METHOD AND DEVICE - An electronic device, a processing method and a device are provided. The electronic device includes: first and second display devices; an input device; a first component and a second component; a detection device configured to detect a relative position relation between the two components; and a processing device configured to control the first display device or the second display device in different cases to respond to operation data collected by the input device. | 06-30-2016 |
20160188291 | METHOD, APPARATUS AND COMPUTER PROGRAM PRODUCT FOR INPUT DETECTION - In an example embodiment, a method, apparatus and computer program product are provided. The method includes determining one or more operating conditions of a device. A selection of a mode of operation of the device from at least a first mode and a second mode is facilitated based on the one or more operating conditions of the device. In the first mode, the device is configured to detect an operation input received from an audio source based on two or more audio sensors of the device. In the second mode, the device is configured to detect the operation input based on at least one of the two or more audio sensors and at least one non-audio sensor of the device. | 06-30-2016 |
20160189332 | DEVICE AND METHOD FOR PERFORMING SCHEDULING FOR VIRTUALIZED GRAPHICS PROCESSING UNITS - An example device includes virtualized graphics processing units (vGPUs) configured to respectively receive commands from a plurality of operating systems (OSs). A vGPU scheduler is configured to schedule an order and times for processing of the commands by a GPU. The vGPU scheduler can, for example, schedule the order and times such that a command from a foreground OS (FG OS) among the plurality of OSs is scheduled to be processed first. | 06-30-2016 |
20160189679 | APPARATUS AND METHOD FOR CONTROLLING INTERACTIONS WITH A PORTABLE ELECTRONIC DEVICE - Embodiments disclosed herein generally include a system and a method of controlling a portable electronic device based on the interaction of the portable electronic device with an electronic device, such as a mounting device. Embodiments of the disclosure may include a system and a method of providing information to the portable electronic device that causes the portable electronic device to perform one or more desirable functions or processes based on the portable electronic device's interaction with the mounting device. In some embodiments, the portable electronic device may respond differently when it is caused to interact with differently configured mounting devices. In some embodiments, the way that the portable device interacts with a user may be restricted and/or the functions that the portable device is able to perform may be desirably restricted after the portable device is caused to interact with a mounting device. | 06-30-2016 |
20160195902 | DISPLAY APPARATUS | 07-07-2016 |
20160195917 | Capacitive Proximity Sensor Configuration Including an Antenna Ground Plane | 07-07-2016 |
20160195920 | SOMATOSENSORY RECOGNITION SYSTEM AND RECOGNITION METHOD THEREOF | 07-07-2016 |
20160195922 | WEARABLE APPARATUS, DISPLAY METHOD THEREOF, AND CONTROL METHOD THEREOF | 07-07-2016 |
20160195924 | GAZE-CONTROLLED INTERFACE METHOD AND SYSTEM | 07-07-2016 |
20160195925 | DETERMINATION OF AN OPERATION | 07-07-2016 |
20160195928 | CLOSED LOOP FEEDBACK INTERFACE FOR WEARABLE DEVICES | 07-07-2016 |
20160195930 | FEEDBACK REDUCTION FOR A USER INPUT ELEMENT ASSOCIATED WITH A HAPTIC OUTPUT DEVICE | 07-07-2016 |
20160195933 | NON-CONTACT OPERATION DEVICE FOR BICYCLE | 07-07-2016 |
20160195935 | IDENTIFICATION OF A GESTURE | 07-07-2016 |
20160195938 | BENDABLE USER TERMINAL DEVICE AND METHOD FOR DISPLAYING THEREOF | 07-07-2016 |
20160202724 | Wearable Device Interactive System | 07-14-2016 |
20160202759 | WEARABLE BIOSIGNAL INTERFACE AND OPERATION METHOD OF THE WEARABLE BIOSIGNAL INTERFACE | 07-14-2016 |
20160202766 | GESTURE RECOGNITION METHOD, GESTURE RECOGNITION SYSTEM, TERMINAL DEVICE AND WEARABLE DEVICE | 07-14-2016 |
20160202767 | ELECTRONIC DEVICE, METHOD FOR CONTROLLING ELECTRONIC DEVICE, AND STORAGE MEDIUM | 07-14-2016 |
20160202768 | INFORMATION PROCESSING APPARATUS FOR RECOGNIZING OPERATION INPUT BY GESTURE OF OBJECT AND CONTROL METHOD THEREOF | 07-14-2016 |
20160202771 | CONTROL SYSTEMS AND METHODS FOR HEAD-MOUNTED INFORMATION SYSTEMS | 07-14-2016 |
20160202773 | OPTICAL POINTING SYSTEM | 07-14-2016 |
20160202895 | ENDOSCOPIC IMAGE DISPLAY DEVICE | 07-14-2016 |
20160202944 | MOBILE DEVICE, SYSTEM AND METHOD FOR CONTROLLING A HEADS-UP DISPLAY | 07-14-2016 |
20160202947 | METHOD AND SYSTEM FOR REMOTE VIEWING VIA WEARABLE ELECTRONIC DEVICES | 07-14-2016 |
20160203359 | Wink Gesture Based Control System | 07-14-2016 |
20160203360 | SYSTEMS AND METHODS FOR PERFORMING ACTIONS IN RESPONSE TO USER GESTURES IN CAPTURED IMAGES | 07-14-2016 |
20160203763 | SENSING UNIT, FLEXIBLE DEVICE, AND DISPLAY DEVICE | 07-14-2016 |
20160250054 | System for Providing Intra-Oral Muscular Therapy, and Method of Providing Therapy for Intra-Oral Musculature for a Patient | 09-01-2016 |
20160251826 | ON-BOARD SERVICE TOOL AND METHOD | 09-01-2016 |
20160252954 | CONTROL APPARATUS | 09-01-2016 |
20160252956 | Imaging Method | 09-01-2016 |
20160252957 | GAZE BASED NOTIFICATION RESPONSE | 09-01-2016 |
20160252963 | Method and Apparatus for Gesture Detection in an Electronic Device | 09-01-2016 |
20160252965 | Computer Interface for Remotely Controlled Objects and Wearable Articles with Absolute Pose Detection Component | 09-01-2016 |
20160252966 | METHOD BY WHICH EYEGLASS-TYPE DISPLAY DEVICE RECOGNIZES AND INPUTS MOVEMENT | 09-01-2016 |
20160252967 | METHOD AND APPARATUS FOR CONTROLLING SMART DEVICE | 09-01-2016 |
20160252968 | INTERFACE ELEMENTS FOR MANAGING GESTURE CONTROL | 09-01-2016 |
20160252969 | ELECTRONIC DEVICE AND CONTROL METHOD THEREOF | 09-01-2016 |
20160252970 | CONTROL USING MOVEMENTS | 09-01-2016 |
20160252976 | METHOD AND APPARATUS FOR INTERACTIVE USER INTERFACE WITH WEARABLE DEVICE | 09-01-2016 |
20160253044 | SYSTEMS, DEVICES, AND METHODS FOR TOUCH-FREE TYPING | 09-01-2016 |
20160253141 | Wearable Device with Data Privacy Display | 09-01-2016 |
20160378177 | VISUALIZED CONTENT TRANSMISSION CONTROL METHOD, SENDING METHOD AND APPARATUSES THEREOF - A visualized content transmission control method, a visualized content sending method and apparatuses thereof are provided. A transmission control method comprises: acquiring first information associated with a user gesture and second information associated with a transmission delay of visualized content, and determining a sending strategy of visualized content associated with a target scene at least according to the first information and the second information, wherein the sending strategy comprises: sending, to the user, visualized content associated with the target scene in a direction corresponding to the user gesture and the transmission delay. By tracking a gesture change of the user viewing an immersive virtual reality display and a transmission delay change of the visualized content, the visualized content can be intelligently sent in a corresponding direction, which helps provide a better immersive virtual reality experience for the user and reduces network load. | 12-29-2016 |
20160378178 | VISUALIZED CONTENT TRANSMISSION CONTROL METHOD, SENDING METHOD AND APPARATUSES THEREOF - A visualized content transmission control method, a visualized content sending method and apparatuses thereof are provided. A transmission control method comprises: acquiring information associated with a user gesture; and determining a sending strategy of visualized content associated with a target scene at least according to the information associated with the user gesture, wherein the sending strategy comprises: sending, to the user, visualized content associated with the target scene in a direction corresponding to the user gesture. By tracking a gesture change of the user viewing an immersive virtual reality display, the visualized content can be intelligently sent in a corresponding direction, which helps provide a better immersive virtual reality experience for the user and reduces network load. | 12-29-2016 |
20160378179 | AUTOMATED PERIPHERAL DEVICE HANDOFF BASED ON EYE TRACKING - Systems, apparatuses and methods may provide for identifying a plurality of computing systems proximate to a peripheral device and determining a gaze location of a user. Additionally, the peripheral device may be automatically connected to a first computing system in the plurality of computing systems based on the gaze location of the user. In one example, a change in the gaze location may be detected, wherein the peripheral device is automatically connected to a second computing system in the plurality of computing systems based on the change in the gaze location. | 12-29-2016 |
20160378182 | DISPLAY OF INFORMATION ON A HEAD MOUNTED DISPLAY - A method comprising precluding display of information on a head mounted display worn by a user, receiving information indicative of an eye orientation of the user, receiving information indicative of a head orientation of the user, determining that a difference between the eye orientation and a centered eye orientation exceeds a threshold eye orientation difference, determining that a difference between the head orientation and an anatomical position head orientation exceeds a threshold head orientation difference, and causing display of a representation of information on the head mounted display based, at least in part, on the determination that the eye orientation exceeds the threshold eye orientation difference from the centered eye orientation and the determination that the head orientation exceeds the threshold head orientation difference from the anatomical position head orientation is disclosed. | 12-29-2016 |
20160378185 | INTEGRATION OF HEADS UP DISPLAY WITH DATA PROCESSING - A wearable information gathering and processing system is described. The system includes an information obtaining device, the information obtaining device including at least one of a radio frequency identification (RFID) reader, an infrared (IR) detector, a global positioning system (GPS) receiver, a laser measurement device, a microphone, or a camera. The system also includes a processing device, the processing device including at least one of a voice recognition processor, a gesture recognition processor, or a data processor, and an information-providing device coupled to the processing device, the information-providing device including at least one of a heads up display, a speaker, or a vibrator. | 12-29-2016 |
20160378186 | TECHNOLOGIES FOR CONTROLLING HAPTIC FEEDBACK INTENSITY - Technologies for adjusting haptic feedback intensity are described. In some embodiments the technologies leverage contextual information detected or otherwise provided by a sensor of an electronic device to determine an adjusted haptic feedback intensity. A control message may be issued to one or more haptic devices, and may be configured to cause the haptic device(s) to produce haptic feedback in accordance with the adjusted haptic feedback intensity. Devices, methods, and computer readable media utilizing such technologies are also described. | 12-29-2016 |
20160378191 | SLIM PROFILE MAGNETIC USER INTERFACE DEVICES - Slim profile magnetic user interface devices (slim UIDs) are disclosed. A slim UID may include a slim profile housing, a movable actuator assembly having user contact surfaces on opposite sides, along with a magnet, magnetic sensor, restoration element, and processing element. User mechanical interaction with the actuator element may be sensed by the magnetic sensor and processed to generate output signals usable by a coupled electronic computing system. | 12-29-2016 |
20160378192 | DEVICE AND METHOD OF OPERATING A CONTROLLABLE ELECTRONIC DEVICE - A device may include a near field communication antenna, a monitor, a comparator, and a trigger. The near field communication antenna may be configured to detect an external near field communication device. The monitor may be configured to monitor a series of one or more transitions in a detection state of the device between a first detection state and a second detection state based on whether the near field communication antenna detects the near field communication device. The comparator may be configured to perform a comparison between first timing information associated with the series of one or more transitions and second timing information associated with a first predefined series of one or more transitions. The trigger may be configured to initiate a predefined action of the device based on the comparison. | 12-29-2016 |
20160378193 | Wearable Device with Gesture Recognition Mechanism - A method is described including receiving real time signals from the sensor array, comparing the real time signals to corresponding training signals representing a gesture from each sensor in the sensor array to determine if a gesture has been recognized and transmitting a command corresponding to a recognized gesture to a remote computing device. | 12-29-2016 |
20160378203 | FOLDABLE ELECTRONIC APPARATUS HAVING DISPLAY PANEL WITH VARIABLE CURVATURE - A foldable electronic apparatus includes a flexible display panel, a first cover and a second cover configured to support a rear surface of the flexible display panel and be interconnected by a hinge member, and a first slider and a second slider configured to be installed on the first cover and the second cover and vary a curvature of the flexible display panel by an operation of sliding along the first cover and the second cover, respectively. | 12-29-2016 |
20160378204 | SYSTEM FOR TRACKING A HANDHELD DEVICE IN AN AUGMENTED AND/OR VIRTUAL REALITY ENVIRONMENT - A system for tracking a first electronic device, such as a handheld electronic device, in a virtual reality environment generated by a second electronic device, such as a head mounted display may include the fusion of data collected by sensors of the electronic device with data collected by sensors of the head mounted display, together with data collected by a front facing camera of the electronic device related to the front face of the head mounted display. | 12-29-2016 |
20170235363 | Method and System for Calibrating an Eye Tracking System | 08-17-2017 |
20170235364 | DETECTION DEVICE, DETECTION METHOD, CONTROL DEVICE, AND CONTROL METHOD | 08-17-2017 |
20170235366 | DOMINANT LIMB IDENTIFICATION METHOD AND DEVICE | 08-17-2017 |
20170235375 | USER INTERFACE, A MEANS OF TRANSPORTATION AND A METHOD FOR CLASSIFYING A USER GESTURE PERFORMED FREELY IN SPACE | 08-17-2017 |
20170235381 | Information Processing Device | 08-17-2017 |
20170235382 | SYSTEM AND METHOD FOR MOTION PROCESSING IN MOBILE DEVICES | 08-17-2017 |
20170236149 | GENERATING CONTENT FOR A VIRTUAL REALITY SYSTEM | 08-17-2017 |
20170236319 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM | 08-17-2017 |
20180024621 | BIOLOGICAL INFORMATION PROCESSING SYSTEM, SERVER SYSTEM, AND INFORMATION PROCESSING METHOD | 01-25-2018 |
20180024623 | DETECTING USER RANGE OF MOTION FOR VIRTUAL REALITY USER INTERFACES | 01-25-2018 |
20180024625 | UTILIZING INERTIAL MEASUREMENT UNIT DATA OF A HEAD MOUNTED DISPLAY DEVICE TO GENERATE INFERRED COMFORT DATA | 01-25-2018 |
20180024627 | AUTOMATIC PAUSE AND RESUME OF MEDIA CONTENT DURING AUTOMATED DRIVING BASED ON DRIVER'S GAZE | 01-25-2018 |
20180024628 | BEAM STEERING BACKLIGHT UNIT AND HOLOGRAPHIC DISPLAY APPARATUS INCLUDING THE SAME | 01-25-2018 |
20180024630 | EDITING CUTS IN VIRTUAL REALITY | 01-25-2018 |
20180024631 | Interactive Display System with Eye Tracking to Display Content According to Subject's Interest | 01-25-2018 |
20180024632 | Interactive Display System with Eye Tracking to Display Content According to Subject's Interest | 01-25-2018 |
20180024633 | Using Eye Tracking to Display Content According to Subject's Interest in an Interactive Display System | 01-25-2018 |
20180024634 | METHODS AND APPARATUS FOR INFERRING USER INTENT BASED ON NEUROMUSCULAR SIGNALS | 01-25-2018 |
20180024635 | METHODS AND APPARATUS FOR PREDICTING MUSCULO-SKELETAL POSITION INFORMATION USING WEARABLE AUTONOMOUS SENSORS | 01-25-2018 |
20180024642 | NO-HANDED SMARTWATCH INTERACTION TECHNIQUES | 01-25-2018 |
20180024643 | Gesture Based Interface System and Method | 01-25-2018 |
20180024718 | SYSTEMS FOR AND METHODS OF PROVIDING INERTIAL SCROLLING AND NAVIGATION USING A FINGERPRINT SENSOR | 01-25-2018 |
20180024799 | Apparatus, Methods and Computer Programs for Providing Images | 01-25-2018 |
20180025679 | TRANSPARENT DISPLAY DEVICE | 01-25-2018 |
20180025688 | DYNAMIC MERCHANDISING COMMUNICATION SYSTEM | 01-25-2018 |
20180027230 | Adjusting Parallax Through the Use of Eye Movements | 01-25-2018 |
20190146577 | SIMULATING AND EVALUATING SAFE BEHAVIORS USING VIRTUAL REALITY AND AUGMENTED REALITY | 05-16-2019 |
20190146583 | WEARABLE WIRELESS HMI DEVICE | 05-16-2019 |
20190146584 | LINE-OF-SIGHT DETECTION APPARATUS | 05-16-2019 |
20190146587 | HAPTIC THEME FRAMEWORK | 05-16-2019 |
20190146592 | DISPLAY DEVICE | 05-16-2019 |
20190146599 | VIRTUAL/AUGMENTED REALITY MODELING APPLICATION FOR ARCHITECTURE | 05-16-2019 |
20190146740 | MOTION-ACTIVATED DISPLAY OF MESSAGES ON AN ACTIVITY MONITORING DEVICE | 05-16-2019 |
20190147654 | APPARATUS AND METHOD FOR PROVIDING VIRTUAL REALITY CONTENT OF MOVING MEANS | 05-16-2019 |
20190147791 | DISPLAY DEVICE, CONTROL METHOD, AND NON-TRANSITORY COMPUTER-READABLE RECORDING MEDIUM | 05-16-2019 |
20220137701 | SYSTEMS AND METHODS FOR CONTROLLING SECONDARY DEVICES USING MIXED, VIRTUAL OR AUGMENTED REALITY - Disclosed are embodiments for systems and methods for controlling secondary devices at fine scales using mixed, virtual and/or augmented reality. Examples of secondary devices may include those involved in lighting (i.e., light emitting diodes), sound, and the production of videos, film, and movies. In some embodiments, a system may include a server, mixed reality user device and secondary device communicatively coupled via a network. The server may be configured to generate a virtual object based on gestural data that is configured for display within a mixed reality environment. The server may also be configured to generate secondary device settings based on gestural data, where the secondary device settings may be used to control the operation of a secondary device. | 05-05-2022 |
20220137702 | DETECTION OF FACIAL EXPRESSIONS - An apparatus comprising means for: receiving information from at least one inertial measurement unit configured to be worn on a user's head; and causing, at least in part, determining facial expression information in dependence on at least the received information. | 05-05-2022 |
20220137705 | HEAD MOUNTED DISPLAY APPARATUS - Provided is a technique for arranging a virtual object in a real space that requires less effort from the user, offers good usability, and places the object suitably. A head mounted display apparatus (HMD apparatus) according to one embodiment has a function of arranging and displaying the virtual object in a space based on an operation by a user. The HMD apparatus displays, on a display surface, a grid including a plurality of points for supporting an operation of the virtual object, and, according to an operation that includes designation of a target virtual object and designation of a first point at an arrangement destination, disposes and displays the target virtual object at the position of the first point. | 05-05-2022 |
20220137711 | IMAGE PROCESSING SYSTEM AND IMAGE PROCESSING DEVICE - An image processing system and an image processing device are provided. The image processing system includes an electronic device and the image processing device. The image processing device is connected to the electronic device. The image processing device displays floating three-dimensional input device image information. The image processing device interacts with an object through the three-dimensional input device image information to generate a plurality of control signals, and transmits the plurality of control signals to the electronic device. | 05-05-2022 |
20220137712 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM - The present disclosure relates to an information processing apparatus, an information processing method, and a program capable of, when accepting hand-based operation inputs from a plurality of users, performing appropriate pairing of left and right hands for each user. | 05-05-2022 |
20220137714 | ELECTRONIC DEVICE AND CONTROL METHOD THEREOF - An electronic device is provided. The electronic device includes a touch module, a motion sensor, a memory, and a control unit. The touch module is configured to generate a touch signal. The motion sensor is configured to detect motion of the electronic device to generate motion data. The memory stores a preset motion condition. The control unit is electrically connected to the touch module, the motion sensor, and the memory, and configured to: receive the motion data; and determine whether the motion data meets the preset motion condition or not, and generate a virtual touch signal when the motion data meets the preset motion condition. A control method applied to the electronic device is further provided. | 05-05-2022 |
20220137715 | Method and Apparatus for Controlling Onboard System - A method and an apparatus for controlling an onboard system include determining, using a sensor mounted in a steering wheel, a track of a current gesture of a driver while the driver holds the steering wheel; determining a type of the current gesture from preset gesture types based on the track of the current gesture; and controlling the onboard system to perform an operation corresponding to the type of the current gesture. | 05-05-2022 |
20220137724 | MULTI-MODAL HAND LOCATION AND ORIENTATION FOR AVATAR MOVEMENT - Examples of systems and methods for improved hand tracking of a user in a mixed reality environment are disclosed. The systems and methods may be configured to estimate the hand pose and shape of a user's hands for applications such as animating a hand on a user's avatar. Data from multiple sources, such as a totem inertial measurement unit (“IMU”), external totem location tracking, vision cameras, and depth sensors, may be manipulated using a set of rules that are based on historical data, ergonomics data, and motion data. | 05-05-2022 |
20220137737 | ELECTRONIC PRODUCT AND TOUCH-SENSING DISPLAY MODULE THEREOF INCLUDING SLOT IN BENDING PORTION OF FILM SENSING STRUCTURE - A touch-sensing display module is provided, including a display unit and a film sensing structure. The display unit has a front surface, a rear surface opposite the front surface, and a side connecting the front surface and the rear surface. The film sensing structure is attached to the display unit and includes a main body and a bending portion connected to the main body, wherein the bending portion includes a slot. The bending portion extends from the front surface of the display unit to the rear surface along the side. The bending portion covers a portion of the side and a portion of the rear surface. A portion of the display unit is exposed by the slot. | 05-05-2022 |
20220137739 | CONDUCTIVE BONDING STRUCTURE FOR SUBSTRATES AND DISPLAY DEVICE INCLUDING THE SAME - A substrate conductive bonding structure includes a lower substrate including a connection pad exposed to outside the lower substrate, an upper substrate including a transfer pad overlapping the connection pad, exposed to outside the upper substrate and including an upper surface, and a slit defined in the transfer pad, overlapping the connection pad and open at the upper surface of the transfer pad, the slit including an extending portion extending along a first direction and having a first slit width along a second direction crossing the first direction, and an expansion portion connected to the extending portion and having a second slit width along the second direction which is larger than the first slit width, and a solder contacting the upper surface of the connection pad, extending to the upper surface of the transfer pad and into the slit. | 05-05-2022 |
20220137741 | ELECTRODE STRUCTURE COMBINED WITH ANTENNA AND DISPLAY DEVICE INCLUDING THE SAME - An electrode structure combined with an antenna according to an embodiment of the present disclosure includes a substrate layer, sensing electrodes, a bridge electrode and an antenna unit. The sensing electrodes are arranged on the substrate layer, and include first sensing electrodes arranged along a first direction parallel to a top surface of the substrate layer, and second sensing electrodes arranged along a second direction parallel to the top surface of the substrate layer and intersecting the first direction. The bridge electrode connects the first sensing electrodes neighboring in the first direction. The antenna unit is disposed at the same layer as that of the bridge electrode, and includes a radiator overlapping the sensing electrodes in a planar view. | 05-05-2022 |