Patent application number | Description | Published |
20090002217 | Touchpad-enabled remote controller and user interaction methods - The handheld case of the remote control unit includes at least one touchpad, and other sensors, such as acceleration sensors, case perimeter sensors, pressure sensors, and RF signal sensors. These sensors provide a rich array of sensory inputs that are classified by a pattern recognizer to generate control commands for both the consumer electronic equipment and the remote control unit itself. A power management system to conserve unit battery power is also responsive to the pattern recognizer to allow intelligent power management control. The control system uses the display of the consumer electronic equipment to provide instructions to the user, and the remote control system uses what is displayed on the display as context information for pattern recognition. | 01-01-2009 |
20090006101 | Method to detect and assist user intentions with real time visual feedback based on interaction language constraints and pattern recognition of sensory features - A language model back-off system can be used with a user interface employing one or more language models to constrain navigation of selectable user interface input components. A user input interpretation module receives user input and interprets the user input to determine if a selection is made of one or more user interface input components. If a selection is not made, the user input interpretation module determines whether conditions are met for backing off one or more language models employed to constrain navigation of the user interface input components. If the conditions are met, a language model back-off module backs off the one or more language models. | 01-01-2009 |
20090007001 | Virtual keypad systems and methods - A virtual keypad system for inputting text is provided. The system includes a remote controller having at least one touchpad incorporated therein and divided into a plurality of touch zones. A display device is in data communication with the remote controller and is operable to display a user interface including a keypad, where each key of the keypad is mapped to a touch zone of the touchpad. A prediction module, in response to an operator pressing a given touch zone to select a particular character, performs one or more key prediction methods to predict one or more next plausible keys. A key mapping module remaps the touch zones of the touchpad to the keys of the keypad based on the one or more next plausible keys. | 01-01-2009 |
20090064008 | USER INTERACTION FOR CONTENT BASED STORAGE AND RETRIEVAL - A graphic user interface system for use with a content based retrieval system includes an active display having display areas. For example, the display areas include a main area providing an overview of database contents by displaying representative samples of the database contents. The display areas also include one or more query areas into which one or more of the representative samples can be moved from the main area by a user employing gesture based interaction. A query formulation module employs the one or more representative samples moved into the query area to provide feedback to the content based retrieval system. | 03-05-2009 |
20100127995 | SYSTEM AND METHOD FOR DIFFERENTIATING BETWEEN INTENDED AND UNINTENDED USER INPUT ON A TOUCHPAD - A method and system for differentiating between intended user input and inadvertent or incidental contact with a touchpad is herein disclosed. When a user engages the touchpad, sensors on the touchpad are activated and generate touch sensor signals. Based on the pattern of engaged sensors, a hand pattern can be determined. From the hand pattern, a hand model may be retrieved. The hand model may indicate passive zones and active zones. Contact in the active zones may be considered intentional, while contact in the passive zones may be considered unintended or incidental. Moreover, a global shift may be calculated, and input from the active zones may be compensated for the global shift. The input from the active zones can then be used to control a graphical user interface. | 05-27-2010 |
20100164897 | VIRTUAL KEYPAD SYSTEMS AND METHODS - A virtual keypad system for inputting text is provided. The system includes a remote controller having at least one touchpad incorporated therein and divided into a plurality of touch zones. A display device is in data communication with the remote controller and is operable to display a user interface including a keypad, where each key of the keypad is mapped to a touch zone of the touchpad. A prediction module, in response to an operator pressing a given touch zone to select a particular character, performs one or more key prediction methods to predict one or more next plausible keys. A key mapping module remaps the touch zones of the touchpad to the keys of the keypad based on the one or more next plausible keys. | 07-01-2010 |
20100195872 | SYSTEM AND METHOD FOR IDENTIFYING OBJECTS IN AN IMAGE USING POSITIONAL INFORMATION - A computer-implemented method is provided for identifying objects in an image. The method includes: capturing a series of images of a scene using a camera; receiving a topographical map for the scene that defines distances between objects in the scene; determining distances between objects in the scene from a given image; and approximating identities of objects in the given image by comparing the distances between objects determined from the given image with the distances between objects from the map. The identities of objects can be re-estimated using features of the objects extracted from the other images. | 08-05-2010 |
20110010648 | VISUAL FEEDBACK BASED ON INTERACTION LANGUAGE CONSTRAINTS AND PATTERN RECOGNITION OF SENSORY FEATURES - A language model back-off system can be used with a user interface employing one or more language models to constrain navigation of selectable user interface input components. A user input interpretation module receives user input and interprets the user input to determine if a selection is made of one or more user interface input components. If a selection is not made, the user input interpretation module determines whether conditions are met for backing off one or more language models employed to constrain navigation of the user interface input components. If the conditions are met, a language model back-off module backs off the one or more language models. | 01-13-2011 |
20110018817 | TOUCHPAD-ENABLED REMOTE CONTROLLER AND USER INTERACTION METHODS - The handheld case of the remote control unit includes at least one touchpad, and other sensors, such as acceleration sensors, case perimeter sensors, pressure sensors, and RF signal sensors. These sensors provide a rich array of sensory inputs that are classified by a pattern recognizer to generate control commands for both the consumer electronic equipment and the remote control unit itself. A power management system to conserve unit battery power is also responsive to the pattern recognizer to allow intelligent power management control. The control system uses the display of the consumer electronic equipment to provide instructions to the user, and the remote control system uses what is displayed on the display as context information for pattern recognition. | 01-27-2011 |
20110043475 | METHOD AND SYSTEM OF IDENTIFYING A USER OF A HANDHELD DEVICE - A system and method for identifying a user of a handheld device is herein disclosed. The device implementing the method and system may attempt to identify a user based on signals that are incidental to a user's handling of the device. The signals are generated by a variety of sensors dispersed along the periphery or within the housing. The sensors may include touch sensors, inertial sensors, acoustic sensors, pulse oximeters, and a touchpad. Based on the sensors and corresponding signals, identification information is generated. The identification information is used to identify the user of the handheld device. The handheld device may implement various statistical learning and data mining techniques to increase the robustness of the system. The device may also authenticate the user based on the user drawing a circle, or other shape. | 02-24-2011 |
20110219105 | SYSTEM AND METHOD FOR APPLICATION SESSION CONTINUITY - A first device pairs with a second device via local communication and then transfers a server-implemented application session from the first device to the second device by saving as session data the current state of the first application, and the state of the application-related data being consumed. This session data is then transferred to the second device, which then runs a second application based on the state supplied by the session data. One or both of the devices communicates a session transfer request to the server, causing the server to re-route application-related data to the second device to be consumed by the second application at the state where consumption by the first application was transferred. | 09-08-2011 |
20110234502 | PHYSICALLY RECONFIGURABLE INPUT AND OUTPUT SYSTEMS AND METHODS - Systems and methods for altering the shape of a reconfigurable surface area are presented. The present systems and methods facilitate efficient and effective interaction with a device or system. In one embodiment, a surface reconfiguration system includes a flexible surface; an elevation unit that creates alterations in the contours of the surface; and an elevation control component that controls adjustments to the elevation unit. Thus, the surface of the device is reconfigurable based on system, application, mode, and/or user needs. Accordingly, the surface can be used to provide input and output functionality. The surface can include touch detection functionality for added input functionality. | 09-29-2011 |
20120162073 | APPARATUS FOR REMOTELY CONTROLLING ANOTHER APPARATUS AND HAVING SELF-ORIENTATING CAPABILITY - A remote control apparatus for communicating with a target device includes: a sensing portion for sensing points of user contact with the apparatus, user gestures, and an acceleration value of the apparatus; a transmitting device for sending signals representative of user commands to the target device; a controller; and a memory including instructions for configuring the controller to perform a self-orientation process based upon at least one of the acceleration value and the points of user contact to determine a forward direction of a plane of operation for defining the user gestures. An axis of the determined plane of operation substantially intersects the apparatus at any angle. | 06-28-2012 |
20120194324 | DIRECTION AND HOLDING-STYLE INVARIANT, SYMMETRIC DESIGN, AND TOUCH- AND BUTTON-BASED REMOTE USER INTERACTION DEVICE - A remote control unit selectively transmits a control signal for remotely controlling an electronic device. The unit defines an imaginary cut plane that substantially bisects the unit. The unit includes a plurality of input features collectively disposed symmetrically with respect to the imaginary cut plane. The input features include first and second input features, disposed on opposite sides of the cut plane. Furthermore, the unit includes a sensor that detects first and second holding positions of the unit, which are substantially opposite to each other. Moreover, the unit includes a controller that associates the control signal with the first input feature when the sensor detects the first holding position, and associates the control signal with the second input feature when the sensor detects the second holding position. | 08-02-2012 |
20130030645 | AUTO-CONTROL OF VEHICLE INFOTAINMENT SYSTEM BASED ON EXTRACTED CHARACTERISTICS OF CAR OCCUPANTS - An infotainment system is provided for delivering content to multiple occupants of a vehicle. The infotainment system includes: an occupant detector configured to receive characteristic data for occupants of the vehicle and generate a profile for each occupant of the vehicle; a recommendation engine that analyzes the profiles of the vehicle occupants; and a content delivery engine that delivers content to one or more of the vehicle occupants in accordance with the analysis of the profiles of the vehicle occupants. | 01-31-2013 |
20130030811 | NATURAL QUERY INTERFACE FOR CONNECTED CAR - Sensors within the vehicle monitor driver movement, such as face and head movement to ascertain the direction a driver is looking, and gestural movement to ascertain what the driver may be pointing at. This information is combined with video camera data taken of the external vehicle surroundings. The apparatus uses these data to assist the speech dialogue processor to disambiguate phrases uttered by the driver. The apparatus can issue informative responses or control vehicular functions based on queries automatically generated based on the disambiguated phrases. | 01-31-2013 |
20130038437 | SYSTEM FOR TASK AND NOTIFICATION HANDLING IN A CONNECTED CAR - The vehicular notification and control apparatus receives user input via a multimodal control system, optionally including touch-responsive control and non-contact gestural and speech control. A processor-controlled display presents notifications and tasks according to a dynamically prioritized queue which takes into account environmental conditions, driving context, and available driver attention. The display is filtered to present only valid notifications and tasks for the current available driver attention level. Driver attention is determined using multiple, diverse sensors integrated through a sensor fusion mechanism. | 02-14-2013 |
20130093674 | Hybrid Pointing System and Method - A handheld controller which includes at least two disparate sensors, such as a motion sensor and a touchpad sensor. A processor deployed in either the handheld controller or a separate product implements a hybrid pointing and selection method that uses data from the first sensor to adjust the sensitivity to stimulus of the second sensor, and vice versa. The respective sensor data are thus tempered and combined to generate a cursor control signal that includes a large scale control component to control size and movement of a rough pointer region, and a fine scale control component to control position of a precise pointer within the rough pointer region. | 04-18-2013 |
20130338865 | Augmented Battery and Ecosystem for Electric Vehicles - An augmented battery for providing power to a powered vehicle includes a case adapted for removably mounting on the vehicle that encloses: at least one cell disposed within the case that provides power to the vehicle; a power port supported by the case that is adapted to electrically couple the at least one cell to the vehicle; and a processor disposed within the case and having an associated memory that stores program code executable by the processor to selectively communicate through an electronic communication circuit. | 12-19-2013 |
20140062875 | MOBILE DEVICE WITH AN INERTIAL MEASUREMENT UNIT TO ADJUST STATE OF GRAPHICAL USER INTERFACE OR A NATURAL LANGUAGE PROCESSING UNIT, AND INCLUDING A HOVER SENSING FUNCTION - A mobile device has an inertial measurement unit (IMU) that senses linear and rotational movement, a touch screen including (i) a touch-sensitive surface and (ii) a 3D sensing unit, and a state change determination module that determines state changes from a combination of (i) an output of the IMU and (ii) the 3D sensing unit sensing the hovering object. The mobile device may include a pan/zoom module. A mobile device may include a natural language processing (NLP) module that predicts a next key entry based on xy positions of keys so far touched, xy trajectory of the hovering object and NLP statistical modeling. A graphical user interface (GUI) visually highlights a predicted next key and presents a set of predicted words arranged around the current key above which the object is hovering as selectable buttons to enable entry of a complete word from the set of predicted words. | 03-06-2014 |
20140118254 | INFORMATION INPUT APPARATUS AND METHOD FOR CONTROLLING INFORMATION INPUT APPARATUS - The handheld controller controls a graphic cursor on a display. A motion sensor responds to user movement of a first type producing motion data. A touchpad responds to user movement of a second type more precise than the first type producing second sensor data. A processor calculates a hybrid cursor movement signal having a large scale movement component corresponding to movement of the first type and a fine scale movement component corresponding to movement of the second type, the hybrid cursor movement signal being for moving the graphic cursor. A processor calculates the large scale movement component based on a first sensitivity parameter representing sensitivity of the motion sensor determined by associating the touchpad data with the motion data, and the fine scale movement component based on a second sensitivity parameter representing sensitivity of the touchpad determined by associating the motion data with the touchpad data. | 05-01-2014 |
20140129725 | SmartLight Interaction System - The interfacing computer system mediates connections between user devices and peripheral devices. The system includes a platform server computer having an input/output port that supports communication with a plurality of devices using the IP protocol. The platform server computer is programmed to provide a peripheral device registration function whereby information about a peripheral device is stored in the associated memory. The platform server computer is further programmed to provide a user device authentication function whereby information about a user device is stored in the associated memory and accessed by the at least one processor to mediate how a registered peripheral device may be accessed by said user device. The platform server computer also provides an information routing function whereby source information originating from a first device is routed through the input/output port to a device other than the first device and according to instructions provided to the platform server computer by a user device. | 05-08-2014 |
20140139426 | SmartLight Interaction System - The conference room automation apparatus employs a processor-based integrated movement sensor, lights, cameras, and display device, such as a projector, that senses and interprets human movement within the room to control the projector in response to that movement and that captures events occurring in the room. Preferably packaged in a common integrated package, the apparatus employs a layered software/hardware architecture that may be readily extended as a platform to support additional third-party functionality. | 05-22-2014 |
20140293017 | Method of Automatically Forming One Three-Dimensional Space with Multiple Screens - Multiple electronic displays are arranged to define an ensemble of the user's choosing. The processor generates and displays a uniquely encoded emblem encoding a unique identifier for that display. A camera captures at least one image of the ensemble so that the encoded emblem for each display is captured. The captured images are then processed to extract the following attributes: a) unique identifier of the display; b) position of the encoded emblem relative to the reference coordinate system; and c) pointing direction of the encoded emblem relative to the reference coordinate system. The processor collectively processes the attributes for each of the emblem images to compute for each display a transformation matrix that selects a portion of the image data for display on that monitor and performs pointing direction correction so that the information presented on that display appears spatially consistent with the information presented on the other displays. | 10-02-2014 |
20140333562 | APPARATUS FOR REMOTELY CONTROLLING ANOTHER APPARATUS AND HAVING SELF-ORIENTATING CAPABILITY - An apparatus for communicating with a target device includes: a sensing portion for sensing points of user contact with the apparatus and user gestures; a transmitting device for sending signals representative of user commands to the target device; a controller; and a memory including instructions for configuring the controller to perform a self-orientation process to determine a forward direction of a plane of operation for defining the user gestures. | 11-13-2014 |
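The hybrid pointing idea in applications 20130093674 and 20140118254 can be sketched in a few lines: a motion sensor drives a coarse pointer region while a touchpad positions a precise pointer inside it, and each sensor's effective sensitivity is attenuated by activity on the other. Everything below (class name, gains, damping formula) is a hypothetical illustration of that cross-conditioning scheme, not code from the filings.

```python
# Hypothetical sketch of hybrid (coarse + fine) cursor control with
# cross-sensor sensitivity adjustment. All names and constants are
# illustrative assumptions, not taken from the patent applications.

class HybridPointer:
    def __init__(self, coarse_gain=10.0, fine_gain=1.0, damping=0.5):
        self.coarse_x = self.coarse_y = 0.0   # rough pointer region centre
        self.fine_x = self.fine_y = 0.0       # precise pointer offset
        self.coarse_gain = coarse_gain        # motion-sensor base sensitivity
        self.fine_gain = fine_gain            # touchpad base sensitivity
        self.damping = damping                # cross-sensor attenuation factor

    def update(self, motion_dx, motion_dy, touch_dx, touch_dy):
        # Activity on one sensor reduces the other's effective sensitivity,
        # so deliberate touchpad use steadies the rough region and vice versa.
        touch_activity = abs(touch_dx) + abs(touch_dy)
        motion_activity = abs(motion_dx) + abs(motion_dy)
        coarse_sens = self.coarse_gain / (1.0 + self.damping * touch_activity)
        fine_sens = self.fine_gain / (1.0 + self.damping * motion_activity)
        self.coarse_x += coarse_sens * motion_dx
        self.coarse_y += coarse_sens * motion_dy
        self.fine_x += fine_sens * touch_dx
        self.fine_y += fine_sens * touch_dy
        # The reported cursor is the sum of the large- and fine-scale parts.
        return self.coarse_x + self.fine_x, self.coarse_y + self.fine_y
```

With these toy parameters, pure motion input moves the cursor in large steps, while pure touchpad input nudges it within the region.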
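The next-key prediction and touch-zone remapping described for the virtual keypad applications 20090007001 and 20100164897 amounts to: after each character entry, compute which keys could plausibly come next, then remap (here, re-weight) the touchpad zones in their favor. The tiny vocabulary and weight-based zone model below are illustrative assumptions only.

```python
# Hypothetical sketch of next-plausible-key prediction and zone remapping.
# The vocabulary and the weight-based zone model are illustrative only.

VOCABULARY = ["cat", "car", "card", "care", "dog", "door"]

def next_plausible_keys(prefix):
    """Return the set of characters that can follow the typed prefix,
    according to the vocabulary (one simple key-prediction method)."""
    return {w[len(prefix)] for w in VOCABULARY
            if w.startswith(prefix) and len(w) > len(prefix)}

def remap_zones(base_zones, plausible, boost=2.0):
    """Scale up the touch-zone weight of predicted keys; a touch
    dispatcher could use these weights to resolve ambiguous presses."""
    return {key: (weight * boost if key in plausible else weight)
            for key, weight in base_zones.items()}
```

For example, after typing "ca" the plausible next keys are "t" and "r", so their zones would be boosted before the next press.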
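The language-model back-off loop in applications 20090006101 and 20110010648 can also be summarized compactly: a model constrains which interface components are selectable, and if user input repeatedly fails to land on a permitted component, the constraint is backed off. The miss-count threshold and set-based state below are assumptions for illustration, not the filings' actual back-off conditions.

```python
# Hypothetical sketch of language-model back-off for constrained
# navigation. The miss-count condition is an illustrative assumption.

def interpret(selectable, all_components, touched, misses, back_off_after=3):
    """Process one input event.

    Returns (selection, new_selectable_set, new_miss_count). A selection
    resets the miss count; repeated misses trigger back-off, which
    relaxes the constraint to the full set of components."""
    if touched in selectable:
        return touched, selectable, 0          # valid selection made
    misses += 1
    if misses >= back_off_after:
        # Back-off condition met: stop constraining navigation.
        return None, set(all_components), 0
    return None, selectable, misses
```

A caller would feed each touch event through `interpret`, carrying the returned selectable set and miss count forward as state.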