Patent application number | Description | Published |
20100165287 | Display Apparatus and Device - A near-eye display apparatus and device is disclosed. The apparatus includes a housing configured to receive an optical engine, and a light guiding plate attached to the housing and configured to receive light representing an image from the optical engine. The light guiding plate includes a first diffractive grating adapted to incouple the light into the light guiding plate, and a second diffractive grating adapted to outcouple the light from the light guiding plate such that the light is received by an eye of a user wearing the apparatus. The light guiding plate has a contact surface portion configured to optically couple the light guiding plate to a transparent plate, the contact surface portion being adapted to be in physical contact with the transparent plate. | 07-01-2010 |
20100302164 | Method and Device For Character Input - A device which has a display, a memory, a processor, a first input and a second input. The device is configured to receive input through said first input corresponding to a base character component and to receive input through said second input corresponding to a supplemental character component. The supplemental character component and the base character component are thereby combined to form a character input. | 12-02-2010 |
20110032213 | Multimode Apparatus and Method for Making Same - An apparatus operable in touch-input mode and scanning mode uses a light guide and a camera that looks through a slanted facet at one corner of the light guide. The light guide has an upper surface for placing a touch object or an item for scanning. An angular-sensitive grating is located on the lower surface, having fringes such that when a light beam is directed from a location P on the upper surface toward the lower surface in a predefined direction, it is diffracted by the grating and guided toward the slanted facet. The diffracted beam exits the slanted facet at an exiting angle indicative of the location P. When an object or item is placed on the upper surface, it causes changes in the exit beam intensity. From the exiting angles and intensity changes, the camera is able to locate touch objects or to acquire an image of the scanned item. | 02-10-2011 |
20110105194 | Image Navigation - There is disclosed a method in a mobile communications device, wherein the method comprises displaying a main image, selecting an object in the main image, displaying a plurality of object images comprising the selected object, selecting a target object image from the plurality of object images, and displaying a target candidate image associated with the target object image. | 05-05-2011 |
20120056804 | Apparatus, Methods And Computer Program Products Providing Finger-Based And Hand-Based Gesture Commands For Portable Electronic Device Applications - A method includes executing a gesture with a user-manipulated physical object in the vicinity of a device; generating data that is descriptive of the presence of the user-manipulated object when executing the gesture; and interpreting the data as pertaining to at least one object, such as an object displayed by the device. | 03-08-2012 |
20120280900 | GESTURE RECOGNITION USING PLURAL SENSORS - Apparatus comprises a processor; a user interface enabling user interaction with one or more software applications associated with the processor; first and second sensors configured to detect, and generate signals corresponding to, objects located within respective first and second sensing zones remote from the apparatus, wherein the sensors are configured such that their respective sensing zones overlap spatially to define a third, overlapping, zone in which both the first and second sensors are able to detect a common object; and a gesture recognition system for receiving signals from the sensors, the gesture recognition system being responsive to detecting an object inside the overlapping zone to control a first user interface function in accordance with signals received from both sensors. | 11-08-2012 |
20120281129 | CAMERA CONTROL - Apparatus has at least one processor and at least one memory having computer-readable code stored thereon which when executed controls the at least one processor: to receive gestural data representing a user gesture made independently of any touch-based input interface of a device; to identify from the gestural data a corresponding camera command associated with the user gesture; and to output the identified camera command to control the at least one camera. | 11-08-2012 |
20130120399 | METHOD, APPARATUS, COMPUTER PROGRAM AND USER INTERFACE - A method, apparatus, computer program and user interface wherein the method comprises displaying a still image on a display; detecting user selection of a portion of the still image; and in response to the detection of the user selection, replacing the selected portion of the image with a moving image and maintaining the rest of the still image, which has not been selected, as a still image. | 05-16-2013 |
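The in- and out-coupling gratings described in 20100165287 rely on the standard diffraction grating equation: light is guided inside the plate only when the diffracted beam exceeds the plate's critical angle for total internal reflection. The sketch below illustrates that condition; the wavelength, grating period, and refractive index are assumed example values, not figures from the application.

```python
import math

def diffracted_angle_deg(theta_in_deg, wavelength_nm, period_nm, n_plate, order=1):
    """Grating equation inside the plate: n*sin(theta_m) = sin(theta_in) + m*lambda/d."""
    s = (math.sin(math.radians(theta_in_deg)) + order * wavelength_nm / period_nm) / n_plate
    return math.degrees(math.asin(s))

def guided(theta_in_deg, wavelength_nm=520.0, period_nm=400.0, n_plate=1.7):
    """True when the first-order beam is trapped by total internal reflection."""
    critical = math.degrees(math.asin(1.0 / n_plate))
    return diffracted_angle_deg(theta_in_deg, wavelength_nm, period_nm, n_plate) > critical

print(guided(0.0))  # normally incident green light with these example values
```

With these assumed parameters the first-order beam leaves the grating near 50 degrees, well past the roughly 36-degree critical angle of an n = 1.7 plate, so it stays guided toward the out-coupling grating.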
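The two-input composition in 20100302164 resembles combining a base character with a combining mark. A minimal sketch, assuming the supplemental component is a Unicode combining character; the function name is illustrative, not from the application.

```python
import unicodedata

def compose_character(base, supplemental):
    """Combine a base character component with a supplemental (combining) component."""
    # NFC normalization merges 'e' + combining acute into the single character 'é'.
    return unicodedata.normalize("NFC", base + supplemental)

print(compose_character("e", "\u0301"))  # prints "é"
```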
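The core idea of 20110032213 is that the exit angle at the slanted facet encodes the touch location P, so the camera can invert that mapping. A simplified sketch; the linear angle model and constant `k` are assumptions for illustration only.

```python
def exit_angle(p, k=2.0):
    """Assumed linear mapping from location P (mm) to exit angle (degrees)."""
    return k * p

def locate(angle, k=2.0):
    """Invert the assumed mapping to recover the touch location from an exit angle."""
    return angle / k

print(locate(exit_angle(3.5)))  # round-trips to the original location
```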
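The plural-sensor arrangement in 20120280900 can be sketched as two sensors whose sensing zones overlap: an object inside the overlap is reported by both, and only then does the recognizer act on the fused signals. Zone geometry is simplified to 1-D distance intervals; all names and values are illustrative.

```python
from dataclasses import dataclass

@dataclass
class Sensor:
    zone: tuple  # (near, far) extent of the sensing zone, e.g. in metres

    def detects(self, position):
        near, far = self.zone
        return near <= position <= far

def in_overlap(first, second, position):
    """True when the object sits in the third, overlapping zone seen by both sensors."""
    return first.detects(position) and second.detects(position)

first = Sensor(zone=(0.0, 0.5))   # e.g. a near-field sensor
second = Sensor(zone=(0.3, 2.0))  # e.g. a camera-based far-field sensor

print(in_overlap(first, second, 0.4))  # prints True: inside both zones
```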
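The gesture-to-camera-command step in 20120281129 amounts to a lookup from recognized gestural data to an associated command, which is then output to the camera. A minimal sketch; the gesture names and commands are invented for illustration.

```python
# Hypothetical gesture/command pairs; not from the application.
GESTURE_COMMANDS = {
    "swipe_left": "previous_mode",
    "circle": "zoom_in",
    "palm_hold": "capture",
}

def identify_command(gesture):
    """Return the camera command associated with a recognized gesture, else None."""
    return GESTURE_COMMANDS.get(gesture)

print(identify_command("circle"))  # prints "zoom_in"
```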
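The selective playback in 20130120399 can be sketched as compositing: each output frame copies the still image and overwrites only the user-selected region with the corresponding region of the moving image, leaving the rest still. Images are modeled as 2-D lists of pixel values; the function and region format are hypothetical.

```python
def composite(still, moving_frame, region):
    """Return a frame where region (r0, c0, r1, c1) shows the moving frame."""
    r0, c0, r1, c1 = region
    out = [row[:] for row in still]  # copy, so unselected pixels stay still
    for r in range(r0, r1):
        for c in range(c0, c1):
            out[r][c] = moving_frame[r - r0][c - c0]
    return out

still = [[0] * 4 for _ in range(4)]   # 4x4 still image of zeros
frame = [[9, 9], [9, 9]]              # one frame of the moving image
result = composite(still, frame, (1, 1, 3, 3))
print(result)
```

Playing the moving image then means calling `composite` once per frame with the same region, so only that portion of the display changes.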