Patent application number | Description | Published |
20100157053 | Autonomous Recall Device - An autonomous recall device is described. A camera is movably mounted in a housing so that, in some embodiments, the camera may automatically pan and tilt. One or more environmental sensors, such as microphones, are incorporated in some embodiments; in other embodiments sensors are provided physically separate from the recall device but in communication with it. At least one processor in the device controls the movement and actuation of the camera according to conditions monitored by the sensor(s). An attention device is also provided in the recall device and is controlled by the processor. In an embodiment the attention device comprises light emitting diodes, actuation of the pan and tilt mechanisms, and optionally the opening and closing of a cover concealing the camera. The recall device may be perceived as having its own “character”. Captured content may be retrieved via a web service in some embodiments. | 06-24-2010 |
20100242274 | DETECTING TOUCH ON A CURVED SURFACE - Embodiments are disclosed herein that are related to input devices with curved multi-touch surfaces. For example, in one disclosed embodiment, a method of making a multi-touch input device having a curved touch-sensitive surface comprises forming on a substrate an array of sensor elements defining a plurality of pixels of the multi-touch sensor, forming the substrate into a shape that conforms to a surface of the curved geometric feature of the body of the input device, and fixing the substrate to the curved geometric feature of the body of the input device. | 09-30-2010 |
20100245246 | DETECTING TOUCH ON A CURVED SURFACE - Embodiments are disclosed herein that are related to input devices with curved multi-touch surfaces. One disclosed embodiment comprises a touch-sensitive input device having a curved geometric feature comprising a touch sensor, the touch sensor comprising an array of sensor elements integrated into the curved geometric feature and being configured to detect a location of a touch made on a surface of the curved geometric feature. | 09-30-2010 |
20100265178 | CAMERA-BASED MULTI-TOUCH MOUSE - Technologies are described for a camera-based multi-touch input device operable to provide conventional mouse movement data as well as three-dimensional multi-touch data. Such a device is based on an internal camera focused on a mirror or set of mirrors enabling the camera to image the inside of a working surface of the device. The working surface allows light to pass through. An internal light source illuminates the inside of the working surface and reflects off of any objects proximate to the outside of the device. This reflected light is received by the mirror and then directed to the camera. Images from the camera can be processed to extract touch points corresponding to the positions of one or more objects outside the working surface, as well as to detect gestures performed by the objects. Thus the device can provide conventional mouse functionality as well as three-dimensional multi-touch functionality. | 10-21-2010 |
20100315335 | Pointing Device with Independently Movable Portions - A pointing device with independently movable portions is described. In an embodiment, a pointing device comprises a base unit and a satellite portion. The base unit is arranged to be located under a palm of a user's hand and be movable over a supporting surface. The satellite portion is arranged to be located under a digit of the user's hand and be independently movable over the supporting surface relative to the base unit. In embodiments, data from at least one sensing device is read, and movement of both the base unit and the independently movable satellite portion of the pointing device is calculated from the data. The movement of the base unit and the satellite portion is analyzed to detect a user gesture. | 12-16-2010 |
20100315336 | Pointing Device Using Proximity Sensing - A pointing device using proximity sensing is described. In an embodiment, a pointing device comprises a movement sensor and a proximity sensor. The movement sensor generates a first data sequence relating to sensed movement of the pointing device relative to a surface. The proximity sensor generates a second data sequence relating to sensed movement, relative to the pointing device, of one or more objects in proximity to the pointing device. In embodiments, data from the movement sensor of the pointing device is read and the movement of the pointing device relative to the surface is determined. Data from the proximity sensor is also read, and a sequence of sensor images of one or more objects in proximity to the pointing device is generated. The sensor images are analyzed to determine the movement of the one or more objects relative to the pointing device. | 12-16-2010 |
20110007078 | Creating Animations - Animation creation is described, for example, to enable children to create, record and play back stories. In an embodiment, one or more children are able to create animation components such as characters and backgrounds using a multi-touch panel display together with an image capture device. For example, a graphical user interface is provided at the multi-touch panel display to enable the animation components to be edited. In an example, children narrate a story whilst manipulating animation components using the multi-touch display panel, and the sound and visual display are recorded. In embodiments image analysis is carried out automatically and used to autonomously modify story components during a narration. In examples, various types of handheld view-finding frames are provided for use with the image capture device. In embodiments saved stories can be restored from memory and retold from any point with different manipulations and narration. | 01-13-2011 |
20110080341 | Indirect Multi-Touch Interaction - Indirect multi-touch interaction is described. In an embodiment, a user interface is controlled using a cursor and a touch region comprising a representation of one or more digits of a user. The cursor and the touch region are moved together in the user interface in accordance with data received from a cursor control device, such that the relative location of the touch region and the cursor is maintained. The representations of the digits of the user are moved in the touch region in accordance with data describing movement of the user's digits. In another embodiment, a user interface is controlled in a first mode of operation using an aggregate cursor, and switched to a second mode of operation in which the aggregate cursor is divided into separate portions, each of which can be independently controlled by the user. | 04-07-2011 |
20110298689 | Device for Sharing Photographs in Social Settings - A device for sharing photographs in social settings is described. In an example, the device comprises a display surface which extends around a vertical axis of the device such that it provides a cumulative viewing angle of greater than 180°. This enables viewers located all around the device to see images displayed. The display surface may be a continuous display or may be formed from multiple discrete displays. The images displayed comprise sets of related images which may, for example, be accessed from an online image store (such as a social networking site) or other storage device. In an example, sets of images may be displayed in the form of filmstrips, with each filmstrip comprising a set of related images associated with a different user. Where the device includes a user interaction element, detection of a user interaction changes the images that are displayed. | 12-08-2011 |
20110302522 | Sketching and Searching Application for Idea Generation - A sketching and searching application for idea generation is described. In an embodiment, a software application is described which has a user interface comprising a sketching area. When a user draws or annotates a sketch in the application, the application automatically searches for images based on the sketch and displays results in the form of images outside the sketching area. These images are used to inspire new ideas and to facilitate the creative process in a way that is closely linked with the sketching process. When the sketch is updated, additional searching is automatically performed and new results are displayed. In some examples the sketching area is deformable, and deformation may cause new results to be displayed. In some examples the user is able to drag image results into the sketching area to enable tracing of the image or to include the image in the sketch. | 12-08-2011 |
20120081275 | Media Display Device - A media display device is described. In an embodiment the media display device comprises a display screen and at least one loudspeaker held in a housing rotatably mounted on a lid. For example, in a one-handed operation a user is able to rotate the housing to open the device and reveal the display screen, which is held upright using the lid as a stand. For example, the action of opening the device is detected by a sensor and triggers the device to randomly select an item of media content and display it. For example, images, audio clips, contacts or other items that a user has not opened for some time are presented. The device may randomly select the media type in some embodiments. In an example the sensor is provided by a rotary encoder which also provides part of a hinge for mounting the housing and lid. | 04-05-2012 |
20120105312 | User Input Device - A user input device is described. In an embodiment the user input device is hand held and comprises a sensing strip which detects one-dimensional motion of a user's finger or thumb along the strip as well as the position of the finger or thumb on it. In an embodiment the sensed data is used for cursor movement and/or text input at a master device. In an example the user input device has an orientation sensor, and the orientation of the device influences the orientation of a cursor. For example, a user may move the cursor in a straight line in the pointing direction of the cursor by sliding a finger or thumb along the sensing strip. In an example, an alphabetical scale is displayed and a user is able to zoom into the scale and select letters for text input using the sensing strip. | 05-03-2012 |
20130162653 | Creating Animations - Animation creation is described, for example, to enable children to create, record and play back stories. In an embodiment, one or more children are able to create animation components such as characters and backgrounds using a multi-touch panel display together with an image capture device. For example, a graphical user interface is provided at the multi-touch panel display to enable the animation components to be edited. In an example, children narrate a story whilst manipulating animation components using the multi-touch display panel, and the sound and visual display are recorded. In embodiments image analysis is carried out automatically and used to autonomously modify story components during a narration. In examples, various types of handheld view-finding frames are provided for use with the image capture device. In embodiments saved stories can be restored from memory and retold from any point with different manipulations and narration. | 06-27-2013 |
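The camera-based multi-touch mouse entry (20100265178) describes processing camera images to extract touch points for objects on the working surface. The patent does not publish its algorithm; the sketch below is only an illustrative assumption of how such extraction is commonly done: bright pixels (reflections from fingers) are grouped into connected components and each component's centroid is reported as a touch point. The frame data and threshold are made up for the example.

```python
# Hypothetical touch-point extraction from a grayscale camera frame.
# NOT the patented implementation -- a generic threshold + connected-
# components + centroid approach, shown for illustration only.

def extract_touch_points(frame, threshold=128):
    """Return (row, col) centroids of bright blobs in a 2-D frame."""
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    points = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] >= threshold and not seen[r][c]:
                # Flood-fill one connected component of bright pixels.
                stack, blob = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and frame[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                # The blob centroid approximates the finger position.
                points.append((sum(y for y, _ in blob) / len(blob),
                               sum(x for _, x in blob) / len(blob)))
    return points

# Two bright 2x2 blobs on a dark 6x6 frame -> two touch points.
frame = [[0] * 6 for _ in range(6)]
for y, x in [(1, 1), (1, 2), (2, 1), (2, 2),
             (4, 4), (4, 5), (5, 4), (5, 5)]:
    frame[y][x] = 255
print(extract_touch_points(frame))  # [(1.5, 1.5), (4.5, 4.5)]
```

Tracking centroids across successive frames would then yield the gesture data the abstract mentions.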
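Similarly, the proximity-sensing pointing device entry (20100315336) describes analyzing a sequence of sensor images to determine object movement relative to the device. As a minimal sketch, under assumed data shapes (binary sensor images as lists of lists) and not drawn from the patent itself, movement can be estimated by comparing the centroid of the detected object between consecutive frames:

```python
# Hypothetical frame-to-frame movement estimate for a proximity sensor.
# Illustrative only; the patent does not disclose this exact method.

def centroid(image):
    """Centroid (row, col) of all active cells in a binary sensor image."""
    cells = [(r, c) for r, row in enumerate(image)
             for c, v in enumerate(row) if v]
    if not cells:
        return None
    return (sum(r for r, _ in cells) / len(cells),
            sum(c for _, c in cells) / len(cells))

def movement(prev_image, next_image):
    """Displacement (d_row, d_col) of the tracked object between frames."""
    a, b = centroid(prev_image), centroid(next_image)
    if a is None or b is None:
        return None  # object left (or never entered) the sensing range
    return (b[0] - a[0], b[1] - a[1])

# An object detected near the top-left moves one cell down and one right.
prev = [[1, 1, 0], [1, 1, 0], [0, 0, 0]]
nxt = [[0, 0, 0], [0, 1, 1], [0, 1, 1]]
print(movement(prev, nxt))  # (1.0, 1.0)
```

A real device would combine this per-object displacement with the movement sensor's own data sequence to separate device motion from object motion.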