Patent application number | Description | Published |
--- | --- | --- |
20100242274 | DETECTING TOUCH ON A CURVED SURFACE - Embodiments are disclosed herein that are related to input devices with curved multi-touch surfaces. For example, in one disclosed embodiment, a method of making a multi-touch input device having a curved touch-sensitive surface comprises forming on a substrate an array of sensor elements defining a plurality of pixels of the multi-touch sensor, forming the substrate into a shape that conforms to a surface of the curved geometric feature of the body of the input device, and fixing the substrate to the curved geometric feature of the body of the input device. | 09-30-2010 |
20100245246 | DETECTING TOUCH ON A CURVED SURFACE - Embodiments are disclosed herein that are related to input devices with curved multi-touch surfaces. One disclosed embodiment comprises a touch-sensitive input device having a curved geometric feature comprising a touch sensor, the touch sensor comprising an array of sensor elements integrated into the curved geometric feature and being configured to detect a location of a touch made on a surface of the curved geometric feature. | 09-30-2010 |
20100302344 | ESTABLISHING EYE CONTACT IN VIDEO CONFERENCING - A video-conferencing system includes a display panel configured to form a display image for viewing by a local video conferencer, a camera configured to acquire a facial image of the local video conferencer and having an aperture oriented toward the display panel, and an array of partly reflective facets aligned in parallel against a plane disposed parallel to and forward of the display panel, each facet positioned to transmit a portion of the display image from the display panel through that facet to the local video conferencer, and to reflect a portion of the facial image of the local video conferencer to the aperture. | 12-02-2010 |
20100315335 | Pointing Device with Independently Movable Portions - A pointing device with independently movable portions is described. In an embodiment, a pointing device comprises a base unit and a satellite portion. The base unit is arranged to be located under a palm of a user's hand and be movable over a supporting surface. The satellite portion is arranged to be located under a digit of the user's hand and be independently movable over the supporting surface relative to the base unit. In embodiments, data from at least one sensing device is read, and movement of both the base unit and the independently movable satellite portion of the pointing device is calculated from the data. The movement of the base unit and the satellite portion is analyzed to detect a user gesture. | 12-16-2010 |
20100315336 | Pointing Device Using Proximity Sensing - A pointing device using proximity sensing is described. In an embodiment, a pointing device comprises a movement sensor and a proximity sensor. The movement sensor generates a first data sequence relating to sensed movement of the pointing device relative to a surface. The proximity sensor generates a second data sequence relating to sensed movement relative to the pointing device of one or more objects in proximity to the pointing device. In embodiments, data from the movement sensor of the pointing device is read and the movement of the pointing device relative to the surface is determined. Data from the proximity sensor is also read, and a sequence of sensor images of one or more objects in proximity to the pointing device is generated. The sensor images are analyzed to determine the movement of the one or more objects relative to the pointing device. | 12-16-2010 |
20110080341 | Indirect Multi-Touch Interaction - Indirect multi-touch interaction is described. In an embodiment, a user interface is controlled using a cursor and a touch region comprising a representation of one or more digits of a user. The cursor and the touch region are moved together in the user interface in accordance with data received from a cursor control device, such that the relative location of the touch region and the cursor is maintained. The representations of the digits of the user are moved in the touch region in accordance with data describing movement of the user's digits. In another embodiment, a user interface is controlled in a first mode of operation using an aggregate cursor, and switched to a second mode of operation in which the aggregate cursor is divided into separate portions, each of which can be independently controlled by the user. | 04-07-2011 |
20110099348 | CONTROLLING MEMORY VISIBILITY - Embodiments are disclosed herein that are related to controlling the visibility of a portion of memory in a hardware device. For example, one disclosed embodiment provides a hardware device comprising a communications interface configured to connect to a complementary communications interface on a computing device. The hardware device further comprises a portion of memory having a first ID configured to cause the portion of memory to be visible to a user of the computing device to which the hardware device is connected. Further still, the hardware device comprises instructions stored in the portion of memory which are executable by and transferable to the computing device to cause the installation of a computer program related to the hardware device, and to cause the portion of memory to be hidden from the user of the computing device upon transferring of the instructions to the computing device. | 04-28-2011 |
20110227947 | Multi-Touch User Interface Interaction - Multi-touch user interface interaction is described. In an embodiment, an object in a user interface (UI) is manipulated by a cursor and a representation of a plurality of digits of a user. At least one parameter, which comprises the cursor location in the UI, is used to determine that multi-touch input is to be provided to the object. Responsive to this, the relative movement of the digits is analyzed and the object manipulated accordingly. In another embodiment, an object in a UI is manipulated by a representation of a plurality of digits of a user. Movement of each digit by the user moves the corresponding representation in the UI, and the movement velocity of the representation is a non-linear function of the digit's velocity. After determining that multi-touch input is to be provided to the object, the relative movement of the representations is analyzed and the object manipulated accordingly. | 09-22-2011 |
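The abstract of 20110080341 describes a cursor and a touch region that move together in the user interface so that their relative location is maintained, while representations of the user's digits move independently within the region. A minimal sketch of that coupling follows; the class and field names are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float

class IndirectMultiTouch:
    """Hypothetical model of the indirect multi-touch scheme: the touch
    region keeps a fixed offset from the cursor, while digit
    representations move independently inside the region."""

    def __init__(self, cursor: Point, region_offset: Point):
        self.cursor = cursor                # cursor position in the UI
        self.region_offset = region_offset  # fixed offset of region from cursor
        self.digits = {}                    # digit id -> position within region

    def move_cursor(self, dx: float, dy: float) -> None:
        # Moving the cursor drags the touch region with it, preserving
        # the relative location of region and cursor.
        self.cursor = Point(self.cursor.x + dx, self.cursor.y + dy)

    def region_origin(self) -> Point:
        return Point(self.cursor.x + self.region_offset.x,
                     self.cursor.y + self.region_offset.y)

    def move_digit(self, digit_id: int, dx: float, dy: float) -> None:
        p = self.digits.get(digit_id, Point(0.0, 0.0))
        self.digits[digit_id] = Point(p.x + dx, p.y + dy)

    def digit_in_ui(self, digit_id: int) -> Point:
        # A digit's UI position = region origin + its position in the region.
        o, p = self.region_origin(), self.digits[digit_id]
        return Point(o.x + p.x, o.y + p.y)
```

Moving the cursor shifts every digit representation by the same amount, while moving a single digit only changes that digit's position inside the region.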
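The abstract of 20100315336 says the proximity sensor's data is turned into a sequence of sensor images that are analyzed to determine object movement relative to the device. The abstract does not specify the analysis; one simple possibility, sketched here purely as an assumption, is to track the intensity centroid of each image and report its shift between frames.

```python
def centroid(image):
    """Intensity-weighted centroid of a 2D proximity image
    (a list of rows of non-negative sensor readings)."""
    total = sum(v for row in image for v in row)
    cx = sum(x * v for row in image for x, v in enumerate(row)) / total
    cy = sum(y * v for y, row in enumerate(image) for v in row) / total
    return cx, cy

def object_motion(prev_image, next_image):
    """Estimate movement of a nearby object relative to the device as the
    shift of the proximity-image centroid between consecutive frames."""
    px, py = centroid(prev_image)
    nx, ny = centroid(next_image)
    return nx - px, ny - py
```

For example, a single bright pixel that moves one cell to the right between frames yields a motion estimate of (1.0, 0.0).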
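The abstract of 20110227947 states only that the movement velocity of a digit's on-screen representation is a non-linear function of the digit's velocity. A power-law transfer function is one common way to realize such a mapping; the gain and exponent below are illustrative assumptions, not values from the patent.

```python
def representation_velocity(digit_velocity: float,
                            gain: float = 1.0,
                            exponent: float = 1.5) -> float:
    """Hypothetical non-linear transfer function: slow digit motion maps
    to fine (sub-unity) on-screen motion, fast motion is amplified.
    Sign is preserved so direction of movement is unchanged."""
    sign = 1.0 if digit_velocity >= 0 else -1.0
    return sign * gain * abs(digit_velocity) ** exponent
```

With an exponent above 1, velocities below 1 unit/s are attenuated (fine positioning) and velocities above 1 unit/s are amplified (fast traversal), which is a typical motivation for non-linear pointer transfer functions.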