Patent application number | Description | Published |
20090276734 | Projection of Images onto Tangible User Interfaces - A surface computing device is described which has a surface which can be switched between transparent and diffuse states. When the surface is in its diffuse state, an image can be projected onto the surface and when the surface is in its transparent state, an image can be projected through the surface and onto an object. In an embodiment, the image projected onto the object is redirected onto a different face of the object, so as to provide an additional display surface or to augment the appearance of the object. In another embodiment, the image may be redirected onto another object. | 11-05-2009 |
20100145920 | Digital Media Retrieval and Display - Retrieval and display of digital media items is described. For example, the digital media items may be photographs, videos, audio files, emails, text documents or parts of these. In an embodiment a dedicated apparatus having a touch display screen is provided in a form designed to look like a domestic fish tank. In an embodiment graphical animated agents are depicted on the display as fish whose motion varies according to at least one behavior parameter which is pseudo-random. In embodiments, the agents have associated search criteria and when a user selects one or more agents the associated search criteria are used in a retrieval operation to retrieve digital media items from a store. In some embodiments media items are communicated between the apparatus and a portable communications device using a communications link established by tapping the portable device against the media retrieval and display apparatus. | 06-10-2010 |
20100149182 | Volumetric Display System Enabling User Interaction - A volumetric display system which enables user interaction is described. In an embodiment, the system consists of a volumetric display and an optical system. The volumetric display creates a 3D light field of an object to be displayed and the optical system creates a copy of the 3D light field in a position away from the volumetric display and where a user can interact with the image of the object displayed. In an embodiment, the optical system involves a pair of parabolic mirror portions. | 06-17-2010 |
20100242274 | DETECTING TOUCH ON A CURVED SURFACE - Embodiments are disclosed herein that are related to input devices with curved multi-touch surfaces. For example, in one disclosed embodiment, a method of making a multi-touch input device having a curved touch-sensitive surface comprises forming on a substrate an array of sensor elements defining a plurality of pixels of the multi-touch sensor, forming the substrate into a shape that conforms to a surface of the curved geometric feature of the body of the input device, and fixing the substrate to the curved geometric feature of the body of the input device. | 09-30-2010 |
20100245246 | DETECTING TOUCH ON A CURVED SURFACE - Embodiments are disclosed herein that are related to input devices with curved multi-touch surfaces. One disclosed embodiment comprises a touch-sensitive input device having a curved geometric feature comprising a touch sensor, the touch sensor comprising an array of sensor elements integrated into the curved geometric feature and being configured to detect a location of a touch made on a surface of the curved geometric feature. | 09-30-2010 |
20100265178 | CAMERA-BASED MULTI-TOUCH MOUSE - Technologies are described for a camera-based multi-touch input device operable to provide conventional mouse movement data as well as three-dimensional multi-touch data. Such a device is based on an internal camera focused on a mirror or set of mirrors enabling the camera to image the inside of a working surface of the device. The working surface allows light to pass through. An internal light source illuminates the inside of the working surface and reflects off of any objects proximate to the outside of the device. This reflected light is received by the mirror and then directed to the camera. Images from the camera can be processed to extract touch points corresponding to the position of one or more objects outside the working surface as well as to detect gestures performed by the objects. Thus the device can provide conventional mouse functionality as well as three-dimensional multi-touch functionality. | 10-21-2010 |
20100302199 | FERROMAGNETIC USER INTERFACES - Ferromagnetic user interfaces are described. In embodiments, user interface devices are described that can detect the location of movement on a user-touchable portion by sensing movement of a ferromagnetic material. In some embodiments, sensors are arranged in a two-dimensional array, and the user interface device can determine the location of the movement in a plane substantially parallel to the two-dimensional array and the acceleration of movement substantially perpendicular to the two-dimensional array. In other embodiments, user interface devices are described that can cause a raised surface region to be formed on a ferrofluid layer of a user-touchable portion, which is detectable by the touch of a user. Embodiments describe how the raised surface region can be moved on the ferrofluid layer. Embodiments also describe how the raised surface region can be caused to vibrate. | 12-02-2010 |
20100315335 | Pointing Device with Independently Movable Portions - A pointing device with independently movable portions is described. In an embodiment, a pointing device comprises a base unit and a satellite portion. The base unit is arranged to be located under a palm of a user's hand and be movable over a supporting surface. The satellite portion is arranged to be located under a digit of the user's hand and be independently movable over the supporting surface relative to the base unit. In embodiments, data from at least one sensing device is read, and movement of both the base unit and the independently movable satellite portion of the pointing device is calculated from the data. The movement of the base unit and the satellite portion is analyzed to detect a user gesture. | 12-16-2010 |
20100315336 | Pointing Device Using Proximity Sensing - A pointing device using proximity sensing is described. In an embodiment, a pointing device comprises a movement sensor and a proximity sensor. The movement sensor generates a first data sequence relating to sensed movement of the pointing device relative to a surface. The proximity sensor generates a second data sequence relating to sensed movement relative to the pointing device of one or more objects in proximity to the pointing device. In embodiments, data from the movement sensor of the pointing device is read and the movement of the pointing device relative to the surface is determined. Data from the proximity sensor is also read, and a sequence of sensor images of one or more objects in proximity to the pointing device is generated. The sensor images are analyzed to determine the movement of the one or more objects relative to the pointing device. | 12-16-2010 |
20100315413 | Surface Computer User Interaction - Surface computer user interaction is described. In an embodiment, an image of a user's hand interacting with a user interface displayed on a surface layer of a surface computing device is captured. The image is used to render a corresponding representation of the hand. The representation is displayed in the user interface such that the representation is geometrically aligned with the user's hand. In embodiments, the representation is a representation of a shadow or a reflection. The process is performed in real-time, such that movement of the hand causes the representation to correspondingly move. In some embodiments, a separation distance between the hand and the surface is determined and used to control the display of an object rendered in a 3D environment on the surface layer. In some embodiments, at least one parameter relating to the appearance of the object is modified in dependence on the separation distance. | 12-16-2010 |
20110080341 | Indirect Multi-Touch Interaction - Indirect multi-touch interaction is described. In an embodiment, a user interface is controlled using a cursor and a touch region comprising a representation of one or more digits of a user. The cursor and the touch region are moved together in the user interface in accordance with data received from a cursor control device, such that the relative location of the touch region and the cursor is maintained. The representations of the digits of the user are moved in the touch region in accordance with data describing movement of the user's digits. In another embodiment, a user interface is controlled in a first mode of operation using an aggregate cursor, and switched to a second mode of operation in which the aggregate cursor is divided into separate portions, each of which can be independently controlled by the user. | 04-07-2011 |
20110210917 | User Interface Control Using a Keyboard - User interface control using a keyboard is described. In an embodiment, a user interface displayed on a display device is controlled using a computer connected to a keyboard. The keyboard has a plurality of alphanumeric keys that can be used for text entry. The computer receives data comprising a sequence of key-presses from the keyboard, and generates for each key-press a physical location on the keyboard. The relative physical locations of the key-presses are compared to calculate a movement path over the keyboard. The movement path describes the path of a user's digit over the keyboard. The movement path is mapped to a sequence of coordinates in the user interface, and the movement of an object displayed in the user interface is controlled in accordance with the sequence of coordinates. | 09-01-2011 |
20110214053 | Assisting Input From a Keyboard - Assisting input from a keyboard is described. In an embodiment, a processor receives a plurality of key-presses from the keyboard comprising alphanumeric data for input to application software executed at the processor. The processor analyzes the plurality of key-presses to detect at least one predefined typing pattern, and, in response, controls a display device to display a representation of at least a portion of the keyboard in association with a user interface of the application software. In another embodiment, a computer device has a keyboard and at least one sensor arranged to monitor at least a subset of keys on the keyboard, and detect an object within a predefined distance of a selected key prior to activation of the selected key. The processor then controls the display device to display a representation of a portion of the keyboard comprising the selected key. | 09-01-2011 |
20110227947 | Multi-Touch User Interface Interaction - Multi-touch user interface interaction is described. In an embodiment, an object in a user interface (UI) is manipulated by a cursor and a representation of a plurality of digits of a user. At least one parameter, which comprises the cursor location in the UI, is used to determine that multi-touch input is to be provided to the object. Responsive to this, the relative movement of the digits is analyzed and the object manipulated accordingly. In another embodiment, an object in a UI is manipulated by a representation of a plurality of digits of a user. Movement of each digit by the user moves the corresponding representation in the UI, and the movement velocity of the representation is a non-linear function of the digit's velocity. After determining that multi-touch input is to be provided to the object, the relative movement of the representations is analyzed and the object manipulated accordingly. | 09-22-2011 |
20110252163 | Integrated Development Environment for Rapid Device Development - An integrated development environment for rapid device development is described. In an embodiment the integrated development environment provides a number of different views to a user which each relate to a different aspect of device design, such as hardware configuration, software development and physical design. The device, which may be a prototype device, is formed from a number of objects which are selected from a database and the database stores multiple data types for each object, such as a 3D model, software libraries and code-stubs for the object and hardware parameters. A user can design the device by selecting different views in any order and can switch between views as they choose. Changes which are made in one view, such as the selection of a new object, are fed into the other views. | 10-13-2011 |
20120038891 | Projection of Images onto Tangible User Interfaces - The techniques described herein provide a surface computing device that includes a surface layer configured to be in a transparent state and a diffuse state. In the diffuse state, an image can be projected onto the surface. In the transparent state, an image can be projected through the surface. | 02-16-2012 |
20120072626 | Automatic Addressing Protocol for a Shared Bus - An automatic addressing protocol for a shared bus is described. In an embodiment, devices connected in a chain by a shared bus are also connected by an independent electrical connection between each pair of neighboring devices. A protocol which is independent of that used on the shared bus is used over these independent electrical connections. Each device in the chain receives at least one device ID from its upstream neighbor via the independent electrical connection and either uses this received ID as its ID or uses the received ID to compute its ID. Where the device has a downstream neighbor, it then transmits at least one device ID to that neighbor via the independent electrical connection; this transmitted ID may be its own ID or an ID generated from it, for example by incrementing the ID by one. The process is repeated by devices along the chain. | 03-22-2012 |
20120081275 | Media Display Device - A media display device is described. In an embodiment the media display device comprises a display screen and at least one loudspeaker held in a housing rotatably mounted on a lid. For example, in a one-handed operation a user is able to rotate the housing to open the device and reveal the display screen, which is held upright using the lid as a stand. For example, the action of opening the device is detected by a sensor and triggers the device to randomly select an item of media content and display it. For example, images, audio clips, contacts or other items that a user has not opened for some time are presented. The device may randomly select the media type in some embodiments. In an example the sensor is provided by a rotary encoder which also provides part of a hinge for mounting the housing and lid. | 04-05-2012 |
20120105312 | User Input Device - A user input device is described. In an embodiment the user input device is hand held and comprises a sensing strip to detect one-dimensional motion of a user's finger or thumb along the sensing strip and to detect the position of a user's finger or thumb on the sensing strip. In an embodiment the sensed data is used for cursor movement and/or text input at a master device. In an example the user input device has an orientation sensor and orientation of the device influences orientation of a cursor. For example, a user may move the cursor in a straight line in the pointing direction of the cursor by sliding a finger or thumb along the sensing strip. In an example, an alphabetical scale is displayed and a user is able to zoom into the scale and select letters for text input using the sensing strip. | 05-03-2012 |
20120139841 | User Interface Device With Actuated Buttons - A user interface device with actuated buttons is described. In an embodiment, the user interface device comprises two or more buttons and the motion of the buttons is controlled by actuators under software control such that their motion is inter-related. The position or motion of the buttons may provide a user with feedback about the current state of a software program they are using or provide them with enhanced user input functionality. In another embodiment, the ability to move the buttons is used to reconfigure the user interface buttons and this may be performed dynamically, based on the current state of the software program, or may be performed dependent upon the software program being used. The user interface device may be a peripheral device, such as a mouse or keyboard, or may be integrated within a computing device such as a games device. | 06-07-2012 |
20120139897 | Tabletop Display Providing Multiple Views to Users - A tabletop display providing multiple views to users is described. In an embodiment the display comprises a rotatable view-angle restrictive filter and a display system. The display system displays a sequence of images synchronized with the rotation of the filter to provide multiple views according to viewing angle. These multiple views provide a user with a 3D display or with personalized content which is not visible to a user at a sufficiently different viewing angle. In some embodiments, the display comprises a diffuser layer on which the sequence of images is displayed. In further embodiments, the diffuser is switchable between a diffuse state when images are displayed and a transparent state when imaging beyond the surface can be performed. The device may form part of a tabletop comprising a touch-sensitive surface. Detected touch events and images captured through the surface may be used to modify the images being displayed. | 06-07-2012 |
20120309531 | SENSING FLOOR FOR LOCATING PEOPLE AND DEVICES - A sensing floor to locate people and devices is described. In an embodiment, the sensing floor (or sensing surface), is formed from a flexible substrate on which a number of distributed sensing elements and connections between sensing elements are formed in a conductive material. In an example, these elements and connections may be printed onto the flexible substrate. The sensing floor operates in one or more modes in order to detect people in proximity to the floor. In passive mode, the floor detects signals from the environment, such as electric hum, which are coupled into a sensing element when a person stands on the sensing element. In active mode, one sensing element transmits a signal which is detected in another sensing element when a person bridges those two elements. In hybrid mode, the floor switches between passive and active mode, for example, on detection of a person in passive mode. | 12-06-2012 |
20130007192 | DEVICE SENSOR AND ACTUATION FOR WEB PAGES - An embedded device sensor and actuation web page access system and method are described for providing a web application (such as a web page) with access to sensor data about an embedded device and to actuation mechanisms (such as vibration) associated with the device. The system and method can use the sensor data to obtain context information about the embedded device and understand what a user of the device is doing at any given moment. The sensor data can be used by the web application to influence how content is served up to the user. In some embodiments, the sensor data is provided to the web server using the headers in HTTP requests. Moreover, actuation commands for actuation mechanisms on the embedded device are provided using the headers of HTTP responses. Embodiments of the system and method provide a website with access to sensor data and actuation commands without changing website operation. | 01-03-2013 |
20130007700 | CODE SUGGESTIONS - Code suggestion technique embodiments are presented that improve the productivity of a programmer by assisting in both the writing of code and in debugging the code as it is being written. In general, this is accomplished by automating a search of a database of the past work and problem solving activities of programmers to make suggestions to a programmer currently writing code. For example, as a programmer enters code, suggested ways of finishing a line or code section are presented based on how previous programmers finished a similar line or code section. Another example involves a programmer who encounters an error message while writing code. In such a case, the programmer is provided with a suggested fix or fixes, based on the actions taken by previous developers when encountering a similar problem. | 01-03-2013 |
20130275957 | CUSTOMIZING APPLIANCES - Configuring appliances is described. In an embodiment an appliance, for example, a domestic appliance can have both a physical user interface, for example the buttons on the appliance and a remote user interface. In various embodiments a user can access the remote user interface by connecting to a network interface associated with the appliance using a client device. The client device can display the remote user interface of the appliance and the user can use the remote user interface to configure settings and functions of the appliance and of the physical and remote user interfaces. In various embodiments the remote user interface can be used in combination with the physical user interface. In various embodiments the remote user interface is a development environment which enables the user to change the functionality of the appliance by altering or replacing program files which are executed at the appliance. | 10-17-2013 |
20130290910 | USER INTERFACE CONTROL USING A KEYBOARD - User interface control using a keyboard is described. In an embodiment, a user interface displayed on a display device is controlled using a computer connected to a keyboard. The keyboard has a plurality of alphanumeric keys that can be used for text entry. The computer receives data comprising a sequence of key-presses from the keyboard, and generates for each key-press a physical location on the keyboard. The relative physical locations of the key-presses are compared to calculate a movement path over the keyboard. The movement path describes the path of a user's digit over the keyboard. The movement path is mapped to a sequence of coordinates in the user interface, and the movement of an object displayed in the user interface is controlled in accordance with the sequence of coordinates. | 10-31-2013 |
20140206451 | RECONFIGURABLE CLIP-ON MODULES FOR MOBILE COMPUTING DEVICES - A set of reconfigurable clip-on modules for mobile computing devices includes two or more modules; at least one of the modules has an input button or other control, and at least one of the modules can communicate with the computing device without needing to be connected to it via a wire. The input button is mapped to a user input in a program, such as a game, which is running or displayed on the computing device to which the modules are clipped. In an embodiment, user inputs via the buttons or other controls on the clip-on modules are mapped to user inputs in a game running on the device, which may be a touch-screen device, and the mapping between user inputs via the buttons and user inputs in the game may change dependent upon the game being played, user preference, or other criteria. | 07-24-2014 |
20140244715 | INTERACTION BETWEEN DEVICES DISPLAYING APPLICATION STATUS INFORMATION - Methods and apparatus for displaying dynamic status information on a plurality of devices and enabling interactions between these devices are described. In an embodiment, a trigger signal is sent to one or more computing devices to trigger the launch of an application client on the computing device. The trigger signal is generated on another device in response to a user interacting with the displayed status information. This other device may be an impoverished device which displays status information for an application but is not capable of running the application client. In various embodiments, the status information is displayed in the form of a GUI element called a tile and this status information may be pushed to the device by a proxy server. The trigger signal may be sent to multiple devices or in some embodiments, a computing device may be selected to receive the trigger signal. | 08-28-2014 |
20150033167 | RECONFIGURABLE CLIP-ON MODULES FOR MOBILE COMPUTING DEVICES - A set of reconfigurable clip-on modules for mobile computing devices includes two or more modules; at least one of the modules has an input button or other control, and at least one of the modules can communicate with the computing device without needing to be connected to it via a wire. The input button is mapped to a user input in a program, such as a game, which is running or displayed on the computing device to which the modules are clipped. In an embodiment, user inputs via the buttons or other controls on the clip-on modules are mapped to user inputs in a game running on the device, which may be a touch-screen device, and the mapping between user inputs via the buttons and user inputs in the game may change dependent upon the game being played, user preference, or other criteria. | 01-29-2015 |
20150057784 | OPTIMIZING 3D PRINTING USING SEGMENTATION OR AGGREGATION - 3D printing may be optimized by segmenting input jobs and/or combining parts of input jobs together. In an embodiment, a user-defined metric associated with each input job is received, and this is used, along with the printing envelope and other characteristics of the 3D printers used, in scheduling input jobs to optimize the latency and/or throughput of the 3D printing process. In various embodiments, the scheduling may comprise dividing a 3D object into a number of parts and then scheduling these parts separately and/or combining 3D objects, or parts of 3D objects, from various input jobs to be printed at the same time on the same 3D printer. In various embodiments, the scheduling is repeated when a new input job is received and changes are made during printing. In various embodiments, a user may submit an updated version of an input job which is already in the process of being printed. | 02-26-2015 |
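The movement-path idea in applications 20110210917 and 20130290910 (comparing the physical locations of successive key-presses to build a path, then mapping that path to user-interface coordinates) can be illustrated with a toy sketch. This is not code from the applications: the key layout, spacing and scale factor below are invented for illustration.

```python
# Toy sketch of keyboard-based pointer control: each key-press is looked up in a
# table of physical key positions, and the resulting path is scaled into UI
# coordinates. Positions (in cm) cover only a fragment of one keyboard row and
# are hypothetical.
KEY_POSITIONS = {"a": (0.0, 0.0), "s": (1.9, 0.0), "d": (3.8, 0.0), "f": (5.7, 0.0)}

def movement_path(key_sequence, scale=10.0):
    """Map a sequence of key-presses to a path of UI coordinates."""
    return [(x * scale, y * scale)
            for x, y in (KEY_POSITIONS[k] for k in key_sequence)]
```

Sliding a digit across the keys "a", "s", "d", "f" would thus produce four rightward-moving UI coordinates that can drive an on-screen object.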
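The daisy-chain ID assignment described in application 20120072626 can be sketched as a short simulation: each device adopts the ID received from its upstream neighbor and forwards an incremented ID downstream. The increment-by-one rule comes from the abstract's own example; the function and parameter names are invented, and a real implementation would exchange IDs over the independent electrical connections rather than in a loop.

```python
# Simulation of ID propagation along a chain of bus devices, per the
# increment-by-one example in the abstract. Device 0 is the head of the chain.
def assign_ids(num_devices: int, first_id: int = 1) -> list[int]:
    """Return the ID each device in the chain would adopt."""
    ids = []
    next_id = first_id
    for _ in range(num_devices):
        ids.append(next_id)   # device adopts the ID received from upstream
        next_id = next_id + 1  # and transmits ID + 1 to its downstream neighbor
    return ids
```

For a chain of four devices starting at ID 1 this yields IDs 1 through 4, after which every device can be addressed individually on the shared bus.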
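The hybrid sensing mode in application 20120309531 (idle in passive mode detecting coupled ambient signals, switch to active mode once a person is detected) can be sketched as a small state machine. The threshold value and the string-based mode labels are illustrative assumptions, not details from the application.

```python
# State machine for the hybrid floor-sensing mode: passive detection of ambient
# "hum" coupled through a person, switching to active transmit/receive sensing
# on detection. Threshold is a hypothetical normalized signal level.
PASSIVE_THRESHOLD = 0.5

def hybrid_mode_step(mode, passive_signal, active_link_bridged):
    """Advance the mode state machine one step; returns (new_mode, detected)."""
    if mode == "passive":
        if passive_signal > PASSIVE_THRESHOLD:
            return "active", True   # person suspected: switch to active sensing
        return "passive", False
    # Active mode: a person is confirmed when they bridge two sensing elements.
    if active_link_bridged:
        return "active", True
    return "passive", False         # nothing bridged: fall back to passive mode
```

This mirrors the abstract's description of switching to active mode "on detection of a person in passive mode", and returning to passive mode when the active check finds no one.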
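Application 20130007192 describes carrying sensor data to the server in HTTP request headers and returning actuation commands in HTTP response headers. A minimal sketch of that idea follows; the `X-Sensor-*` and `X-Actuate-*` header names and the vibration command format are invented for illustration and are not specified in the abstract.

```python
# Sketch of sensor data travelling in HTTP request headers and an actuation
# command (vibration) coming back in HTTP response headers. Header names are
# hypothetical.
def build_request_headers(accel, ambient_light):
    """Pack sensor readings into HTTP request headers."""
    return {
        "X-Sensor-Accelerometer": ",".join(f"{v:.2f}" for v in accel),
        "X-Sensor-Ambient-Light": str(ambient_light),
    }

def parse_actuation(response_headers):
    """Extract a vibration command from HTTP response headers, if present."""
    cmd = response_headers.get("X-Actuate-Vibrate")
    return {"vibrate_ms": int(cmd)} if cmd else None
```

Because the data rides in headers, the request body and the page content are untouched, which is consistent with the abstract's claim that the website gains sensor access "without changing website operation".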