Patent application number | Description | Published |
20080281851 | Archive for Physical and Digital Objects - Existing tools for organizing family memories offer few possibilities for easily integrating both physical and digital materials in order to produce a single archive for a family (or other group of users). This also applies to archiving of physical objects and digital media in general (even for applications outside the field of family use). An archiving system is described which incorporates at least one image capture device, a display, a sensing apparatus arranged to detect user input associated with the display, a processor and memory, and a receptacle for holding digital media storage devices such as mobile telephones, digital cameras, personal digital assistants and the like. The image capture device is operable to capture digital images of physical objects for archiving. The receptacle comprises a data transmission apparatus for automatically transferring data to and from the digital media storage devices and optionally also a power charging apparatus. | 11-13-2008 |
20080288666 | Embedded System Development Platform - A modular development platform is described which enables creation of reliable, compact, physically robust and power efficient embedded device prototypes. The platform consists of a base module which holds the processor and one or more peripheral modules each having a peripheral device and an interface element. The modules can be electrically and physically connected together. The base module communicates with peripheral modules using packets of data with an addressing portion which identifies the peripheral module that is the intended recipient of the data packet. | 11-20-2008 |
20090002391 | Manipulation of Graphical Objects - Methods of manipulating graphical objects are described. One or more graphical objects are displayed in a fixed orientation with reference to a sensed reference direction. Manipulation is achieved by fixing the orientation or position of a displayed graphical object with reference to an apparatus, such as the display itself or a proxy device, detecting a change in orientation of that apparatus and editing the orientation of the graphical object based on the detected change. | 01-01-2009 |
20090128499 | Fingertip Detection for Camera Based Multi-Touch Systems - Touch detection systems and methods are described. The system comprises a light guiding sheet, a light source, a reflective layer and a detector. When a fingertip or other suitable object is pressed against the light guiding sheet, light which is undergoing total internal reflection within the sheet is scattered. The scattered light is reflected by the reflective layer and detected by the detector. In an embodiment, the light is infra-red light. The touch detection system may, in some embodiments, be placed on a display and the touch events used to control the display. | 05-21-2009 |
20090135751 | Low Power Operation of Networked Devices - Methods of reducing power consumption of networked devices are described. When a main processor and associated hardware in a computing device are powered down, a processing element, with lower power consumption than the main processor, performs networking functions on behalf of the main processor. The processing element monitors events and wakes the main processor when defined criteria are satisfied. In an embodiment, these network functions may be to maintain existing network connections and/or to establish new ones. The defined criteria may relate to messages received by the device, which are analyzed by the processing element running the application layer code, and these criteria may be configurable by a user of the device. | 05-28-2009 |
20090139778 | User Input Using Proximity Sensing - A device is described which enables users to interact with software running on the device through gestures made in an area adjacent to the device. In an embodiment, a portable computing device has proximity sensors arranged on an area of its surface which is not a display, such as on the sides of the device. These proximity sensors define an area of interaction adjacent to the device. User gestures in this area of interaction are detected by creating sensing images from data received from each of the sensors and then analyzing sequences of these images to detect gestures. The detected gestures may be mapped to particular inputs to a software program running on the device and therefore a user can control the operation of the program through gestures. | 06-04-2009 |
20090184921 | Input Through Sensing of User-Applied Forces - Methods and devices for providing a user input to a device through sensing of user-applied forces are described. A user applies forces to a rigid body as if to deform it and these applied forces are detected by force sensors in or on the rigid body. The resultant force on the rigid body is determined from the sensor data and this resultant force is used to identify a user input. In an embodiment, the user input may be a user input to a software program running on the device. In an embodiment the rigid body is the rigid case of a computing device which includes a display and which is running the software program. | 07-23-2009 |
20090190914 | Triggering Data Capture Based on Pointing Direction - Methods and apparatus for triggering directional data capture based on pointing direction are described. In an embodiment, the data captured is an image and a camera is described which includes a sensor for detecting the direction in which the image sensor of the camera is pointing. When the sensed pointing direction is one in which a worthwhile image is likely to be taken, the camera is triggered to capture a new image. The determination of when to capture a new image uses a metric based on the sensed direction and one or more specified trigger conditions. | 07-30-2009 |
20090195402 | Unique Identification of Devices Using Color Detection - Methods and apparatus for uniquely identifying wireless devices in close physical proximity are described. When two wireless devices are brought into close proximity, one of the devices displays an optical indicator, such as a light pattern. This device then sends messages to other devices which are within wireless range to cause them to use any light sensor to detect a signal. In an embodiment, the light sensor is a camera and the detected signal is an image captured by the camera. Each device then sends data identifying what was detected back to the device displaying the pattern. By analyzing this data, the first device can determine which other device detected the indicator that it displayed and therefore determine that this device is in close physical proximity to it. In an example, the first device is an interactive surface arranged to identify the wireless addresses of devices which are placed on the surface. | 08-06-2009 |
20090219253 | Interactive Surface Computer with Switchable Diffuser - An interactive surface computer with a switchable diffuser layer is described. The switchable layer has two states: a transparent state and a diffusing state. When it is in its diffusing state, a digital image is displayed and when the layer is in its transparent state, an image can be captured through the layer. In an embodiment, a projector is used to project the digital image onto the layer in its diffusing state and optical sensors are used for touch detection. | 09-03-2009 |
20090220093 | Distribution Of Keys For Encryption/Decryption - Methods of encryption and decryption are described which use a key associated with an event to encrypt/decrypt data associated with the event. The method of encryption comprises identifying a key associated with an event and encrypting data using the identified key. The encrypted data is then published along with details of the event. | 09-03-2009 |
20100023788 | Reducing Power Consumption by Offloading Applications - Methods of reducing power consumption in a computing device are described in which file sharing applications which are running in the background are offloaded onto a lower power subsystem and the rest of the computing device can be put into a low power state. The lower power subsystem runs application stubs which autonomously execute a subset of the operations performed by a file sharing application which was previously running on the computing device. Before the rest of the computing device goes into the low power state, application state information is passed to the lower power subsystem for use by the application stubs. In an example, the application stub may continue to download files whilst the rest of the computing device is in standby or is shut down, and the application state information may include details of the files that are to be downloaded. | 01-28-2010 |
20100149090 | GESTURES, INTERACTIONS, AND COMMON GROUND IN A SURFACE COMPUTING ENVIRONMENT - Aspects relate to detecting gestures that relate to a desired action, wherein the detected gestures are common across users and/or devices within a surface computing environment. Inferred intentions and goals based on context, history, affordances, and objects are employed to interpret gestures. Where there is uncertainty in intention of the gestures for a single device or across multiple devices, independent or coordinated communication of uncertainty or engagement of users through signaling and/or information gathering can occur. | 06-17-2010 |
20100149182 | Volumetric Display System Enabling User Interaction - A volumetric display system which enables user interaction is described. In an embodiment, the system consists of a volumetric display and an optical system. The volumetric display creates a 3D light field of an object to be displayed and the optical system creates a copy of the 3D light field in a position away from the volumetric display and where a user can interact with the image of the object displayed. In an embodiment, the optical system involves a pair of parabolic mirror portions. | 06-17-2010 |
20100182220 | SURFACE PUCK - An image orientation system is provided wherein images (rays of light) are projected to a user based on the user's field of view or viewing angle. As the rays of light are projected, streams of air can be produced that bend or focus the rays of light toward the user's field of view. The streams of air can be cold air, hot air, or combinations thereof. Further, an image receiver can be utilized to receive the produced image/rays of light directly in line with the user's field of view. The image receiver can be a wearable device, such as a head mounted display. | 07-22-2010 |
20100218249 | AUTHENTICATION VIA A DEVICE - The claimed subject matter provides a system and/or a method that facilitates authentication of a user in a surface computing environment. A device or authentication object can be carried by a user and employed to retain authentication information. An authentication component can obtain the authentication information from the device and analyze the information to verify an identity of the user. A touch input component can ascertain if a touch input is authentic by associating the touch input with the user. In addition, authentication information can be employed to establish a secure communications channel for transfer of user data. | 08-26-2010 |
20100225595 | TOUCH DISCRIMINATION - The claimed subject matter provides a system and/or a method that facilitates distinguishing input among one or more users in a surface computing environment. A variety of information can be obtained and analyzed to infer an association between a particular input and a particular user. Touch point information can be acquired from a surface wherein the touch point information relates to a touch point. In addition, one or more environmental sensors can monitor the surface computing environment and provide environmental information. The touch point information and the environmental information can be analyzed to determine direction of inputs, location of users, movement of users, and so on. Individual analysis results can be correlated and/or aggregated to generate an inference of association between a touch point and a user. | 09-09-2010 |
20100268831 | Thin Client Session Management - Thin client session management is described. In embodiments, a thin client device senses a usage context for the thin client device, and a process analyses the usage context to automatically select a session for the thin client device to connect to. Embodiments describe how the sensed usage context can indicate a location of the thin client device, movement of the thin client device, swapping of thin client devices or an identity of a user of the thin client device. Embodiments also describe how the thin client can be automatically authorized to access a selected session, based on the usage context. In other embodiments, a thin client device comprises a sensing device that can indicate a usage context for the thin client. Embodiments describe how the sensing device can determine that the thin client device is located in a docking station, and identify the docking station. | 10-21-2010 |
20100302199 | FERROMAGNETIC USER INTERFACES - Ferromagnetic user interfaces are described. In embodiments, user interface devices are described that can detect the location of movement on a user-touchable portion by sensing movement of a ferromagnetic material. In some embodiments sensors are arranged in a two dimensional array, and the user interface device can determine the location of the movement in a plane substantially parallel to the two-dimensional array and the acceleration of movement substantially perpendicular to the two-dimensional array. In other embodiments, user interface devices are described that can cause a raised surface region to be formed on a ferrofluid layer of a user-touchable portion, which is detectable by the touch of a user. Embodiments describe how the raised surface region can be moved on the ferrofluid layer. Embodiments also describe how the raised surface region can be caused to vibrate. | 12-02-2010 |
20100315335 | Pointing Device with Independently Movable Portions - A pointing device with independently movable portions is described. In an embodiment, a pointing device comprises a base unit and a satellite portion. The base unit is arranged to be located under a palm of a user's hand and be movable over a supporting surface. The satellite portion is arranged to be located under a digit of the user's hand and be independently movable over the supporting surface relative to the base unit. In embodiments, data from at least one sensing device is read, and movement of both the base unit and the independently movable satellite portion of the pointing device is calculated from the data. The movement of the base unit and the satellite portion is analyzed to detect a user gesture. | 12-16-2010 |
20100315336 | Pointing Device Using Proximity Sensing - A pointing device using proximity sensing is described. In an embodiment, a pointing device comprises a movement sensor and a proximity sensor. The movement sensor generates a first data sequence relating to sensed movement of the pointing device relative to a surface. The proximity sensor generates a second data sequence relating to sensed movement relative to the pointing device of one or more objects in proximity to the pointing device. In embodiments, data from the movement sensor of the pointing device is read and the movement of the pointing device relative to the surface is determined. Data from the proximity sensor is also read, and a sequence of sensor images of one or more objects in proximity to the pointing device are generated. The sensor images are analyzed to determine the movement of the one or more objects relative to the pointing device. | 12-16-2010 |
20100315413 | Surface Computer User Interaction - Surface computer user interaction is described. In an embodiment, an image of a user's hand interacting with a user interface displayed on a surface layer of a surface computing device is captured. The image is used to render a corresponding representation of the hand. The representation is displayed in the user interface such that the representation is geometrically aligned with the user's hand. In embodiments, the representation is a representation of a shadow or a reflection. The process is performed in real-time, such that movement of the hand causes the representation to correspondingly move. In some embodiments, a separation distance between the hand and the surface is determined and used to control the display of an object rendered in a 3D environment on the surface layer. In some embodiments, at least one parameter relating to the appearance of the object is modified in dependence on the separation distance. | 12-16-2010 |
20110016333 | Power Transfer Between Devices - Power transfer between devices such as laptop computers, mobile phones, personal digital assistants, media players and other devices is described. In an embodiment power transfer is achieved either from a power source at a device or to a power source at that device using a bidirectional power transfer connector. In some embodiments a power management module at the device uses context, models or other information to control factors such as the power transfer direction, duration and amount. In examples, user preferences are taken into account. In an example, the bidirectional power transfer connector is provided as a USB connection or a wireless power transfer connection. | 01-20-2011 |
20110121950 | UNIQUE IDENTIFICATION OF DEVICES USING COLOR DETECTION - Methods and apparatus for uniquely identifying wireless devices in close physical proximity are described. When two wireless devices are brought into close proximity, one of the devices displays an optical indicator, such as a light pattern. This device then sends messages to other devices which are within wireless range to cause them to use any light sensor to detect a signal. In an embodiment, the light sensor is a camera and the detected signal is an image captured by the camera. Each device then sends data identifying what was detected back to the device displaying the pattern. By analyzing this data, the first device can determine which other device detected the indicator that it displayed and therefore determine that this device is within close physical proximity. In an example, the first device is an interactive surface arranged to identify the wireless addresses of devices which are placed on the surface. | 05-26-2011 |
20120032982 | Manipulation of Graphical Objects - Methods of manipulating graphical objects are described. One or more graphical objects are displayed in a fixed orientation with reference to a sensed reference direction. Manipulation is achieved by fixing the orientation or position of a displayed graphical object with reference to an apparatus, such as the display itself or a proxy device, detecting a change in orientation of that apparatus and editing the orientation of the graphical object based on the detected change. | 02-09-2012 |
20120198103 | EMBEDDED SYSTEM DEVELOPMENT PLATFORM - A modular development platform is described which enables creation of reliable, compact, physically robust and power efficient embedded device prototypes. The platform consists of a base module which holds a processor and one or more peripheral modules each having an interface element. The base module and the peripheral modules may be electrically and/or physically connected together. The base module communicates with peripheral modules using packets of data with an addressing portion which identifies the peripheral module that is the intended recipient of the data packet. | 08-02-2012 |
20120242609 | Interacting With Physical and Digital Objects Via a Multi-Touch Device - Existing tools for organizing family memories offer few possibilities for easily integrating both physical and digital materials in order to produce a single archive for a family (or other group of users). This also applies to archiving of physical objects and digital media in general (even for applications outside the field of family use). An archiving system is described which incorporates at least one image capture device, a display, a sensing apparatus arranged to detect user input associated with the display, a processor and memory, and a receptacle for holding digital media storage devices such as mobile telephones, digital cameras, personal digital assistants and the like. The image capture device is operable to capture digital images of physical objects for archiving. The receptacle comprises a data transmission apparatus for automatically transferring data to and from the digital media storage devices and optionally also a power charging apparatus. | 09-27-2012 |
20120306734 | Gesture Recognition Techniques - In one or more implementations, a static geometry model is generated, from one or more images of a physical environment captured using a camera, using one or more static objects to model corresponding one or more objects in the physical environment. Interaction of a dynamic object with at least one of the static objects is identified by analyzing at least one image and a gesture is recognized from the identified interaction of the dynamic object with the at least one of the static objects to initiate an operation of the computing device. | 12-06-2012 |
20130082818 | Motion Triggered Data Transfer - Methods of controlling the transfer of data between devices are described in which the manner of control is determined by a movement experienced by at least one of the devices. The method involves detecting a triggering movement and determining a characteristic of this movement. The transfer of data is then controlled based on the characteristic which has been identified. | 04-04-2013 |
20130169687 | Manipulation of Graphical Objects - One or more graphical objects are displayed in a fixed orientation with reference to a sensed reference direction. Manipulation is achieved by fixing the orientation or position of a displayed graphical object with reference to an apparatus, such as the display itself or a proxy device, detecting a change in orientation of that apparatus and editing the orientation of the graphical object based on the detected change. | 07-04-2013 |
20130234970 | USER INPUT USING PROXIMITY SENSING - A device is described which enables users to interact with software running on the device through gestures made in an area adjacent to the device. In an embodiment, a portable computing device has proximity sensors arranged on an area of its surface which is not a display, such as on the sides of the device. These proximity sensors define an area of interaction adjacent to the device. User gestures in this area of interaction are detected by creating sensing images from data received from each of the sensors and then analyzing sequences of these images to detect gestures. The detected gestures may be mapped to particular inputs to a software program running on the device and therefore a user can control the operation of the program through gestures. | 09-12-2013 |
20130234992 | Touch Discrimination - In some implementations, a touch point on a surface of a touchscreen device may be determined. An image of a region of space above the surface and surrounding the touch point may be determined. The image may include a brightness gradient that captures a brightness of objects above the surface. A binary image that includes one or more binary blobs may be created based on a brightness of portions of the image. A determination may be made as to which of the one or more binary blobs are connected to each other to form portions of a particular user. A determination may be made that the particular user generated the touch point. | 09-12-2013 |
20130241806 | Surface Puck - An image orientation system is provided wherein images (rays of light) are projected to a user based on the user's field of view or viewing angle. As the rays of light are projected, streams of air can be produced that bend or focus the rays of light toward the user's field of view. The streams of air can be cold air, hot air, or combinations thereof. Further, an image receiver can be utilized to receive the produced image/rays of light directly in line with the user's field of view. The image receiver can be a wearable device, such as a head mounted display. | 09-19-2013 |
20140206451 | RECONFIGURABLE CLIP-ON MODULES FOR MOBILE COMPUTING DEVICES - A set of reconfigurable clip-on modules for mobile computing devices includes two or more modules and at least one of the modules has an input button or other control and at least one of the modules can communicate with the computing device without needing to be connected to it via a wire. The input button is mapped to a user input in a program, such as a game, which is running or displayed on the computing device to which the modules are clipped. In an embodiment, user inputs via the buttons or other controls on the clip-on modules are mapped to user inputs in a game running on the device, which may be a touch-screen device, and the mapping between user inputs via the buttons and user inputs in the game may change dependent upon the game being played, user preference, or other criteria. | 07-24-2014 |
20140208274 | CONTROLLING A COMPUTING-BASED DEVICE USING HAND GESTURES - Methods and systems for controlling a computing-based device using both input received from a traditional input device (e.g. keyboard) and hand gestures made on or near a reference object (e.g. keyboard). In some examples, the hand gestures may comprise one or more hand touch gestures and/or one or more hand air gestures. | 07-24-2014 |
20140241540 | WEARABLE AUDIO ACCESSORIES FOR COMPUTING DEVICES - Wearable audio accessories for computing devices are described. In one embodiment the wearable audio accessory provides a speech-based interface between the user and a nearby computing device for the performance of user-initiated or computing-device-initiated microtasks. Information is provided to the user via a loudspeaker and the user can provide input via a microphone. An audio sensing channel within the accessory continuously monitors the audio signal as detected by the microphone and in various embodiments will trigger more complex audio processing based on this monitoring. A wireless communication link is provided between the accessory and the nearby computing device. To mitigate any delay caused by the switching between audio processing techniques, the audio accessory may include a rolling buffer which continuously stores the audio signal and outputs a delayed audio signal to the audio processing engines. | 08-28-2014 |
20140244715 | INTERACTION BETWEEN DEVICES DISPLAYING APPLICATION STATUS INFORMATION - Methods and apparatus for displaying dynamic status information on a plurality of devices and enabling interactions between these devices are described. In an embodiment, a trigger signal is sent to one or more computing devices to trigger the launch of an application client on the computing device. The trigger signal is generated on another device in response to a user interacting with the displayed status information. This other device may be an impoverished device which displays status information for an application but is not capable of running the application client. In various embodiments, the status information is displayed in the form of a GUI element called a tile and this status information may be pushed to the device by a proxy server. The trigger signal may be sent to multiple devices or in some embodiments, a computing device may be selected to receive the trigger signal. | 08-28-2014 |
20140247212 | Gesture Recognition Techniques - In one or more implementations, a static geometry model is generated, from one or more images of a physical environment captured using a camera, using one or more static objects to model corresponding one or more objects in the physical environment. Interaction of a dynamic object with at least one of the static objects is identified by analyzing at least one image and a gesture is recognized from the identified interaction of the dynamic object with the at least one of the static objects to initiate an operation of the computing device. | 09-04-2014 |
20140321651 | DISTRIBUTION OF KEYS FOR ENCRYPTION/DECRYPTION - Methods of encryption and decryption are described which use a key associated with an event to encrypt/decrypt data associated with the event. The method of encryption comprises identifying a key associated with an event and encrypting data using the identified key. The encrypted data is then published along with details of the event. | 10-30-2014 |
20140327637 | INPUT THROUGH SENSING OF USER-APPLIED FORCES - Methods and devices for providing a user input to a device through sensing of user-applied forces are described. A user applies forces to a rigid body as if to deform it and these applied forces are detected by force sensors in or on the rigid body. The resultant force on the rigid body is determined from the sensor data and this resultant force is used to identify a user input. In an embodiment, the user input may be a user input to a software program running on the device. In an embodiment the rigid body is the rigid case of a computing device which includes a display and which is running the software program. | 11-06-2014 |
20150033167 | RECONFIGURABLE CLIP-ON MODULES FOR MOBILE COMPUTING DEVICES - A set of reconfigurable clip-on modules for mobile computing devices includes two or more modules and at least one of the modules has an input button or other control and at least one of the modules can communicate with the computing device without needing to be connected to it via a wire. The input button is mapped to a user input in a program, such as a game, which is running or displayed on the computing device to which the modules are clipped. In an embodiment, user inputs via the buttons or other controls on the clip-on modules are mapped to user inputs in a game running on the device, which may be a touch-screen device, and the mapping between user inputs via the buttons and user inputs in the game may change dependent upon the game being played, user preference, or other criteria. | 01-29-2015 |
20150057784 | OPTIMIZING 3D PRINTING USING SEGMENTATION OR AGGREGATION - 3D printing may be optimized by segmenting input jobs and/or combining parts of input jobs together. In an embodiment, a user-defined metric is received associated with each input job and this is used in scheduling input jobs to optimize latency and/or throughput of the 3D printing process, along with the printing envelope and other characteristics of the 3D printers used. In various embodiments, the scheduling may comprise dividing a 3D object into a number of parts and then scheduling these parts separately and/or combining 3D objects, or parts of 3D objects, from various input jobs to be printed at the same time on the same 3D printer. In various embodiments, the scheduling is repeated when a new input job is received and when changes are made during printing. In various embodiments, a user may submit an updated version of an input job which is already in the process of being printed. | 02-26-2015 |
20150084900 | REMOVABLE INPUT MODULE - A removable input module for a touch-screen device is described. The input module comprises an attachment mechanism to attach the module to the touch-screen device, one or more input controls and an accelerometer and/or magnetometer. The accelerometer and/or magnetometer are configured to provide signals to be used to determine the orientation of the input module relative to the touch-screen device and/or to another input module which is attached to the same touch-screen device. In an embodiment, the input module comprises a processor arranged to analyze the output of the accelerometer and/or magnetometer and determine the orientation of the input module. | 03-26-2015 |
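Two of the listings above (20080288666 and 20120198103) describe a base module that communicates with peripheral modules using data packets whose addressing portion identifies the intended recipient module. The following is a minimal sketch of such an addressed-packet dispatch scheme; the frame layout (one address byte, one length byte, then payload) and all names are illustrative assumptions, not details taken from the filings:

```python
# Illustrative sketch of the addressed-packet routing described in
# publications 20080288666 / 20120198103. Frame layout and names are
# assumptions for illustration only: [address byte][length byte][payload].

def encode_packet(address: int, payload: bytes) -> bytes:
    """Frame a payload for one peripheral module."""
    if not 0 <= address <= 0xFF or len(payload) > 0xFF:
        raise ValueError("address and payload length must each fit in one byte")
    return bytes([address, len(payload)]) + payload

class BaseModule:
    """Routes each incoming packet to the peripheral module it addresses."""

    def __init__(self):
        self._peripherals = {}  # address -> handler callable

    def register(self, address: int, handler):
        """Attach a peripheral module's handler at the given address."""
        self._peripherals[address] = handler

    def dispatch(self, frame: bytes):
        """Decode a frame and deliver the payload to the addressed module."""
        address, length = frame[0], frame[1]
        payload = frame[2:2 + length]
        handler = self._peripherals.get(address)
        if handler is None:
            return None  # no module registered at this address; drop packet
        return handler(payload)

# Example: a hypothetical display peripheral at address 0x10 receives
# only the packets addressed to it.
base = BaseModule()
base.register(0x10, lambda payload: payload.decode())
print(base.dispatch(encode_packet(0x10, b"hello")))  # delivered to 0x10
```

The point of the addressing portion is that every module can share one physical bus: each peripheral (or, as here, the base module on its behalf) inspects only the address byte and ignores frames meant for other modules.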