Patent application number | Description | Published |
--- | --- | --- |
20090215471 | LOCATION BASED OBJECT TRACKING - A user of a mobile device is able to display information about objects in the surrounding environment and to optionally interact with those objects. The information may be displayed as a graphical overlay on top of a real-time display of imagery from a camera in the mobile device with the overlay indexed to the real-time display. The graphical overlay may include positional information about an external object and may include navigational information intended to assist the user in moving to the object's location. There may also be a graphical user interface which allows the user to utilize the mobile device to interact with an external object. | 08-27-2009 |
20100107219 | AUTHENTICATION - CIRCLES OF TRUST - Within a surface computing environment, users are provided a seamless and intuitive manner of modifying security levels associated with information. If a modification is to be made, the user can perceive the modifications and the result of such modifications, such as on a display. When information is rendered within the surface computing environment and a condition changes, the user can quickly have that information concealed in order to mitigate unauthorized access to the information. | 04-29-2010 |
20100149090 | GESTURES, INTERACTIONS, AND COMMON GROUND IN A SURFACE COMPUTING ENVIRONMENT - Aspects relate to detecting gestures that relate to a desired action, wherein the detected gestures are common across users and/or devices within a surface computing environment. Inferred intentions and goals based on context, history, affordances, and objects are employed to interpret gestures. Where there is uncertainty in intention of the gestures for a single device or across multiple devices, independent or coordinated communication of uncertainty or engagement of users through signaling and/or information gathering can occur. | 06-17-2010 |
20100218249 | AUTHENTICATION VIA A DEVICE - The claimed subject matter provides a system and/or a method that facilitates authentication of a user in a surface computing environment. A device or authentication object can be carried by a user and employed to retain authentication information. An authentication component can obtain the authentication information from the device and analyze the information to verify an identity of the user. A touch input component can ascertain if a touch input is authentic by associating the touch input with the user. In addition, authentication information can be employed to establish a secure communications channel for transfer of user data. | 08-26-2010 |
20100225595 | TOUCH DISCRIMINATION - The claimed subject matter provides a system and/or a method that facilitates distinguishing input among one or more users in a surface computing environment. A variety of information can be obtained and analyzed to infer an association between a particular input and a particular user. Touch point information can be acquired from a surface, wherein the touch point information relates to a touch point. In addition, one or more environmental sensors can monitor the surface computing environment and provide environmental information. The touch point information and the environmental information can be analyzed to determine direction of inputs, location of users, movement of users, and so on. Individual analysis results can be correlated and/or aggregated to generate an inference of association between a touch point and a user. | 09-09-2010 |
20100240390 | Dual Module Portable Devices - A dual module portable device may be provided. A motion of a first module of the dual module portable device may be detected. Based at least in part on the detected motion, a position of the first module may be determined relative to the second module of the portable device. Once the relative position of the first module has been determined, a portion of a user interface associated with the relative position may be displayed at the first module. | 09-23-2010 |
20100241348 | Projected Way-Finding - Navigation information may be provided. First, a destination location may be received at a portable device. Next, a current location of the portable device may be detected. Then, at least one way-point may be calculated based on the current location and the destination location. An orientation and a level of the portable device may be determined, and the at least one way-point may then be projected from the portable device. | 09-23-2010 |
20100241987 | Tear-Drop Way-Finding User Interfaces - A tear-drop way-finding user interface (UI) may be provided. A first UI portion corresponding to a device location may be provided. In addition, an object may be displayed at a first relative position within the first UI portion. Then, upon a detected change in device location, a second UI portion corresponding to the changed device location may be provided. In response to the changed device location, a second relative position of the object may be calculated. Next, a determination may be made as to whether the second relative position of the object is within a displayable range of the second UI portion. If the second relative position of the object is not within the displayable range of the second UI portion, then a tear-drop icon indicative of the second relative position of the object may be displayed at an edge of the second UI portion. | 09-23-2010 |
20100241999 | Canvas Manipulation Using 3D Spatial Gestures - User interface manipulation using three-dimensional (3D) spatial gestures may be provided. A two-dimensional (2D) user interface (UI) representation may be displayed. A first gesture may be performed, and, in response to the first gesture's detection, the 2D UI representation may be converted into a 3D UI representation. A second gesture may then be performed, and, in response to the second gesture's detection, the 3D UI representation may be manipulated. Finally, a third gesture may be performed, and, in response to the third gesture's detection, the 3D UI representation may be converted back into the 2D UI representation. | 09-23-2010 |
20100302144 | CREATING A VIRTUAL MOUSE INPUT DEVICE - A virtual mouse input device is created in response to a placement of a card on a touch surface. When the card is placed on the touch surface, the boundaries of the card are captured and a virtual mouse appears around the card. The virtual mouse may be linked with a user through an identifier that is contained on the card. Other controls and actions may be presented in menus that appear with the virtual mouse. For instance, the user may select the type of input (e.g. mouse, keyboard, ink or trackball) driven by the business card. Once created, the virtual mouse is configured to receive user input until the card is removed from the touch surface. The virtual mouse is configured to move a cursor on a display in response to movement of the card on the touch surface. | 12-02-2010 |
20100302155 | VIRTUAL INPUT DEVICES CREATED BY TOUCH INPUT - An input device is created on a touch screen in response to a user's placement of their hand. When a user places their hand on the touch screen, an input device sized for their hand is dynamically created. Alternatively, some other input device may be created. For example, when the user places two hands on the device, a split keyboard input device may be dynamically created on the touch screen that is split between the user's hand locations. Once the input device is determined, the user may enter input through the created device on the touch screen. The input devices may be configured for each individual user such that the display of the input device changes based on physical characteristics that are associated with the user. | 12-02-2010 |
20100306649 | VIRTUAL INKING USING GESTURE RECOGNITION - A virtual inking device is created in response to a touch input device detecting a user's inking gesture. For example, when a user places one of their hands in a pen gesture (i.e. by connecting the index finger with the thumb while holding the other fingers near the palm), the user may perform inking operations. When the user changes the pen gesture to an erase gesture (i.e. making a fist), then the virtual pen may become a virtual eraser. Other inking gestures may also be utilized. | 12-02-2010 |
20120139939 | Dual Module Portable Devices - A dual module portable device may be provided. A motion of a first module of the dual module portable device may be detected. Based at least in part on the detected motion, a position of the first module may be determined relative to the second module of the portable device. Once the relative position of the first module has been determined, a portion of a user interface associated with the relative position may be displayed at the first module. | 06-07-2012 |
20130234992 | Touch Discrimination - In some implementations, a touch point on a surface of a touchscreen device may be determined. An image of a region of space above the surface and surrounding the touch point may be determined. The image may include a brightness gradient that captures a brightness of objects above the surface. A binary image that includes one or more binary blobs may be created based on a brightness of portions of the image. A determination may be made as to which of the one or more binary blobs are connected to each other to form portions of a particular user. A determination may be made that the particular user generated the touch point. | 09-12-2013 |
20140344706 | Dual Module Portable Devices - A dual module portable device may be provided. A motion of a first module of the dual module portable device may be detected. Based at least in part on the detected motion, a position of the first module may be determined relative to the second module of the portable device. Once the relative position of the first module has been determined, a portion of a user interface associated with the relative position may be displayed at the first module. | 11-20-2014 |
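
The tear-drop way-finding behavior described in application 20100241987 reduces to a small geometric check: keep an object's marker at its true relative position while it fits inside the displayable range, and otherwise pin a tear-drop icon to the edge of the view in the object's direction. The Python sketch below illustrates that idea under simplifying assumptions (a flat, device-centered rectangular view; illustrative function and type names); it is not taken from the application itself.

```python
# A minimal sketch of the tear-drop way-finding idea from application
# 20100241987, assuming a flat 2D map view centered on the device; the
# coordinates, view size, and names here are illustrative only.
from dataclasses import dataclass
import math

@dataclass
class Point:
    x: float
    y: float

def place_object_marker(device: Point, obj: Point, half_width: float, half_height: float):
    """Return (position, is_tear_drop) for an object relative to the device.

    If the object falls inside the displayable range (a rectangle centered
    on the device), its true relative position is returned. Otherwise the
    marker is clamped to the edge of the view so a tear-drop icon can point
    toward the off-screen object.
    """
    # Relative position of the object in the device-centered view.
    rel = Point(obj.x - device.x, obj.y - device.y)

    if abs(rel.x) <= half_width and abs(rel.y) <= half_height:
        return rel, False

    # Clamp along the ray from the view center toward the object so the
    # tear-drop sits on the view edge in the object's direction.
    scale = min(half_width / abs(rel.x) if rel.x else math.inf,
                half_height / abs(rel.y) if rel.y else math.inf)
    return Point(rel.x * scale, rel.y * scale), True

# Example: an object 500 m east and 80 m north of the device, with a view
# that only shows 200 m in each direction, is rendered as a tear-drop on
# the right edge of the view at (200, 32).
pos, tear_drop = place_object_marker(Point(0, 0), Point(500, 80), 200, 200)
```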
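
Application 20100241999 describes a gesture-driven cycle between 2D and 3D canvas representations, which can be thought of as a small state machine: one gesture lifts the 2D UI into 3D, further gestures manipulate the 3D representation, and a closing gesture flattens it back to 2D. The sketch below models only that transition logic; the gesture names ("lift", "rotate", "flatten") and the rotation-based manipulation are illustrative assumptions, and gesture recognition itself is out of scope.

```python
# A toy state machine for the gesture-driven 2D/3D canvas switching
# described in application 20100241999; gesture names are assumptions.
class CanvasUI:
    def __init__(self):
        self.mode = "2D"       # current representation of the canvas
        self.rotation = 0.0    # example manipulation applied in 3D mode

    def on_gesture(self, gesture: str) -> None:
        if gesture == "lift" and self.mode == "2D":
            self.mode = "3D"                          # first gesture: 2D -> 3D
        elif gesture == "rotate" and self.mode == "3D":
            self.rotation += 15.0                     # second gesture: manipulate the 3D UI
        elif gesture == "flatten" and self.mode == "3D":
            self.mode, self.rotation = "2D", 0.0      # third gesture: back to 2D

ui = CanvasUI()
for g in ("lift", "rotate", "rotate", "flatten"):
    ui.on_gesture(g)
```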
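
The touch-discrimination pipeline in application 20130234992 (threshold the image of the space above the surface into binary blobs, work out which blobs connect to form a single user's hand and arm, then attribute the touch point to that user) can be approximated with a plain connected-component search. This Python sketch assumes the thresholding has already produced a binary grid and that each user is identified by the border cells on their side of the surface; the grid, the 4-connectivity, and the user-edge heuristic are all assumptions for illustration, not the claimed implementation.

```python
# A rough sketch of the touch-discrimination idea from application
# 20130234992, working on an already-thresholded binary grid
# (1 = something bright above the surface, 0 = background).
from collections import deque

def connected_blob(binary, start):
    """Flood-fill the blob containing `start` (row, col) and return the set
    of cells it covers; 4-connectivity is assumed."""
    rows, cols = len(binary), len(binary[0])
    if not binary[start[0]][start[1]]:
        return set()
    seen, queue = {start}, deque([start])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols and binary[nr][nc] and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append((nr, nc))
    return seen

def attribute_touch(binary, touch_point, user_edges):
    """Attribute a touch point to the user whose side of the surface the
    connected arm/hand blob reaches; `user_edges` maps user id -> border cells."""
    blob = connected_blob(binary, touch_point)
    for user, edge_cells in user_edges.items():
        if blob & edge_cells:
            return user
    return None

# Example: a blob running from the bottom edge (in front of "alice") up to
# the touch point at (1, 1), so the touch is attributed to "alice".
grid = [
    [0, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 1, 0],
]
who = attribute_touch(grid, touch_point=(1, 1),
                      user_edges={"alice": {(3, 2)}, "bob": {(0, 0)}})
```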