Patent application number | Description | Published |
--- | --- | --- |
20090210372 | RULE-BASED PROGRAMMING LANGUAGES FOR ENTITIES IN ENVIRONMENTS - A rule-based programming language may be devised for programming an entity in an environment. Computer systems may therefore be configured to program the entity with at least one rule comprising at least zero language conditions representing an action condition, at least one language verb representing an action, and at least zero language verb parameters representing an action object. The computer system may also be configured to facilitate entry by a user of a rule set according to the rule-based programming language by receiving user selections of language conditions, language verbs, and language verb parameters. By facilitating the programming by users of entities within environments, the computer system may facilitate individuals (such as non-technical individuals, aspiring programmers, and children) in understanding programming concepts, encourage the development of experience with computer programming, and permit the generation of useful computer programs by non-proficient programmers. | 08-20-2009 |
20100245106 | Mobile Computer Device Binding Feedback - Embodiments of mobile computer device binding feedback are described. In embodiments, an application interface for a device application is displayed on a first display that is integrated in a first housing of a dual-display mobile computer device. The application interface can also be displayed on a second display that is integrated in a second housing of the dual-display mobile computer device. Binding position data is received that is associated with a binding system that movably connects the first housing and the second housing. Application context data that is associated with the device application is also received. Feedback can then be generated that correlates to the binding position data and to the application context data. | 09-30-2010 |
20100245209 | MOBILE COMPUTER DEVICE DISPLAY POSTURES - Embodiments of mobile computer device display postures are described. In embodiments, a first display is integrated in a first housing of a dual-display mobile computer device, and a second display is integrated in a second housing of the dual-display mobile computer device. Position data can be sensed from a binding that movably connects the first housing and the second housing, and a position angle can be determined between the first housing and the second housing that correlates to a display posture of the first display and the second display. | 09-30-2010 |
20110063192 | MOBILE COMPUTER DEVICE BINDING FEEDBACK - In embodiments of mobile computer device binding feedback, an application interface for a device application is displayed on a first display that is integrated in a dual-display mobile device. The application interface can also be displayed on a second display that is integrated in the dual-display mobile device. Binding position data is received from a binding system that movably couples the first display to the second display. Application context data that is associated with the device application is also received. Feedback can then be generated based on the binding position data and the application context data, where the feedback can be generated as audio feedback, video feedback, display feedback, and/or haptic feedback. | 03-17-2011 |
20110304649 | CHARACTER SELECTION - Character selection techniques are described. In implementations, a list of characters is output for display in a user interface by a computing device. An input is recognized, by the computing device, that was detected using a camera as a gesture to select at least one of the characters. | 12-15-2011 |
20120042246 | CONTENT GESTURES - Content gestures are described. In implementations, one or more controls are output to control output of content and for display in a user interface by a computing device. An input is recognized, by the computing device, which was detected using a camera as a gesture to interact with a particular one of the controls to control the output of the content. | 02-16-2012 |
20130127738 | DYNAMIC SCALING OF TOUCH SENSOR - Embodiments are disclosed that relate to dynamically scaling a mapping between a touch sensor and a display screen. One disclosed embodiment provides a method including setting a first user interface mapping that maps an area of the touch sensor to a first area of the display screen, receiving a user input from the user input device that changes a user interaction context of the user interface, and in response to the user input, setting a second user interface mapping that maps the area of the touch sensor to a second area of the display screen. The method further comprises providing to the display device an output of a user interface image representing the user input at a location based on the second user interface mapping. | 05-23-2013 |
20130328775 | User Interface Elements Positioned for Display - User interface elements positioned for display is described. In various embodiment(s), sensor input can be received from one or more sensors that are integrated with a portable device. A device hold position that corresponds to where the portable device is grasped by a user can be determined based at least in part on the sensor input. A display of user interface element(s) can then be initiated for display on an integrated display of the portable device based on the device hold position that corresponds to where the portable device is grasped. | 12-12-2013 |
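The rule structure described in application 20090210372 (zero or more conditions, one verb, zero or more verb parameters) can be sketched as a tiny interpreter. This is an illustrative sketch only: the function names, the dictionary-based rule representation, and the dispatch mechanism are assumptions for exposition, not the patented design.

```python
# Sketch of a rule-based entity language: a rule is zero or more conditions,
# one verb (the action), and zero or more verb parameters (the action object).
# All names and data shapes here are assumed, not taken from the patent.

def make_rule(conditions, verb, parameters=()):
    """Build a rule from optional conditions, a verb, and optional parameters."""
    return {"conditions": conditions, "verb": verb, "parameters": parameters}

def run_rules(entity, rules, verbs):
    """Fire every rule whose conditions all hold for the entity's current state."""
    for rule in rules:
        if all(cond(entity) for cond in rule["conditions"]):
            verbs[rule["verb"]](entity, *rule["parameters"])

# Example: an entity that moves one step per tick while it has energy.
entity = {"x": 0, "energy": 2}
verbs = {"move": lambda e, dx: e.update(x=e["x"] + dx, energy=e["energy"] - 1)}
rules = [make_rule([lambda e: e["energy"] > 0], "move", (1,))]

run_rules(entity, rules, verbs)
run_rules(entity, rules, verbs)
run_rules(entity, rules, verbs)  # condition now fails; the entity stays put
print(entity)  # {'x': 2, 'energy': 0}
```

A user-facing system in the spirit of the abstract would offer menus of conditions, verbs, and parameters rather than raw code, but the evaluation loop is the same shape.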
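The posture determination in application 20100245209 (sensing a position angle from the binding and mapping it to a display posture) amounts to classifying a hinge angle into bands. The posture names and the angle thresholds below are assumptions chosen for illustration; the patent does not specify them.

```python
# Sketch: classify a dual-screen device posture from the sensed angle between
# the two housings. Thresholds and posture names are assumed, not from the patent.

def display_posture(angle_degrees):
    """Map a binding (hinge) position angle to a display posture."""
    if angle_degrees < 10:
        return "closed"       # housings face each other
    if angle_degrees < 180:
        return "book"         # both displays face the user, like an open book
    if angle_degrees == 180:
        return "flat"         # the two displays form one continuous surface
    if angle_degrees < 350:
        return "tent"         # displays face outward
    return "folded-back"      # single-display use

print(display_posture(135))  # book
print(display_posture(180))  # flat
```

In a real device the raw sensor reading would be debounced and hysteresis applied so the posture does not flicker near a threshold.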
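The dynamic scaling in application 20130127738 (remapping the whole touch-sensor area onto a different region of the display when the interaction context changes) can be sketched as a per-context linear scaling. The sensor resolution, display size, and context names below are assumptions for illustration only.

```python
# Sketch: map the full touch-sensor area onto a context-dependent display
# region. Sensor/display dimensions and context names are assumed.

SENSOR_W, SENSOR_H = 1000, 600  # touch sensor coordinate space (assumed)

# Each UI context maps the whole sensor onto one display rectangle,
# given as (x, y, width, height) in display pixels.
MAPPINGS = {
    "full_screen": (0, 0, 1920, 1080),
    "text_entry":  (480, 700, 960, 300),  # sensor drives only an on-screen keyboard
}

def sensor_to_display(sx, sy, context):
    """Scale a raw sensor touch (sx, sy) into display coordinates for a context."""
    x, y, w, h = MAPPINGS[context]
    return (x + sx * w / SENSOR_W, y + sy * h / SENSOR_H)

print(sensor_to_display(500, 300, "full_screen"))  # (960.0, 540.0)
print(sensor_to_display(500, 300, "text_entry"))   # (960.0, 850.0)
```

Switching the active context key is the "second user interface mapping" step of the abstract: the same sensor area suddenly addresses a smaller display region, giving finer effective pointing resolution there.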