Patent application number | Description | Published |
--- | --- | --- |
20080229206 | AUDIBLY ANNOUNCING USER INTERFACE ELEMENTS - Systems, apparatus, methods and computer program products are described for using surround sound to audibly describe the user interface elements of a graphical user interface. The position of each audible description is based on the position of the user interface element in the graphical user interface. A method is provided that includes identifying one or more user interface elements that have a position within a display space. Each identified user interface element is described in surround sound, where the sound of each description is positioned based on the position of each respective user interface element relative to the display space. | 09-18-2008 |
20080303645 | Braille Support - Methods and apparatuses to provide improved Braille support are described herein. A connection to a Braille device is received, and a Braille caption panel that includes a Braille code is displayed to simulate an output to the Braille device. The Braille caption panel can include a text translated to the Braille code. The Braille caption panel can include a control element. An accessibility service can be automatically launched to provide the output to the Braille device. | 12-11-2008 |
20090307266 | Processing a page - A method includes generating, for a page comprising a plurality of elements, a contextual grouping of at least one of the plurality of elements based on an object model of the page. A method includes generating a user interface for a non-sighted user based on a page, the user interface comprising at least one contextual grouping generated based on an object model of the page. A method includes identifying a page to be analyzed, the page based on a document object model (DOM) and having a plurality of elements configured to be visually arranged when the page is generated for display. The method includes processing the identified page based on the DOM to include each of the plurality of elements in at least one contextual group associated with the page. The method includes facilitating navigation of the page by a non-sighted user using the at least one contextual group. | 12-10-2009 |
20100185982 | Rendering Icons Along A Multidimensional Path Having A Terminus Position - Icons are arranged in foreground and background positions in an interface environment to define a multidimensional path extending from a terminus. The icons transition between the foreground position and the background positions along the multidimensional path. Each icon corresponds to a content-specific menu item. | 07-22-2010 |
20100199215 | METHOD OF PRESENTING A WEB PAGE FOR ACCESSIBILITY BROWSING - A method of presenting a web page is described which incorporates navigation techniques and tools to allow impaired users to navigate throughout a web page in a convenient and geographically intuitive manner. Elements are sampled in a region located in a user-specified direction, and a UI tool is presented for a detected element. Elements that are “hit” during sampling may be tested for materiality, and the material element with precedence will become the detected element. | 08-05-2010 |
20100235792 | Content Abstraction Presentation Along A Multidimensional Path - Content abstractions emerge into an ingress terminus of a multidimensional path and are depth-transitioned through the multidimensional path to an egress terminus. The content abstractions are eliminated at the egress terminus. | 09-16-2010 |
20100309147 | Devices, Methods, and Graphical User Interfaces for Accessibility Using a Touch-Sensitive Surface - An accessibility method is performed by an electronic device with a display and a touch-sensitive surface. The method includes: displaying a plurality of user interface elements on the display; in response to detecting a first user interface navigation gesture by a finger on the touch-sensitive surface, navigating in the plurality of user interface elements in accordance with a current navigable unit type; in response to detecting a first user interface navigation setting gesture on the touch-sensitive surface: changing the current navigable unit type from the first navigable unit type to a second navigable unit type; and outputting accessibility information about the second navigable unit type; after changing the current navigable unit type, in response to detecting a second user interface navigation gesture by the finger on the touch-sensitive surface, navigating in the plurality of user interface elements in accordance with the current navigable unit type. | 12-09-2010 |
20100309148 | Devices, Methods, and Graphical User Interfaces for Accessibility Using a Touch-Sensitive Surface - An accessibility method is performed by an electronic device with a display and a touch-sensitive surface. The method includes: mapping at least a first portion of the display to the touch-sensitive surface; concurrently displaying a plurality of user interface containers on the display; detecting a user interface container selection event that selects a first user interface container in the plurality of user interface containers; and, in response to detecting the user interface container selection event: ceasing to map the first portion of the display to the touch-sensitive surface, and proportionally mapping the first user interface container to be substantially coextensive with the touch-sensitive surface. | 12-09-2010 |
20110111376 | Braille Mirroring - Techniques for performing Braille mirroring are disclosed. In one aspect, content is converted into Braille content, and then formatted for each of a plurality of Braille displays. The formatted content is then sent to each of the Braille displays. In another aspect, data identifying a primary Braille display in a plurality of Braille displays is stored. Input requesting that a data processing apparatus perform an action is received from one of the Braille displays, and it is determined whether to perform the action, based in part on whether the requesting Braille display is the primary Braille display. | 05-12-2011 |
20110154394 | User Interface With Menu Abstractions And Content Abstractions - Media menu items are generated within a media interface environment. Media menu item abstractions are generated, with one of the media menu item abstractions arranged in a foreground position and one or more of the media menu item abstractions arranged in background positions in the media interface environment. Selection of a media menu item transitions to a corresponding content menu interface. | 06-23-2011 |
20110179388 | Techniques And Systems For Enhancing Touch Screen Device Accessibility Through Virtual Containers And Virtually Enlarged Boundaries - Techniques for increasing accessibility of touch-screen devices are disclosed. In one aspect, container regions on a touch-sensitive user interface of a touch screen device are defined. A touch event corresponding to a location on the user interface is received, and it is determined that the location corresponds to a particular container region. When another touch event is received, content is determined according to a context of the particular container region. The content is then presented. In another aspect, data specifying locations of user interface items on a user interface is received. The data is modified to enlarge an area for a particular item. A touch input event corresponding to a particular location on the user interface is received. It is determined that the location is within the enlarged area for the item, and input is provided to an application indicating that the item was selected. | 07-21-2011 |
20110214056 | Accessory Protocol For Touch Screen Device Accessibility - Techniques for controlling a touch input device using an accessory communicatively coupled to the device are disclosed. In one aspect, an accessibility framework is launched on the device. An accessory coupled to the device is detected. Receipt of input from the accessory is enabled. An accessibility packet is received from the accessory. The accessibility packet includes an accessibility command and one or more parameters. The accessibility packet is processed to extract the first accessibility command and the one or more parameters. Input is generated for the accessibility framework based on the accessibility command and the one or more parameters. In some implementations, the device also sends accessibility commands to the accessory, either in response to accessibility commands received from the accessory or independent of any received accessibility commands. | 09-01-2011 |
20120116778 | Assisted Media Presentation - A system and method are disclosed that use screen-reader-like functionality to speak information presented on a graphical user interface displayed by a media presentation system, including information that is not navigable by a remote control device. Information can be spoken in an order that follows the relative importance of the information based on a characteristic of the information or the location of the information within the graphical user interface. A history of previously spoken information is monitored to avoid speaking information more than once for a given graphical user interface. A different pitch can be used to speak information based on a characteristic of the information. Information that is not navigable by the remote control device can be spoken after a time delay. Voice prompts can be provided for a remote-driven virtual keyboard displayed by the media presentation system. The voice prompts can be spoken with different voice pitches. | 05-10-2012 |
20120306632 | Custom Vibration Patterns - The present disclosure describes technology, which can be implemented as a method, apparatus, and/or computer software embodied in a computer-readable medium, and which can, among other things, be used to create custom vibration patterns in response to user input, for example, in response to the user tapping out a desired pattern on the display of a mobile device. For example, one or more aspects of the subject matter described in this disclosure can be embodied in one or more methods that include receiving tactile input from a user of an electronic device specifying a custom vibration pattern; in concert with receiving the tactile input, providing visual feedback to the user corresponding to the received tactile input; and storing the specified custom vibration pattern for use by the electronic device to actuate haptic feedback signaling a predetermined notification event. | 12-06-2012 |
20120306748 | Devices, Methods, and Graphical User Interfaces for Providing Control of a Touch-Based User Interface Absent Physical Touch Capabilities - An electronic device with a display and a touch-sensitive surface displays, on the display, a first visual indicator that corresponds to a virtual touch. The device receives a first input from an adaptive input device. In response to receiving the first input from the adaptive input device, the device displays a first menu on the display. The first menu includes a virtual touches selection icon. In response to detecting selection of the virtual touches selection icon, a menu of virtual multitouch contacts is displayed. | 12-06-2012 |
20130229377 | ACCESSORY PROTOCOL FOR TOUCH SCREEN DEVICE ACCESSIBILITY - Techniques for controlling a touch input device using an accessory communicatively coupled to the device are disclosed. In one aspect, an accessibility framework is launched on the device. An accessory coupled to the device is detected. Receipt of input from the accessory is enabled. An accessibility packet is received from the accessory. The accessibility packet includes an accessibility command and one or more parameters. The accessibility packet is processed to extract the first accessibility command and the one or more parameters. Input is generated for the accessibility framework based on the accessibility command and the one or more parameters. In some implementations, the device also sends accessibility commands to the accessory, either in response to accessibility commands received from the accessory or independent of any received accessibility commands. | 09-05-2013 |
20130311921 | Devices, Methods, and Graphical User Interfaces for Accessibility Using a Touch-Sensitive Surface - An accessibility method is performed by an electronic device with a display and a touch-sensitive surface. The method includes: displaying a plurality of user interface elements on the display; in response to detecting a first user interface navigation gesture by a finger on the touch-sensitive surface, navigating in the plurality of user interface elements in accordance with a current navigable unit type; in response to detecting a first user interface navigation setting gesture on the touch-sensitive surface: changing the current navigable unit type from the first navigable unit type to a second navigable unit type; and outputting accessibility information about the second navigable unit type; after changing the current navigable unit type, in response to detecting a second user interface navigation gesture by the finger on the touch-sensitive surface, navigating in the plurality of user interface elements in accordance with the current navigable unit type. | 11-21-2013 |
20130329924 | REMOTELY CONTROLLING A HEARING DEVICE - Systems, methods, and non-transitory computer-readable storage media are provided for remotely controlling a hearing device. A hearing device configured to communicate with a control device transmits status data, including settings, to the control device. The control device displays the status data in an interface configured to receive input specifying new settings, upon which a command is sent to the hearing device to change the current settings. The control device can automatically change the settings to a stored program optimized for the current environment. The current environment can be determined based on the location of the hearing device or of another device connected to the control device. A quick mode allows settings to be viewed and changed quickly by displaying multiple related settings as one and overriding interface buttons. A remote listen mode receives audio data from a microphone and transmits it to the hearing device. | 12-12-2013 |
20140108998 | MULTIMEDIA CONTROL CENTER - Techniques and systems for centralized access to multimedia content stored on or available to a computing device are disclosed. The centralized access can be provided by a media control interface that receives user inputs and interacts with media programs resident on the computing device to produce graphical user interfaces that can be presented on a display device. | 04-17-2014 |
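The abstract for application 20080229206 above positions each spoken description based on the element's location in the display space. A minimal sketch of one way such a mapping could work, assuming a simple linear pan from an element's horizontal position to a surround azimuth angle (the function name and the linear mapping are illustrative assumptions, not taken from the application):

```python
def element_azimuth(element_x: float, display_width: float) -> float:
    """Map an element's horizontal position in the display space to a
    surround-sound azimuth in degrees: -90 (far left) through 0 (center)
    to +90 (far right). The linear mapping is an illustrative assumption."""
    return (element_x / display_width - 0.5) * 180.0

# An element at the left edge of an 800-px-wide display pans fully left:
# element_azimuth(0, 800) → -90.0
```

A real implementation would feed this angle into a spatial-audio renderer while the element's description is spoken.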
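Application 20100309148 above describes proportionally mapping a selected user interface container so that it is substantially coextensive with the touch-sensitive surface. A sketch of that proportional mapping, under the assumption of simple rectangular geometry (all names and the coordinate convention are hypothetical):

```python
def map_touch_to_container(touch_xy, touch_size, container_origin, container_size):
    """Proportionally map a point on the touch-sensitive surface into a
    selected UI container, making the container effectively coextensive
    with the surface. Geometry and parameter names are illustrative
    assumptions, not taken from the application."""
    tx, ty = touch_xy
    tw, th = touch_size
    cx, cy = container_origin
    cw, ch = container_size
    # Normalize the touch point to [0, 1] per axis, then scale into the container.
    return (cx + (tx / tw) * cw, cy + (ty / th) * ch)

# The center of a 100x100 touch surface lands at the center of a
# 200x50 container whose origin is (10, 20):
# map_touch_to_container((50, 50), (100, 100), (10, 20), (200, 50)) → (110.0, 45.0)
```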
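Applications 20110214056 and 20130229377 above both describe receiving an accessibility packet from an accessory and extracting a command plus one or more parameters from it. A sketch of that extraction step, assuming a hypothetical wire format (a 1-byte command, a 1-byte parameter count, then big-endian 16-bit parameters) since the abstracts do not publish one:

```python
import struct

def parse_accessibility_packet(packet: bytes):
    """Extract (command, parameters) from an accessibility packet.
    The wire format assumed here -- 1-byte command, 1-byte parameter
    count, big-endian 16-bit parameters -- is an illustrative
    assumption, not taken from the applications."""
    command = packet[0]
    count = packet[1]
    params = [struct.unpack_from(">H", packet, 2 + 2 * i)[0]
              for i in range(count)]
    return command, params
```

After extraction, the device would generate input for the accessibility framework from the command and parameters, as the abstracts describe.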
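Application 20120306632 above records a custom vibration pattern from the user tapping on the display. One plausible way to turn the captured touch events into a stored pattern is to convert down/up timestamps into alternating vibrate/pause durations; the event tuples and the duration-list encoding below are assumptions, since the abstract does not specify a storage format:

```python
def build_vibration_pattern(events):
    """events: chronologically ordered (timestamp_ms, 'down' | 'up') pairs
    captured while the user taps out a pattern. Returns alternating
    [vibrate_ms, pause_ms, ...] durations -- a common haptic-pattern
    encoding, assumed here for illustration."""
    return [t1 - t0 for (t0, _), (t1, _) in zip(events, events[1:])]

# Taps held 0-120 ms and 300-380 ms yield [120, 180, 80]:
# vibrate 120 ms, pause 180 ms, vibrate 80 ms.
```

The stored list could later drive the device's haptic actuator when the associated notification event fires.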