Patent application number | Description | Published |
20110054880 | External Content Transformation - Techniques and systems for content transformation between devices are disclosed. In one aspect, a system includes a host device that sends content to client devices, and client devices that receive content from the host device in one format and transform the content into a different format. The client devices present the transformed content to users. In another aspect, the host device presents content in a native format, determines that a client device requires the content to be in a different format, converts the content to a reference format, and sends the converted content to the client device. | 03-03-2011 |
20110179388 | Techniques And Systems For Enhancing Touch Screen Device Accessibility Through Virtual Containers And Virtually Enlarged Boundaries - Techniques for increasing accessibility of touch-screen devices are disclosed. In one aspect, container regions on a touch-sensitive user interface of a touch screen device are defined. A touch event corresponding to a location on the user interface is received, and it is determined that the location corresponds to a particular container region. When another touch event is received, content is determined according to a context of the particular container region. The content is then presented. In another aspect, data specifying locations of user interface items on a user interface is received. The data is modified to enlarge an area for a particular item. A touch input event corresponding to a particular location on the user interface is received. It is determined that the location is within the enlarged area for the item, and input is provided to an application indicating that the item was selected. | 07-21-2011 |
20110214056 | Accessory Protocol For Touch Screen Device Accessibility - Techniques for controlling a touch input device using an accessory communicatively coupled to the device are disclosed. In one aspect, an accessibility framework is launched on the device. An accessory coupled to the device is detected. Receipt of input from the accessory is enabled. An accessibility packet is received from the accessory. The accessibility packet includes an accessibility command and one or more parameters. The accessibility packet is processed to extract the accessibility command and the one or more parameters. Input is generated for the accessibility framework based on the accessibility command and the one or more parameters. In some implementations, the device also sends accessibility commands to the accessory, either in response to accessibility commands received from the accessory or independent of any received accessibility commands. | 09-01-2011 |
20120046947 | Assisted Reader - An electronic reading device for reading ebooks and other digital media items combines a touch surface electronic reading device with accessibility technology to provide a visually impaired user more control over his or her reading experience. In some implementations, the reading device can be configured to operate in at least two modes: a continuous reading mode and an enhanced reading mode. | 02-23-2012 |
20120116778 | Assisted Media Presentation - A system and method is disclosed that uses screen reader like functionality to speak information presented on a graphical user interface displayed by a media presentation system, including information that is not navigable by a remote control device. Information can be spoken in an order that follows a relative importance of the information based on a characteristic of the information or the location of the information within the graphical user interface. A history of previously spoken information is monitored to avoid speaking information more than once for a given graphical user interface. A different pitch can be used to speak information based on a characteristic of the information. Information that is not navigable by the remote control device can be spoken after a time delay. Voice prompts can be provided for a remote-driven virtual keyboard displayed by the media presentation system. The voice prompts can be spoken with different voice pitches. | 05-10-2012 |
20130169549 | Devices, Methods, and Graphical User Interfaces for Providing Multitouch Inputs and Hardware-Based Features Using a Single Touch Input - An electronic device with a display and a touch-sensitive surface displays, on the display, a first visual indicator. The electronic device receives a first single touch input on the touch-sensitive surface at a location that corresponds to the first visual indicator; and, in response to detecting the first single touch input on the touch-sensitive surface at a location that corresponds to the first visual indicator, replaces display of the first visual indicator with display of a first menu. The first menu includes a virtual touches selection icon. In response to detecting selection of the virtual touches selection icon, the electronic device displays a menu of virtual multitouch contacts. | 07-04-2013 |
20130172022 | Device, Method, and Graphical User Interface for Configuring and Implementing Restricted Interactions with a User Interface - An electronic device, while in a restricted interaction mode in an application other than a call application, displays a first user interface that includes a plurality of user interface objects, and receives an incoming call. The electronic device determines whether the incoming call satisfies predefined signaling criteria. In accordance with a determination that the incoming call satisfies the predefined signaling criteria, the electronic device outputs a signal that indicates the incoming call. In accordance with a determination that the incoming call does not satisfy the predefined signaling criteria, the electronic device forgoes outputting the signal indicating the incoming call. | 07-04-2013 |
20130174100 | Device, Method, and Graphical User Interface for Configuring Restricted Interaction with a User Interface - An electronic device, while in an interaction configuration mode: displays a first user interface that includes a plurality of user interface objects; and, while displaying the first user interface, detects one or more gesture inputs on a touch-sensitive surface. For a respective gesture input, the device determines whether one or more user interface objects of the plurality of user interface objects correspond to the respective gesture input. The device visually distinguishes a first set of user interface objects in the plurality of user interface objects that correspond to the detected one or more gesture inputs from a second set of user interface objects in the plurality of user interface objects that do not correspond to the detected one or more gesture inputs. The device detects an input; and, in response to detecting the input, exits the interaction configuration mode and enters a restricted interaction mode. | 07-04-2013 |
20130229377 | ACCESSORY PROTOCOL FOR TOUCH SCREEN DEVICE ACCESSIBILITY - Techniques for controlling a touch input device using an accessory communicatively coupled to the device are disclosed. In one aspect, an accessibility framework is launched on the device. An accessory coupled to the device is detected. Receipt of input from the accessory is enabled. An accessibility packet is received from the accessory. The accessibility packet includes an accessibility command and one or more parameters. The accessibility packet is processed to extract the accessibility command and the one or more parameters. Input is generated for the accessibility framework based on the accessibility command and the one or more parameters. In some implementations, the device also sends accessibility commands to the accessory, either in response to accessibility commands received from the accessory or independent of any received accessibility commands. | 09-05-2013 |
20130263251 | Device, Method, and Graphical User Interface for Integrating Recognition of Handwriting Gestures with a Screen Reader - While an electronic device with a display and a touch-sensitive surface is in a screen reader accessibility mode, the device displays an application launcher screen including a plurality of application icons. A respective application icon corresponds to a respective application stored in the device. The device detects a sequence of one or more gestures on the touch-sensitive surface that correspond to one or more characters. A respective gesture that corresponds to a respective character is a single finger gesture that moves across the touch-sensitive surface along a respective path that corresponds to the respective character. The device determines whether the detected sequence of one or more gestures corresponds to a respective application icon of the plurality of application icons, and, in response to determining that the detected sequence of one or more gestures corresponds to the respective application icon, performs a predefined operation associated with the respective application icon. | 10-03-2013 |
20140165000 | Device, Method, and Graphical User Interface for Configuring and Implementing Restricted Interactions for Applications - An electronic device, while in an interaction configuration mode for a first application, concurrently displays: a first user interface, one or more interaction control user interface objects, and an application restriction controls display user interface object for the first application. The device detects a first gesture, and in response, displays application restriction control user interface objects for the first application. A respective application restriction control user interface object indicates whether a corresponding feature of the first application is configured to be enabled in a restricted interaction mode. The device detects a second gesture, and changes display of a setting in the first application restriction control user interface object for the first application. The device detects a second input, and in response, enters the restricted interaction mode for the first application. The corresponding feature is restricted in accordance with the setting in the first application restriction control user interface object. | 06-12-2014 |
20140281950 | Device, Method, and Graphical User Interface for Generating Haptic Feedback for User Interface Elements - An electronic device in communication with a haptic feedback device that includes a touch-sensitive surface sends instructions to the haptic display to display a document with multiple characters. A respective character is displayed at a respective character size. While the haptic display is displaying the document, the device receives an input that corresponds to a finger contact at a first location on the haptic display. In response to receiving the input, the device associates a first cursor position with the first location, determines a first character in the plurality of characters adjacent to the first cursor position, and sends instructions to the haptic display to output a Braille character, at the first location, that corresponds to the first character. A respective Braille character is output on the haptic display at a respective Braille character size that is larger than the corresponding displayed character size. | 09-18-2014 |
20150040213 | DEVICE, METHOD, AND GRAPHICAL USER INTERFACE FOR INTEGRATING RECOGNITION OF HANDWRITING GESTURES WITH A SCREEN READER - While an electronic device with a display and a touch-sensitive surface is in a screen reader accessibility mode, the device displays an application launcher screen including a plurality of application icons. A respective application icon corresponds to a respective application stored in the device. The device detects a sequence of one or more gestures on the touch-sensitive surface that correspond to one or more characters. A respective gesture that corresponds to a respective character is a single finger gesture that moves across the touch-sensitive surface along a respective path that corresponds to the respective character. The device determines whether the detected sequence of one or more gestures corresponds to a respective application icon of the plurality of application icons, and, in response to determining that the detected sequence of one or more gestures corresponds to the respective application icon, performs a predefined operation associated with the respective application icon. | 02-05-2015 |
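The accessory protocol described in 20110214056 and 20130229377 carries an accessibility command plus one or more parameters in an accessibility packet. A minimal sketch of how such a packet might be built and parsed follows; the wire format used here (one command byte, a parameter count, then length-prefixed UTF-8 strings) and the command codes are hypothetical illustrations, not the actual protocol:

```python
import struct

# Hypothetical command codes; the real protocol's values are not public.
CMD_MOVE_FOCUS = 0x01
CMD_ACTIVATE = 0x02

def build_packet(command, params):
    """Pack a command byte, a parameter count, and length-prefixed
    UTF-8 parameter strings into a bytes packet."""
    body = bytes([command, len(params)])
    for p in params:
        data = p.encode("utf-8")
        body += struct.pack(">H", len(data)) + data
    return body

def parse_packet(packet):
    """Extract (command, [params]) from a packet built by build_packet."""
    command = packet[0]
    count = packet[1]
    params, offset = [], 2
    for _ in range(count):
        (length,) = struct.unpack_from(">H", packet, offset)
        offset += 2
        params.append(packet[offset:offset + length].decode("utf-8"))
        offset += length
    return command, params
```

Once parsed, the command and parameters would be translated into input for the accessibility framework, as the abstracts describe.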
Patent application number | Description | Published |
20120306632 | Custom Vibration Patterns - The present disclosure describes technology, which can be implemented as a method, apparatus, and/or computer software embodied in a computer-readable medium, and which, among other things, can be used to create custom vibration patterns in response to user input, for example, in response to the user tapping out a desired pattern on the display of a mobile device. For example, one or more aspects of the subject matter described in this disclosure can be embodied in one or more methods that include receiving tactile input from a user of an electronic device specifying a custom vibration pattern, in concert with receiving tactile input, providing visual feedback to the user corresponding to the received tactile input, and storing the specified custom vibration pattern for use by the electronic device to actuate haptic feedback signaling a predetermined notification event. | 12-06-2012 |
20130332070 | TOUCH-BASED EXPLORATION OF MAPS FOR SCREEN READER USERS - An electronic device can provide an interactive map with non-visual output, thereby making the map accessible to visually impaired users. The map can be based on a starting location defined based on a current location of the electronic device or on a location entered by the user. Nearby paths, nearby points of interest, or directions from the starting location to an ending location can be identified via audio output. Users can touch a screen of the electronic device in order to virtually explore a neighborhood. A user can be alerted when he is moving along or straying from a path, approaching an intersection or point of interest, or changing terrains. Thus, the user can familiarize himself with city-level spatial relationships without needing to physically explore unfamiliar surroundings. | 12-12-2013 |
20140210828 | ACCESSIBILITY TECHNIQUES FOR PRESENTATION OF SYMBOLIC EXPRESSIONS - Methods for presenting symbolic expressions such as mathematical, scientific, or chemical expressions, formulas, or equations are performed by a computing device. One method includes: displaying a first portion of a symbolic expression within a first area of a display screen; while in a first state in which the first area is selected for aural presentation, aurally presenting first information related to the first portion of the symbolic expression; while in the first state, detecting particular user input; in response to detecting the particular user input, performing the steps of: transitioning from the first state to a second state in which a second area, of the display, is selected for aural presentation; determining second information associated with a second portion, of the symbolic expression, that is displayed within the second area; in response to determining the second information, aurally presenting the second information. | 07-31-2014 |
20140281997 | DEVICE, METHOD, AND GRAPHICAL USER INTERFACE FOR OUTPUTTING CAPTIONS - An electronic device outputs a first caption of a plurality of captions while a first segment of a video is being played, where the first video segment corresponds to the first caption. While outputting the first caption, the device receives a first user input. In response to receiving the first user input, the device determines a second caption in the plurality of captions, distinct from the first caption, that meets predefined caption selection criteria; determines a second segment of the video that corresponds to the second caption; sends instructions to change from playing the first segment of the video to playing the second segment of the video; and outputs the second caption. | 09-18-2014 |
20140282007 | VOICE CONTROL TO DIAGNOSE INADVERTENT ACTIVATION OF ACCESSIBILITY FEATURES - Methods and systems are provided for diagnosing inadvertent activation of user interface settings on an electronic device. The electronic device receives a user input indicating that the user is having difficulty operating the electronic device. The device then determines whether a setting was changed on the device within a predetermined time period prior to receiving the user input. When a first setting was changed within the predetermined time period prior to receiving the user input, the device restores the changed setting to a prior setting. | 09-18-2014 |
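Application 20120306632 turns a sequence of user taps into a stored vibration pattern. One plausible reduction of that idea (the function name and the tap representation are invented for illustration) records each tap's press and release times and emits alternating vibrate/pause durations, the array form many haptics APIs consume:

```python
def taps_to_pattern(taps):
    """Convert [(press_ms, release_ms), ...] tap intervals into an
    alternating [vibrate_ms, pause_ms, vibrate_ms, ...] pattern."""
    pattern = []
    for i, (press, release) in enumerate(taps):
        pattern.append(release - press)        # vibrate for the tap's duration
        if i + 1 < len(taps):                  # pause until the next tap begins
            pattern.append(taps[i + 1][0] - release)
    return pattern
```

The stored pattern could then be replayed to signal the predetermined notification event the abstract mentions.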
Patent application number | Description | Published |
20110310041 | Testing a Touch-Input Program - Methods and systems are disclosed that allow automated testing of an application program that is configured to receive a touch input. A testing mechanism can be configured to identify the touch input that is designed to produce a specified result. The testing mechanism can generate one or more signals simulating the touch input. The testing mechanism can then check the state of the user interface of the application program being tested and determine whether the actual result conforms to the specified result. | 12-22-2011 |
20120327009 | DEVICES, METHODS, AND GRAPHICAL USER INTERFACES FOR ACCESSIBILITY USING A TOUCH-SENSITIVE SURFACE - Disclosed herein are systems, methods, and non-transitory computer-readable storage media for operating a computing device having at least two user interface (UI) navigation modes capable of being concurrently activated in said device, and both UI navigation modes being responsive to a predefined set of touch gestures on a touch-sensitive display of the computing device. | 12-27-2012 |
20130238339 | HANDLING SPEECH SYNTHESIS OF CONTENT FOR MULTIPLE LANGUAGES - Techniques that enable a user to select, from among multiple languages, a language to be used for performing text-to-speech conversion. In some embodiments, upon determining that multiple languages may be used to perform text-to-speech conversion for a portion of text, the multiple languages may be displayed to the user. The user may then select a particular language to be used from the multiple languages. The portion of text may then be converted to speech in the user-selected language. | 09-12-2013 |
20130329924 | REMOTELY CONTROLLING A HEARING DEVICE - Systems, methods, and non-transitory computer-readable storage media are provided for remotely controlling a hearing device. A hearing device configured to communicate with a control device transmits status data, including settings, to the control device. The control device displays the status data in an interface configured to receive input specifying new settings, upon which a command is sent to the hearing device to change the current setting. The control device can automatically change the settings to a stored program optimized for the determined current environment. The current environment can be determined based on the location of the hearing device or of another device connected to the control device. Quick mode allows settings to be viewed and changed quickly by displaying multiple related settings as one and overriding interface buttons. Remote listen mode receives audio data from a microphone and transmits it to the hearing device. | 12-12-2013 |
20140380249 | VISUAL RECOGNITION OF GESTURES - Techniques that enable a user to interact with an electronic device using spatial gestures without touching the electronic device. An electronic device provides a contactless mode of operation during which a user can interact with the electronic device using touchless gestures. A touchless gesture may be used to indicate an action to be performed and also to set an action-related parameter value that is then used when the action is performed. | 12-25-2014 |
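The testing mechanism of 20110310041 generates signals simulating touch input, then checks whether the application's actual user-interface state conforms to the specified result. A toy harness showing that loop (the `CounterApp` application and its event model are invented purely for illustration):

```python
class CounterApp:
    """A trivial 'application under test' whose UI state changes on touch."""
    def __init__(self):
        self.count = 0

    def on_touch(self, x, y):
        # Tapping inside a hypothetical 50x50 button increments the counter.
        if 0 <= x < 50 and 0 <= y < 50:
            self.count += 1

def run_touch_test(app, touches, expected_state):
    """Inject simulated touch events, then compare the application's
    actual state against the specified (expected) result."""
    for x, y in touches:
        app.on_touch(x, y)
    return app.count == expected_state
```

A real harness would synthesize platform touch events rather than call the handler directly, but the identify-inject-verify structure is the same.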
Patent application number | Description | Published |
20100309147 | Devices, Methods, and Graphical User Interfaces for Accessibility Using a Touch-Sensitive Surface - An accessibility method is performed by an electronic device with a display and a touch-sensitive surface. The method includes: displaying a plurality of user interface elements on the display; in response to detecting a first user interface navigation gesture by a finger on the touch-sensitive surface, navigating in the plurality of user interface elements in accordance with a current navigable unit type; in response to detecting a first user interface navigation setting gesture on the touch-sensitive surface: changing the current navigable unit type from the first navigable unit type to a second navigable unit type; and outputting accessibility information about the second navigable unit type; after changing the current navigable unit type, in response to detecting a second user interface navigation gesture by the finger on the touch-sensitive surface, navigating in the plurality of user interface elements in accordance with the current navigable unit type. | 12-09-2010 |
20100309148 | Devices, Methods, and Graphical User Interfaces for Accessibility Using a Touch-Sensitive Surface - An accessibility method is performed by an electronic device with a display and a touch-sensitive surface. The method includes: mapping at least a first portion of the display to the touch-sensitive surface; concurrently displaying a plurality of user interface containers on the display; detecting a user interface container selection event that selects a first user interface container in the plurality of user interface containers; and, in response to detecting the user interface container selection event: ceasing to map the first portion of the display to the touch-sensitive surface, and proportionally mapping the first user interface container to be substantially coextensive with the touch-sensitive surface. | 12-09-2010 |
20110298723 | Devices, Methods, and Graphical User Interfaces for Accessibility via a Touch-Sensitive Surface - An electronic device with a display and a touch-sensitive surface displays a plurality of user-selectable objects. A respective user-selectable object has a corresponding activation region on the touch-sensitive surface with an activation region size. The activation region size has a respective default size when a representative point for a finger contact is located outside the activation region. The activation region size has a respective expanded size when the representative point is located within the activation region. The device: detects movement of the finger contact across the touch-sensitive surface; in response, changes the size of the activation region for the respective user-selectable object between the respective default size and the respective expanded size in accordance with the movement of the finger contact; detects a user input when the representative point is located within the activation region for the respective user-selectable object; and, in response, performs a predefined operation. | 12-08-2011 |
20110302519 | Devices, Methods, and Graphical User Interfaces for Accessibility via a Touch-Sensitive Surface - An accessible electronic device with a display and a touch-sensitive surface: displays a first plurality of user-selectable objects; detects a finger contact on the touch-sensitive surface; detects movement of the finger contact across the touch sensitive surface to an activation region that corresponds to a first user-selectable object; while detecting the finger contact at the activation region, initiates output of audible accessibility information associated with the first user-selectable object; detects termination of the finger contact while the finger contact is at the activation region that corresponds to the first user-selectable object; and, in response: performs a predefined operation associated with the first user-selectable object if the device has output at least a predefined portion of the audible accessibility information associated with the first user-selectable object when the termination of the finger contact is detected; and forgoes performing the predefined operation otherwise. | 12-08-2011 |
20110304560 | Control Selection Approximation - A method includes displaying a user interface of an application on a device's touch-sensitive display. The user interface includes a plurality of regions, including a respective region at a respective hierarchy level. The respective region has two or more child regions at a hierarchy level below the respective hierarchy level. The method includes detecting a first contact at a location that corresponds to the respective region and that does not correspond to any of the two or more child regions. When the application is configured to process the first contact, not in conjunction with the respective region, but in conjunction with at least one child region of the two or more child regions, the method includes identifying a respective child region in accordance with positions of the child regions relative to the location, and processing the first contact in conjunction with the identified respective child region using the application. | 12-15-2011 |
20110307833 | Control Selection Approximation - A method includes displaying a user interface of an application on a device's touch-sensitive display. The user interface includes a plurality of regions, including a respective region at a respective hierarchy level. The respective region has two or more child regions at a hierarchy level below the respective hierarchy level. The method includes detecting a first contact at a location that corresponds to the respective region and that does not correspond to any of the two or more child regions. When the application is configured to process the first contact, not in conjunction with the respective region, but in conjunction with at least one child region of the two or more child regions, the method includes identifying a respective child region in accordance with positions of the child regions relative to the location, and processing the first contact in conjunction with the identified respective child region using the application. | 12-15-2011 |
20120306748 | Devices, Methods, and Graphical User Interfaces for Providing Control of a Touch-Based User Interface Absent Physical Touch Capabilities - An electronic device with a display and a touch-sensitive surface displays, on the display, a first visual indicator that corresponds to a virtual touch. The device receives a first input from an adaptive input device. In response to receiving the first input from the adaptive input device, the device displays a first menu on the display. The first menu includes a virtual touches selection icon. In response to detecting selection of the virtual touches selection icon, a menu of virtual multitouch contacts is displayed. | 12-06-2012 |
20120311508 | Devices, Methods, and Graphical User Interfaces for Providing Accessibility Using a Touch-Sensitive Surface - An electronic device presents a first user interface element of a first type and a second user interface element of a second type. In a sighted mode, the device detects a first interaction with the first user interface element, and performs an operation in accordance with sighted-mode gesture responses for the first user interface element. The device detects a second interaction with the second user interface element, and performs an operation in accordance with sighted-mode gesture responses for the second user interface element. In an accessible mode, the device detects a third interaction with the first user interface element, and performs an operation in accordance with accessible-mode gesture responses for the first user interface element. The device detects a series of interactions with the second user interface element; and, for each interaction, performs an operation in accordance with the sighted-mode gesture responses for the second user interface element. | 12-06-2012 |
20130212522 | Device, Method, and Graphical User Interface for Adjusting Partially Off-Screen Windows - An electronic device with a display bounded by a plurality of edges: displays a first portion of a first window on the display while not displaying a remaining portion of the first window on the display, wherein: the remaining portion of the first window extends in a virtual sense beyond at least one edge of the display; detects a first input that positions a cursor at a location on the display, the location being: over the displayed first portion of the first window, and within a predefined distance of an edge of the display; in response to detecting the first input that positions the cursor at the location on the display, activates a window adjustment mode; while the window adjustment mode is active, detects a second input; and, in response to detecting the second input, adjusts the first window in accordance with the second input. | 08-15-2013 |
20130311921 | Devices, Methods, and Graphical User Interfaces for Accessibility Using a Touch-Sensitive Surface - An accessibility method is performed by an electronic device with a display and a touch-sensitive surface. The method includes: displaying a plurality of user interface elements on the display; in response to detecting a first user interface navigation gesture by a finger on the touch-sensitive surface, navigating in the plurality of user interface elements in accordance with a current navigable unit type; in response to detecting a first user interface navigation setting gesture on the touch-sensitive surface: changing the current navigable unit type from the first navigable unit type to a second navigable unit type; and outputting accessibility information about the second navigable unit type; after changing the current navigable unit type, in response to detecting a second user interface navigation gesture by the finger on the touch-sensitive surface, navigating in the plurality of user interface elements in accordance with the current navigable unit type. | 11-21-2013 |
20140006030 | Device, Method, and User Interface for Voice-Activated Navigation and Browsing of a Document | 01-02-2014 |
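Several of these applications (e.g. 20100309148) proportionally map a selected user interface container so that it becomes substantially coextensive with the touch-sensitive surface. The underlying coordinate transform is a simple rescale; a sketch with invented rectangle tuples of the form (x, y, width, height):

```python
def map_touch_to_container(touch, surface, container):
    """Proportionally map an (x, y) touch on the touch-sensitive surface
    into container coordinates, so the whole surface spans the container."""
    tx, ty = touch
    sx, sy, sw, sh = surface
    cx, cy, cw, ch = container
    # Normalize the touch to [0, 1] within the surface, then scale
    # into the container's coordinate space.
    u = (tx - sx) / sw
    v = (ty - sy) / sh
    return (cx + u * cw, cy + v * ch)
```

With this mapping active, a finger sweeping the entire surface explores only the selected container, which is the accessibility benefit the abstract describes.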