Patent application number | Description | Published |
20120113023 | Device, Method, and Graphical User Interface for Manipulating Soft Keyboards - A method includes, at an electronic device with a display and a touch-sensitive surface: concurrently displaying a first text entry area and an unsplit keyboard on the display; detecting a gesture on the touch-sensitive surface; and, in response to detecting the gesture on the touch-sensitive surface, replacing the unsplit keyboard with an integrated input area. The integrated input area includes a left portion with a left side of a split keyboard, a right portion with a right side of the split keyboard, and a center portion in between the left portion and the right portion. | 05-10-2012 |
20120113024 | Device, Method, and Graphical User Interface for Manipulating Soft Keyboards - A method includes, at an electronic device with a display and a touch-sensitive surface: concurrently displaying a first text entry area and an unsplit keyboard on the display; detecting a gesture on the touch-sensitive surface; and, in response to detecting the gesture on the touch-sensitive surface, replacing the unsplit keyboard with an integrated input area. The integrated input area includes a left portion with a left side of a split keyboard, a right portion with a right side of the split keyboard, and a center portion in between the left portion and the right portion. | 05-10-2012 |
20120113025 | Device, Method, and Graphical User Interface for Manipulating Soft Keyboards - A method includes, at an electronic device with a display and a touch-sensitive surface: concurrently displaying a first text entry area and an unsplit keyboard on the display; detecting a gesture on the touch-sensitive surface; and, in response to detecting the gesture on the touch-sensitive surface, replacing the unsplit keyboard with an integrated input area. The integrated input area includes a left portion with a left side of a split keyboard, a right portion with a right side of the split keyboard, and a center portion in between the left portion and the right portion. | 05-10-2012 |
20120113126 | Device, Method, and Graphical User Interface for Manipulating Soft Keyboards - A method includes, at an electronic device with a display and a touch-sensitive surface: concurrently displaying a first text entry area and an unsplit keyboard on the display; detecting a gesture on the touch-sensitive surface; and, in response to detecting the gesture on the touch-sensitive surface, replacing the unsplit keyboard with an integrated input area. The integrated input area includes a left portion with a left side of a split keyboard, a right portion with a right side of the split keyboard, and a center portion in between the left portion and the right portion. | 05-10-2012 |
20120117501 | Device, Method, and Graphical User Interface for Manipulating Soft Keyboards - A method includes, at an electronic device with a display and a touch-sensitive surface: concurrently displaying a first text entry area and an unsplit keyboard on the display; detecting a gesture on the touch-sensitive surface; and, in response to detecting the gesture on the touch-sensitive surface, replacing the unsplit keyboard with an integrated input area. The integrated input area includes a left portion with a left side of a split keyboard, a right portion with a right side of the split keyboard, and a center portion in between the left portion and the right portion. | 05-10-2012 |
20120117505 | Device, Method, and Graphical User Interface for Manipulating Soft Keyboards - A method includes, at an electronic device with a display and a touch-sensitive surface: concurrently displaying a first text entry area and an unsplit keyboard on the display; detecting a gesture on the touch-sensitive surface; and, in response to detecting the gesture on the touch-sensitive surface, replacing the unsplit keyboard with an integrated input area. The integrated input area includes a left portion with a left side of a split keyboard, a right portion with a right side of the split keyboard, and a center portion in between the left portion and the right portion. | 05-10-2012 |
20120117506 | Device, Method, and Graphical User Interface for Manipulating Soft Keyboards - A method includes, at an electronic device with a display and a touch-sensitive surface: concurrently displaying a first text entry area and an unsplit keyboard on the display; detecting a gesture on the touch-sensitive surface; and, in response to detecting the gesture on the touch-sensitive surface, replacing the unsplit keyboard with an integrated input area. The integrated input area includes a left portion with a left side of a split keyboard, a right portion with a right side of the split keyboard, and a center portion in between the left portion and the right portion. | 05-10-2012 |
20120306747 | Device, Method, and Graphical User Interface for Entering Alternate Characters with a Physical Keyboard - A device displays a text entry area with an insertion point and detects activation of a first physical key in a physical keyboard. In response to a determination that the activation of the first physical key lasts more than a first predefined time period, the device displays a character selection area; while displaying the character selection area, the device detects activation of a second physical key in the physical keyboard; in response to a determination that the activated second physical key is an arrow key, the device moves a current character focus in accordance with a direction of the arrow key; and, in response to a determination that the activated second physical key is the first physical key, the device enters in the text entry area a single instance of a character that has the current character focus, and ceases to display the character selection area. | 12-06-2012 |
20120307096 | Metadata-Assisted Image Filters - This disclosure pertains to devices, methods, systems, and computer readable media for generating and/or interpreting image metadata to determine input parameters for various image processing routines, e.g., filters that distort or enhance an image, in a way that provides an intuitive experience for both the user and the software developer. Such techniques may attach the metadata to image frames and then send the image frames down an image processing pipeline to one or more image processing routines. Image metadata may include face location information, and the image processing routine may include an image filter that processes the image metadata in order to keep the central focus (or foci) of the image filter substantially coincident with one or more of the faces represented in the face location information. The generated and/or interpreted metadata may also be saved to a metadata track for later application to unfiltered image data. | 12-06-2012 |
20130263055 | Device, Method, and Graphical User Interface for Manipulating User Interface Objects - A computing device with a touch screen display simultaneously displays on the touch screen display a plurality of user interface objects and at least one destination object. The computing device detects a first input by a user on a destination object displayed on the touch screen display. While continuing to detect the first input by the user on the destination object, the computing device detects a second input by the user on a first user interface object displayed on the touch screen display. In response to detecting the second input by the user on the first user interface object, the computing device performs an action on the first user interface object. The action is associated with the destination object. | 10-03-2013 |
20130265267 | Device, Method, and Graphical User Interface for Manipulating User Interface Objects - A computing device with a touch screen display simultaneously displays on the touch screen display a plurality of user interface objects and at least one destination object. The computing device detects a first input by a user on a destination object displayed on the touch screen display. While continuing to detect the first input by the user on the destination object, the computing device detects a second input by the user on a first user interface object displayed on the touch screen display. In response to detecting the second input by the user on the first user interface object, the computing device performs an action on the first user interface object. The action is associated with the destination object. | 10-10-2013 |
20140062873 | INSERTION MARKER PLACEMENT ON TOUCH SENSITIVE DISPLAY - In accordance with some embodiments, a computer-implemented method is performed at a portable electronic device with a touch screen display. The method can include: displaying graphics on the touch screen display, detecting a finger contact on the touch screen display, and, in response to the detected finger contact, inserting an insertion marker in the graphics at a first location. The method can further include detecting a finger movement on the touch screen display and, irrespective of initial distance from finger to insertion marker on the touch screen display, moving the insertion marker in accordance with the detected finger movement from the first location to a second location in the graphics. | 03-06-2014 |
20140351707 | DEVICE, METHOD, AND GRAPHICAL USER INTERFACE FOR MANIPULATING WORKSPACE VIEWS - In some embodiments, a multifunction device with a display and a touch-sensitive surface creates a plurality of workspace views. A respective workspace view is configured to contain content assigned by a user to the respective workspace view. The content includes application windows. The device displays a first workspace view in the plurality of workspace views on the display without displaying other workspace views in the plurality of workspace views and detects a first multifinger gesture on the touch-sensitive surface. In response to detecting the first multifinger gesture on the touch-sensitive surface, the device replaces display of the first workspace view with concurrent display of the plurality of workspace views. | 11-27-2014 |
20150062052 | Device, Method, and Graphical User Interface for Transitioning Between Display States in Response to a Gesture - An electronic device displays a user interface in a first display state. The device detects a first portion of a gesture on a touch-sensitive surface, including detecting intensity of a respective contact of the gesture. In response to detecting the first portion of the gesture, the device displays an intermediate display state between the first display state and a second display state. In response to detecting the end of the gesture: if intensity of the respective contact had reached a predefined intensity threshold prior to the end of the gesture, the device displays the second display state; otherwise, the device redisplays the first display state. After displaying an animated transition between the first display state and the second display state, the device, optionally, detects an increase of the contact intensity. In response, the device displays a continuation of the animation in accordance with the increasing intensity of the respective contact. | 03-05-2015 |
20150067560 | Device, Method, and Graphical User Interface for Manipulating Framed Graphical Objects - An electronic device with a touch-sensitive surface, a display, and one or more sensors to detect intensity of contacts with the touch-sensitive surface displays a graphical object inside of a frame on the display, and detects a gesture. Detecting the gesture includes: detecting a contact on the touch-sensitive surface while a focus selector is over the graphical object, and detecting movement of the contact across the touch-sensitive surface. In response to detecting the gesture: in accordance with a determination that the contact meets predefined intensity criteria, the device removes the graphical object from the frame; and in accordance with a determination that the contact does not meet the predefined intensity criteria, the device adjusts an appearance of the graphical object inside of the frame. | 03-05-2015 |
20150067596 | Device, Method, and Graphical User Interface for Displaying Additional Information in Response to a User Contact - An electronic device, with a touch-sensitive surface and a display, includes one or more sensors to detect intensity of contacts with the touch-sensitive surface. The device detects a contact on the touch-sensitive surface while a focus selector corresponding to the contact is at a respective location on the display associated with additional information not initially displayed on the display. While the focus selector is at the respective location, upon determining that the contact has an intensity above a respective intensity threshold before a predefined delay time has elapsed with the focus selector at the respective location, the device displays the additional information associated with the respective location without waiting until the predefined delay time has elapsed; and upon determining that the contact has an intensity below the respective intensity threshold, the device waits until the predefined delay time has elapsed to display the additional information associated with the respective location. | 03-05-2015 |
20150067602 | Device, Method, and Graphical User Interface for Selecting User Interface Objects - An electronic device with a display, touch-sensitive surface and one or more sensors to detect intensity of contacts with the touch-sensitive surface displays a first user interface object and detects first movement of the contact that corresponds to movement of a focus selector toward the first user interface object. In response to detecting the first movement, the device moves the focus selector to the first user interface object; and determines an intensity of the contact. After detecting the first movement, the device detects second movement of the contact. In response to detecting the second movement of the contact, when the contact meets selection criteria based on an intensity of the contact, the device moves the focus selector and the first user interface object; and when the contact does not meet the selection criteria, the device moves the focus selector without moving the first user interface object. | 03-05-2015 |
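Several of the later entries (20150062052, 20150067560, 20150067596, 20150067602) share a common decision rule: an action fires immediately when contact intensity crosses a predefined threshold, and only after a predefined delay otherwise. The sketch below is a minimal, hypothetical illustration of that rule as described in the abstract of 20150067596; the threshold and delay values, function name, and normalized-intensity units are assumptions, not taken from the applications themselves.

```python
# Hypothetical sketch of the delay-vs-intensity rule from application
# 20150067596: additional information is shown without waiting if the
# contact intensity exceeds a respective threshold; otherwise it is
# shown only once a predefined delay time has elapsed.

INTENSITY_THRESHOLD = 0.5  # assumed value, normalized 0.0-1.0 units
DELAY_SECONDS = 0.35       # assumed "predefined delay time"

def should_show_additional_info(intensity: float, elapsed_seconds: float) -> bool:
    """Return True once the additional information should be displayed."""
    if intensity > INTENSITY_THRESHOLD:
        # Deep press: display immediately, without waiting out the delay.
        return True
    # Light press: wait until the predefined delay has elapsed.
    return elapsed_seconds >= DELAY_SECONDS
```

A deep press (e.g. intensity 0.8) returns True at elapsed time 0.0, while a light press (intensity 0.2) returns False until 0.35 seconds have passed.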