Gesture-based

Subclass of:

715 - Data processing: presentation processing of document, operator interface processing, and screen saver display processing

715700000 - OPERATOR INTERFACE (E.G., GRAPHICAL USER INTERFACE)

Patent class list (only non-empty classes are listed)

Entries
Document - Title - Date
20130031514 - Gestures for Presentation of Different Views of a System Diagram - Presenting different views of a system based on input from a user. A first view of a first portion of the system may be displayed. For example, the first portion may be a device of the system. User input specifying a first gesture may be received. In response to the first gesture, a second view of the first portion of the system may be displayed. For example, the first view may represent a first level of abstraction of the portion of the system and the second view may represent a second level of abstraction of the portion of the system. A second gesture may be used to display a view of a different portion of the system. Additionally, when changing from a first view to a second view, the first view may “morph” into the second view. - 01-31-2013
20110185321 - Device, Method, and Graphical User Interface for Precise Positioning of Objects - A method includes, at a computing device with a touch-sensitive display: displaying a user interface object on the touch-sensitive display; detecting a contact on the user interface object; while continuing to detect the contact on the user interface object: detecting an M-finger gesture, distinct from the contact, in a first direction on the touch-sensitive display, where M is an integer; and, in response to detecting the M-finger gesture, translating the user interface object a predefined number of pixels in a direction in accordance with the first direction. - 07-28-2011
20110185317 - Device, Method, and Graphical User Interface for Resizing User Interface Content - Aspect ratio locking alignment guides for gestures are disclosed. In one embodiment, a gesture is detected to resize a user interface element, and in response, a first alignment guide is visibly displayed, wherein the first alignment guide includes positions representing different sizes the user interface element can be resized to while maintaining the initial aspect ratio of the user interface element. While the user interface element is resized in accordance with the user gesture, and while the first alignment guide is visibly displayed: when the user gesture is substantially aligned with the first alignment guide, visible display of the first alignment guide is maintained; and when the user gesture substantially deviates from the first alignment guide, visible display of the first alignment guide is terminated. - 07-28-2011
20130031517 - HAND POSE INTERACTION - Provided is a method of hand pose interaction. The method recognizes a user input related to selection of an object displayed on a computing device and displays a graphical user interface (GUI) corresponding to the object. The graphical user interface comprises at least one representation of a hand pose, wherein each representation of a hand pose corresponds to a unique function associated with the object. Upon recognition of a user hand pose corresponding to a hand pose representation in the graphical user interface, the function associated with the hand pose representation is executed. - 01-31-2013
20130031516 - IMAGE PROCESSING APPARATUS HAVING TOUCH PANEL - An image processing apparatus includes an operation panel as an example of a touch panel and a display device, as well as a CPU as an example of a processing unit for performing processing based on a contact. The CPU includes a first identifying unit for identifying a file to be processed, a second identifying unit for identifying an operation to be executed, a determination unit for determining whether or not the combination of the file and operation as identified is appropriate, and a display unit for displaying a determination result. In the case where one of the identifying units has previously detected a corresponding gesture to identify the file or the operation, and a gesture corresponding to the other identifying unit is detected next, the determination result is displayed on the display device before identification of the file or the operation is completed by that gesture. - 01-31-2013
20130031515 - Method And Apparatus For Area-Efficient Graphical User Interface - A GUI screen image is a standard screen image, and displays a first combined GUI area, which is a combination of a GUI of the directional keys and a GUI of a joystick, and a second combined GUI area, which is a combination of a GUI of the four-type operation buttons and a GUI of a joystick, at the lower left and at the lower right of the screen image, respectively. Depending on which area in the first or second combined GUI area the user newly touches, the combined GUI to be used is determined and the screen image is switched; if a finger or a thumb detaches, the screen image switches back. - 01-31-2013
20110209100 - MULTI-SCREEN PINCH AND EXPAND GESTURES - Embodiments of multi-screen pinch and expand gestures are described. In various embodiments, a first input is recognized at a first screen of a multi-screen system, and the first input includes a first motion input. A second input is recognized at a second screen of the multi-screen system, and the second input includes a second motion input. A pinch gesture or an expand gesture can then be determined from the first and second motion inputs that are associated with the recognized first and second inputs. - 08-25-2011
20120174044 - INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND COMPUTER PROGRAM - Provided is an information processing apparatus including a display unit, provided on an apparatus front-surface side, for displaying information; a first detection unit, provided on an apparatus back-surface side, for detecting an operation input to a back surface; a second detection unit, provided on the apparatus front-surface side, for detecting an operation input to the display unit; and an operation input information determination unit for causing a function corresponding to the operation inputs to be executed, based on detection results of the first detection unit and the second detection unit. When an operation input is detected by the first detection unit and an operation input for operating an object displayed on the display unit is detected by the second detection unit, the operation input information determination unit executes the function corresponding to the operation inputs detected by the first detection unit and the second detection unit. - 07-05-2012
20120174042 - METHOD FOR UNLOCKING SCREEN AND EXECUTING APPLICATION PROGRAM - A method for unlocking a screen and executing an application program is provided. The method is adapted to a mobile device having a touch screen. Under a screen lock mode of the mobile device, the touch screen is used to detect a touch-and-drag operation of a user. Then, it is determined whether a start point of the touch-and-drag operation is located within a predetermined region and whether a dragging distance of the touch-and-drag operation along a predetermined path is over a predetermined distance. If so, it is further determined whether an end point of the touch-and-drag operation is located within one of a plurality of segmented regions of the touch screen, which respectively correspond to a plurality of application programs. If so, the screen is unlocked and the application program corresponding to the segmented region where the end point is located is executed simultaneously. - 07-05-2012
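The unlock flow described in the abstract above (20120174042) — check that the drag starts in a predetermined region, travels far enough, and ends in one of several segmented regions mapped to applications — can be sketched as follows. All regions, thresholds, and application names here are invented for illustration; they are not taken from the patent.

```python
# Hypothetical sketch of a drag-to-unlock-and-launch check; every region,
# threshold, and app name below is an invented example.
import math

START_REGION = (0, 0, 100, 100)   # x, y, width, height of the unlock handle
MIN_DRAG_DISTANCE = 150           # minimum drag length in pixels
# Segmented end regions, each mapped to an application.
END_REGIONS = {
    (200, 0, 100, 100): "camera",
    (200, 120, 100, 100): "phone",
    (200, 240, 100, 100): "messages",
}

def _inside(point, region):
    x, y = point
    rx, ry, rw, rh = region
    return rx <= x < rx + rw and ry <= y < ry + rh

def handle_drag(start, end):
    """Return the app to launch, or None if the screen stays locked."""
    if not _inside(start, START_REGION):
        return None                       # drag did not start on the handle
    if math.dist(start, end) < MIN_DRAG_DISTANCE:
        return None                       # drag too short to unlock
    for region, app in END_REGIONS.items():
        if _inside(end, region):
            return app                    # unlock and launch this app
    return None
```

A drag from (50, 50) to (250, 150) starts on the handle, covers about 224 px, and ends in the second segmented region, so `handle_drag` would report that region's app.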
20100017758 - PROCESSING FOR DISTINGUISHING PEN GESTURES AND DYNAMIC SELF-CALIBRATION OF PEN-BASED COMPUTING SYSTEMS - Systems, methods, and computer-readable media process and distinguish user input device gestures, such as gestures input via a pen in a pen-based computing system, e.g., to quickly and reliably distinguish between electronic ink entry, single taps, double taps, press-and-hold actions, dragging operations, and the like. Systems, methods, and computer-readable media also are provided for dynamically calibrating a computer system, e.g., calibrating a displayed input panel view based on input data recognized and received by a digitizer. Such systems and methods may operate without entering a dedicated or special calibration application, program, or routine. - 01-21-2010
20090193366 - GRAPHICAL USER INTERFACE FOR LARGE-SCALE, MULTI-USER, MULTI-TOUCH SYSTEMS - A method implemented on a graphical user interface device to invoke an independent, user-localized menu in an application environment by making a predetermined gesture with a pointing device on an arbitrary part of a display screen or surface, especially when applied in a multi-touch, multi-user environment and in environments where multiple concurrent pointing devices are present. As an example, the user may trace out a closed loop of a specific size that invokes a default system menu at any location on the surface, even when a second user may be operating a different portion of the system elsewhere on the same surface. As an additional aspect of the invention, the method allows the user to smoothly transition between menu invocation and menu control. - 07-30-2009
20120266109 - MULTI-DIMENSIONAL BOUNDARY EFFECTS - Multi-dimensional boundary effects provide visual feedback to indicate that boundaries in user interface elements (e.g., web pages, documents, images, or other elements that can be navigated in more than one dimension) have been reached or exceeded (e.g., during horizontal scrolling, vertical scrolling, diagonal scrolling, or other types of movement). A compression effect can be displayed to indicate that movement has caused one or more boundaries (e.g., a horizontal boundary and/or a vertical boundary) of a UI element to be exceeded. Exemplary compression effects include compressing content along a vertical axis when a vertical boundary has been exceeded and compressing content along a horizontal axis when a horizontal boundary has been exceeded. - 10-18-2012
20080256494 - TOUCHLESS HAND GESTURE DEVICE CONTROLLER - A simple user interface for touchless control of electrically operated equipment. Unlike other systems, which depend on distance to the sensor or on sensor selection, this system depends on hand and/or finger motions: a hand wave in a certain direction, a flick of the hand in one area, holding the hand in one area, or pointing with one finger, for example. The device is based on optical pattern recognition using a solid-state optical matrix sensor with a lens to detect hand motions. This sensor is then connected to a digital image processor, which interprets the patterns of motion and outputs the results as signals to control fixtures, appliances, machinery, or any device controllable through electrical signals. - 10-16-2008
20100077361 - Method of displaying multiple points of interest on a personal navigation device - A method of displaying points of interest in a personal navigation device includes displaying a map on a display of the personal navigation device, receiving touch input at a touched position of the display, and searching an area within a search radius of the touched position for points of interest. The method also includes displaying points of interest located in the area within the search radius, where the found points of interest are represented by icons connected to their locations on the map with a line extending out from the touched position, and spreading out the icons around the touched position to separate the icons from each other. - 03-25-2010
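The icon-spreading step described in the abstract above (20100077361) — separating overlapping point-of-interest icons by fanning them out around the touch point — can be sketched as placing the icons at equal angles on a ring around the touched position. The ring radius and layout are invented here; the patent does not specify this particular geometry.

```python
# Hypothetical sketch of fanning POI icons out around a touch point.
# Icons are placed evenly on a ring; each would then be connected by a
# line back to its true map location. All parameters are invented.
import math

def spread_icons(touch, n_pois, ring_radius=80):
    """Return n_pois icon positions evenly spaced on a ring around `touch`."""
    cx, cy = touch
    return [
        (cx + ring_radius * math.cos(2 * math.pi * i / n_pois),
         cy + ring_radius * math.sin(2 * math.pi * i / n_pois))
        for i in range(n_pois)
    ]
```

For four results around a touch at the origin, the icons land at 0°, 90°, 180°, and 270° on the ring, so none of them overlap regardless of how tightly the underlying POIs cluster.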
20100115473 - ASSOCIATING GESTURES ON A TOUCH SCREEN WITH CHARACTERS - The present invention provides methods for associating a gesture, in contact with a touch screen, with a character. More specifically, the present invention links a user's movement on a surface of a device to represent a character. A character includes any number, letter, or symbol. For example, in an illustrative embodiment of the present invention, a user may swipe a surface on a device such as a cell phone. The present invention recognizes the swipe to represent the number “0,” a swipe in another direction to represent the number “1,” a tap in the middle region to represent the number “2,” etc. - 05-06-2010
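The gesture-to-character mapping described in the abstract above (20100115473) can be sketched as classifying a touch trace by its displacement: a short movement counts as a tap, a longer one as a directional swipe, each mapped to a digit. The direction-to-character table and the tap radius below are invented for illustration.

```python
# Hypothetical sketch of mapping a touch gesture to a character.
# The character table and thresholds are invented examples.
import math

SWIPE_CHARS = {"right": "0", "left": "1"}  # invented direction -> digit table
TAP_CHAR = "2"                             # tap in the middle region
TAP_RADIUS = 10    # a movement shorter than this counts as a tap

def gesture_to_char(start, end):
    """Classify a touch trace from `start` to `end` as a character."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    if math.hypot(dx, dy) < TAP_RADIUS:
        return TAP_CHAR                     # barely moved: a tap
    if abs(dx) >= abs(dy):                  # mostly horizontal movement
        return SWIPE_CHARS["right" if dx > 0 else "left"]
    return None    # vertical swipes left unmapped in this sketch
```

A rightward swipe yields "0", a leftward swipe "1", and a near-stationary touch "2", mirroring the three examples in the abstract.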
20130086533 - DEVICE FOR INTERACTING WITH REAL-TIME STREAMS OF CONTENT - An end-user system ( - 04-04-2013
20130086531 - COMMAND ISSUING DEVICE, METHOD AND COMPUTER PROGRAM PRODUCT - According to an embodiment, a command issuing device includes an acquiring unit configured to acquire a moving image by capturing a hand of an operator; a projection area recognizer configured to recognize a projection area of a projection finger in the moving image; a projector configured to project one of the pictures of a graphical user interface (GUI) onto the projection area; an operation area recognizer configured to recognize an operation area of an operation finger in the moving image; a selection determining unit configured to measure an overlapping degree of the projection area and the operation area to determine whether or not the GUI is selected; and an issuing unit configured to issue a command associated with the GUI when it is determined that the GUI is selected. - 04-04-2013
20130086532 - TOUCH DEVICE GESTURES - A system and method for facilitating employing touch gestures to control or manipulate a web-based application. The example method includes employing a browser running on a device with a touch-sensitive display to access content provided via a website; determining a context associated with the content, including ascertaining one or more user interface controls to be presented via a display screen used to present the content, and providing a first signal in response thereto; receiving touch input from a touch-sensitive display and providing a second signal in response thereto; and using the second signal to manipulate the display screen in accordance with the context associated with the content presented via the display screen. A library of touch gestures can represent common functions through touch movement patterns. These gestures may be context sensitive so as not to conflict with default touch tablet gestures. - 04-04-2013
20130036389 - COMMAND ISSUING APPARATUS, COMMAND ISSUING METHOD, AND COMPUTER PROGRAM PRODUCT - According to an embodiment, a command issuing apparatus includes an acquiring unit configured to acquire an image obtained by capturing a subject; a detector configured to detect a specific region of the subject from the image; a first setting unit configured to set a specific position indicating a position of the specific region; a second setting unit configured to set a reference position indicating a position that is to be a reference in the image; a first calculator configured to calculate a position vector directing toward the specific position from the reference position; a second calculator configured to calculate, for each of a plurality of command vectors respectively corresponding to predetermined commands, a first parameter indicating a degree of coincidence between the command vector and the position vector; and an issuing unit configured to issue the command based on the first parameter. - 02-07-2013
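The vector-matching scheme described in the abstract above (20130036389) — a position vector from a reference point toward a detected region, compared against per-command vectors by a "degree of coincidence" — can be sketched with cosine similarity as the coincidence measure. The choice of cosine similarity, the command table, and the threshold are all assumptions made here for illustration; the patent does not name a specific formula.

```python
# Hypothetical sketch of issuing a command by comparing a position vector
# against predefined command vectors; cosine similarity stands in for the
# abstract's "degree of coincidence". The command table is invented.
import math

COMMAND_VECTORS = {
    "volume_up": (0, -1),    # e.g. hand above the reference point
    "volume_down": (0, 1),
    "next_track": (1, 0),
    "prev_track": (-1, 0),
}

def cosine(u, v):
    dot = u[0] * v[0] + u[1] * v[1]
    return dot / (math.hypot(*u) * math.hypot(*v))

def issue_command(reference, specific, threshold=0.8):
    """Pick the command whose vector best matches the position vector."""
    pos = (specific[0] - reference[0], specific[1] - reference[1])
    if pos == (0, 0):
        return None          # no displacement: nothing to compare against
    best, score = max(
        ((cmd, cosine(vec, pos)) for cmd, vec in COMMAND_VECTORS.items()),
        key=lambda p: p[1],
    )
    return best if score >= threshold else None
```

A hand detected straight above the reference point aligns perfectly with the `volume_up` vector (similarity 1.0), while a diagonal displacement matches no command well enough to clear the threshold.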
20100042954 - Motion based input selection - A method for selecting an input value based on sensed motion is provided. In one embodiment, the method includes varying a graphical element displayed on a handheld device in response to sensed motion to identify an input value. The motion-based input may be used to perform a function on the handheld device or on an external device. For example, the input may be used to open a lock or to rotate a displayed image. Various additional methods, devices, and systems employing motion-based inputs are also provided. - 02-18-2010
20090158220 - DYNAMIC THREE-DIMENSIONAL OBJECT MAPPING FOR USER-DEFINED CONTROL DEVICE - A computer-implemented method is provided to interactively capture and utilize a three-dimensional object as a controlling device for a computer system. One operation of the method is capturing depth data of the three-dimensional object. In another operation, the depth data of the three-dimensional object undergoes processing to create geometric defining parameters for the three-dimensional object. The method can also include defining correlations between particular actions performed with the three-dimensional object and particular actions to be performed by the computer system. The method also includes an operation to save the geometric defining parameters of the three-dimensional object to a recognized object database. In another operation, the correlations between particular actions performed with the three-dimensional object and the particular actions to be performed by the computer system in response to recognizing those actions are also saved to the recognized object database. - 06-18-2009
20090158219 - ENGINE SUPPORT FOR PARSING CORRECTION USER INTERFACES - A parsing system provides a parsed document to a user application, labeling the document with indication symbols according to a scheme associated with the parsing results. Users are enabled to insert correction indicators such as handwritten gestures, icon selections, menu item selections, and the like in conjunction with the indication symbols. The document is re-analyzed performing the requested corrections, such as line or block separations; line, block, or word connections; etc. The operations provide support for the engine stack of the parsing system while accommodating independent user interfaces employed by the users. Insertion of correction indicators and subsequent re-analysis for correction may be performed upon user signal, in an iterative manner, or continuously. - 06-18-2009
20100333044 - Gesture-based Interface System and Method - Systems and methods of manipulating display parameters of displayed images, and optionally designating the images for manipulation, via a gesture pad. - 12-30-2010
20130042209 - System and Method for Providing Direct Access to an Application when Unlocking a Consumer Electronic Device - A consumer electronic device has an orientation sensor and a lock control. The orientation sensor outputs signals identifying the orientation of the device, while the lock control allows a user to move the device from a locked state to an unlocked state. The device also includes a plurality of application programs stored in memory. Responsive to the user unlocking the device, a controller launches a selected application program. The application that is launched by the device is based on the orientation of the device. - 02-14-2013
20120216151 - Using Gestures to Schedule and Manage Meetings - Techniques and configurations for an apparatus are provided for creating and managing meetings using gestures. Movements of a user's hand in a three-dimensional space are detected. The hand movements in the three-dimensional space are interpreted to identify a gesture intended by the user to set up or manage a meeting among a plurality of persons. An electronic command is generated from the detected gesture to set up or manage the meeting. - 08-23-2012
20120185806 - INTERACTION DATA DISPLAY APPARATUS, PROCESSING APPARATUS AND METHOD FOR DISPLAYING THE INTERACTION DATA - Disclosed herein is a system for continuously sensing mass data from members of an organization with respect to the states of their meetings and activities, and for analyzing and evaluating their interactions according to those sensor data. Interaction data includes first information denoting whether or not a terminal unit has faced a different terminal unit and second information denoting a state of the terminal unit and excluding information denoting positions of the terminal unit and the first information. An interaction data display apparatus includes a receiving unit for receiving interaction data from the terminal unit and a display unit for displaying the received interaction data. The display unit displays the first and second information items included in the interaction data received by the receiving unit so as to be related to each other according to the times at which those first and second information items are obtained, respectively. - 07-19-2012
20090125848 - TOUCH SURFACE-SENSITIVE EDIT SYSTEM - A method, medium and implementing processing system are provided in which displayed text is manipulated using two fingers within an editing application to select a region of text or objects. In an example, two fingers are placed on a touch-sensitive display or touch pad and the region of text between the fingers is selected. The selected text can be manipulated as otherwise selected text is currently manipulated, e.g. cut, paste and copy functions can be performed. The movement of the fingers also performs this manipulation. In one example, if the fingers are brought together, the selected text is cut, or a split screen could occur. If the fingers are placed together and then parted, the action would be to part the text to make room for a picture or other insert. - 05-14-2009
20090125849 - Eye Tracker with Visual Feedback - The present invention relates to entry of control commands into a computer in response to eye-tracker detected movement sequences of a point of regard over a graphical display, which is associated with the computer. A processing module in the computer causes the display to present graphical feedback information in the form of a data-manipulating window, which visually confirms any entered control commands. The data-manipulating window is presented at a position relative to an active control object on the display, such that a center point of the window is located within a relatively small offset distance from a center point of the active control object. The window includes graphical information, which symbolizes an activity portion of the display presently being the object of an eye-tracker-controlled entry of control commands. Moreover, the information in the window is repeatedly updated in response to the eye-tracker-controlled entry of control commands. - 05-14-2009
20100095251 - LINKAGE BETWEEN MOTION SENSING AND POSITION APPLICATIONS IN A PORTABLE COMMUNICATION DEVICE - Criteria for movement of a mobile communication device that can be initiated by the user are defined. A criterion can be stored as a data characteristic in device memory. Motion of the device can be sensed to determine, by the device controller, whether the sensed motion meets the defined criterion. The sensed motion may be derived from an accelerometer, or equivalent means, in the device. If the sensed motion is determined by the controller to match stored criterion data, the controller triggers activation of an application that is dependent on the location of the device. A stored application associated with the matched data characteristic is accessed from one or more stored applications respectively associated in memory with stored data characteristics. - 04-15-2010
20100095250 - Facilitating Interaction With An Application - An apparatus for facilitating interaction with an application includes a memory and logic. The memory stores image data generated by an instance of an application. The logic repeats the following for each user of a number of users: receives a sensor signal representing a gesture performed by a user and indicating a user instruction; modifies the image data according to the user instruction; and sends the image data to initiate a display of an image according to the user instruction. - 04-15-2010
20130047126 - SWITCHING BACK TO A PREVIOUSLY-INTERACTED-WITH APPLICATION - This document describes techniques and apparatuses for switching back to a previously-interacted-with application. In some embodiments, these techniques and apparatuses enable selection of a user interface not currently exposed on a display through a simple gesture that is both easy to use and easy to remember. - 02-21-2013
20130047125 - TOUCHSCREEN GESTURES FOR VIRTUAL BOOKMARKING OF PAGES - A system and method are disclosed for navigating an electronic document using a touch-sensitive display screen with gestures that are reminiscent of physically handling the pages of a conventional, bound document. A user may temporarily bookmark one or more selected pages by touching the touchscreen with a finger when the pages are displayed, to mimic using a finger to hold a selected page of a conventional, bound document. Predefined gestures may be associated with different functions, such as returning to a bookmarked page or removing a bookmark. - 02-21-2013
20100070932 - VEHICLE ON-BOARD DEVICE - A vehicle on-board device includes a user interface device and a processing section. The user interface device is mounted inside of a vehicle, and configured and arranged to output information to a user and to receive a user input. The processing section is operatively coupled to the user interface device, and configured to perform a prescribed function in response to a prescribed user operation received by the user interface device. The processing section is further configured to perform an interactive tutorial control to provide the user with at least one interactive instruction for the prescribed function, in which the processing section prompts the user to input the prescribed user operation, determines whether the user input received by the user interface device matches the prescribed user operation, and completes the interactive tutorial control when the user input matches the prescribed user operation. - 03-18-2010
20090313587 - METHOD AND APPARATUS FOR PROVIDING MOTION ACTIVATED UPDATING OF WEATHER INFORMATION - An approach provides updating of weather information on a mobile device. Motion of a mobile device is detected, wherein the mobile device is configured to execute a weather application for presenting weather information to a user. An update of the weather information is retrieved in response to the detected motion. - 12-17-2009
20120192118 - Device, Method, and Graphical User Interface for Navigating through an Electronic Document - An electronic device with a display and a touch-sensitive surface stores a document having primary content, supplementary content, and user-generated content. The device displays a representation of the document in a segmented user interface on the display. Primary content of the document is displayed in a first segment of the segmented user interface and supplementary content of the document is concurrently displayed in a second segment of the segmented user interface distinct from the first segment. The device receives a request to view user-generated content of the document. In response to the request, the device maintains display of the previously displayed primary content, ceases to display at least a portion of the previously displayed supplementary content, and displays user-generated content of the document in a third segment of the segmented user interface distinct from the first segment and the second segment. - 07-26-2012
20120192117 - Device, Method, and Graphical User Interface with a Dynamic Gesture Disambiguation Threshold - An electronic device with a display, a touch-sensitive surface, one or more processors, and memory detects a first portion of a gesture, and determines that the first portion has a first gesture characteristic. The device selects a dynamic disambiguation threshold in accordance with the first gesture characteristic. The dynamic disambiguation threshold is used to determine whether to perform a first type of operation or a second type of operation when a first kind of gesture is detected. The device determines that the gesture is of the first kind of gesture. After selecting the dynamic disambiguation threshold, the device determines whether the gesture meets the dynamic disambiguation threshold. When the gesture meets the dynamic disambiguation threshold, the device performs the first type of operation, and when the gesture does not meet the dynamic disambiguation threshold, the device performs the second type of operation. - 07-26-2012
20120192116 - Pinch Zoom Velocity Detent - An information handling system includes a gesture sensitive interface and a processor. The processor is configured to receive inputs from the gesture sensitive interface corresponding to first and second interaction points, determine a relative motion between the first and second interaction points, and obtain a velocity of the relative motion. The processor is further configured to determine if the velocity exceeds a threshold, and scale an image on a display from an initial magnification to a predetermined magnification when the velocity exceeds a threshold. - 07-26-2012
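The detent behavior described in the abstract above (20120192116) — snap to a preset magnification when the pinch velocity between the two touch points exceeds a threshold, otherwise scale continuously — can be sketched as follows. The threshold, the preset magnification, and the continuous-scaling rule are invented values for illustration only.

```python
# Hypothetical sketch of a pinch-zoom velocity detent: a fast pinch or
# expand snaps to a preset zoom level; a slow one scales continuously.
# All constants are invented examples.
VELOCITY_THRESHOLD = 500.0   # px/s of relative motion between the two points
SNAP_MAGNIFICATION = 2.0     # preset zoom level reached by the detent

def next_zoom(current_zoom, d0, d1, dt):
    """d0, d1: distance between the two touch points at the start and end
    of a sampling interval of dt seconds. Returns the new magnification."""
    velocity = (d1 - d0) / dt
    if abs(velocity) > VELOCITY_THRESHOLD:
        # Fast gesture: snap to the detent (expand) or back to 1x (pinch).
        return SNAP_MAGNIFICATION if velocity > 0 else 1.0
    # Slow gesture: scale proportionally with the change in finger distance.
    return current_zoom * (d1 / d0)
```

Doubling the finger distance in 50 ms (a relative velocity of 2000 px/s) trips the detent and snaps straight to the preset magnification, whereas the same distance change over a full second scales the image smoothly.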
20090094561Displaying Personalized Documents To Users Of A Surface Computer - Methods, apparatus, and products are disclosed for displaying personalized documents to users of a surface computer, the surface computer comprising a surface, surface computer capable receiving multi-touch input through the surface and rendering display output on the surface, that include: registering a plurality of users with the surface computer; assigning, to each registered user, a portion of the surface for interaction between that registered user and the surface computer; selecting a user profile for each registered user; creating, for each registered user from a content repository, personalized display content for that registered user in dependence upon the user profile selected for that registered user; and rendering the personalized display content for each registered user on that user's assigned portion of the surface.04-09-2009
20130074015SYSTEM TO SCROLL TEXT ON A SMALL DISPLAY SCREEN - The present invention is a system to scroll text on a small display screen that includes a screen finger sensor that detects when a selected one of a user's finger and a stylus is dragged across the small display screen and a page, sets a scroll speed, will scroll to an edge of the page and move down a sentence and repeat to scroll, reaches a bottom of the page and will move to a first sentence of a next consecutive page. The system also includes a logic controller that detects the text is displayed and sets the screen finger sensor to await activation of the system and is typically a software module or a downloaded app.03-21-2013
20130074014COLLABORATIVE GESTURE-BASED INPUT LANGUAGE - In one example, a method includes receiving, by a server, data representative of a group of gestures detected by the plurality of computing devices and data representative of one or more shortcuts associated with the group of gestures from a plurality of computing devices, wherein each shortcut corresponds to an action performed by at least one of the computing devices. The method may further include aggregating, by the server, the data representative of the gestures and the data representative of the associated shortcuts received from the plurality of computing devices based at least in part on detected similarities between at least one of 1) the group of gestures and 2) the associated shortcuts, and defining, by the server, a gesture-shortcut language based at least in part on the aggregated data.03-21-2013
20130067420Semantic Zoom Gestures - Semantic zoom techniques are described. In one or more implementations, techniques are described that may be utilized by a user to navigate to content of interest. These techniques may also include a variety of different features, such as to support semantic swaps and zooming “in” and “out.” These techniques may also include a variety of different input features, such as to support gestures, cursor-control device, and keyboard inputs. A variety of other features are also supported as further described in the detailed description and figures.03-14-2013
20120117517USER INTERFACE - A method of controlling a user interface comprising the steps of: detecting movement of a contact continuously between a first contact point at which contact is made and a second contact point at which contact is released; determining a state of a selection/de-selection mode of operation in dependence on: a line traced between the first contact point and the second contact point traversing one of more objects of an application layer; the first contact point not being co-incident with an object of the application layer; and the second contact point not being coincident with an object of the application layer.05-10-2012
20130067422TERMINAL CAPABLE OF CONTROLLING ATTRIBUTE OF APPLICATION BASED ON MOTION AND METHOD THEREOF - A terminal controls an application attribute based on a motion. The terminal includes a motion measurement unit, an attribute mapping unit, an attribute control unit, and a controller. The motion measurement unit measures a motion of the terminal. The attribute mapping unit classifies motion directions of the terminal, classifies a motion degree, and maps an attribute type and a control strength of an application installed in the terminal in response to each motion direction and each motion degree. The attribute control unit activates an attribute control of the application based on a motion of the terminal. When the attribute control of the application is activated by the attribute control unit and the motion of the terminal is measured by the motion measurement unit, the controller controls an attribute of the application based on the attribute type and the control strength mapped by the attribute mapping unit.03-14-2013
20130067419Gesture-Enabled Settings - Techniques (03-14-2013
20110022992METHOD FOR MODIFYING A REPRESENTATION BASED UPON A USER INSTRUCTION - The invention relates to a method for modifying a representation based upon a user instruction and a system for producing a modified representation by said method. Conventional drawing systems, such as pen and paper and writing tablets, require a reasonable degree of drawing skill which not all users possess. Additionally, these conventional systems produce static drawings. The method of the invention comprises receiving a representation from a first user, associating the representation with an input object classification, receiving an instruction from a second user, associating the instruction with an animation classification, determining a modification of the representation using the input object classification and the animation classification, and modifying the representation using the modification. When the first user provides a representation of something, for example a character in a story, it is identified to a certain degree by associating it with an object classification. In other words, the best possible match is determined. As the second user imagines a story involving the representation, dynamic elements of the story are exhibited in one or more communication forms such as writing, speech, gestures, and facial expressions. By deriving an instruction from these signals, the representation may be modified, or animated, to illustrate the dynamic element in the story. This improves the feedback to the users, and increases the enjoyment of the users.01-27-2011
20110022991TOUCH DETECTING INTERACTIVE DISPLAY - The invention provides an interactive display that is controlled by user gestures identified on a touch detecting display surface. In the preferred embodiment of the invention, imagery is projected onto a horizontal projection surface from a projector located above the projection surface. The locations where a user contacts the projection surface are detected using a set of infrared emitters and receivers arrayed around the perimeter of the projection surface. For each contact location, a computer software application stores a history of contact position information and, from the position history, determines a velocity for each contact location. Based upon the position history and the velocity information, gestures are identified. The identified gestures are associated with display commands that are executed to update the displayed imagery accordingly. Thus, the invention enables users to control the display through direct physical interaction with the imagery.01-27-2011
20130067421Secondary Actions on a Notification - Various embodiments enable notifications to be generated in both touch and non-touch environments. In at least some embodiments, a notification window is presented and a drag operation can reveal one or more secondary actions that can be performed. In at least some embodiments, selection of one or more of the secondary actions can occur independent of, and without utilizing additional special affordances, such as buttons.03-14-2013
20090241072Unlocking a Device by Performing Gestures on an Unlock Image - A device with a touch-sensitive display may be unlocked via gestures performed on the touch-sensitive display. The device is unlocked if contact with the display corresponds to a predefined gesture for unlocking the device. The device displays one or more unlock images with respect to which the predefined gesture is to be performed in order to unlock the device. The performance of the predefined gesture with respect to the unlock image may include moving the unlock image to a predefined location and/or moving the unlock image along a predefined path. The device may also display visual cues of the predefined gesture on the touch screen to remind a user of the gesture.09-24-2009
20130167093DISPLAY APPARATUS FOR RELEASING LOCKED STATE AND METHOD THEREOF - A method and apparatus are provided for releasing a locked state of a display apparatus. A display unit displays a locked view having a line and an affordance object connected to the line. When a user touches a point on the line in the locked view, a control unit controls the display unit to switch to an unlocked view while separating the affordance object by cutting the line at the point on the line.06-27-2013
20130167094Device, Method, and Graphical User Interface for Selection of Views in a Three-Dimensional Map Based on Gesture Inputs - An electronic device displays a first map view of a map that includes one or more map objects on a touch-sensitive display. While displaying the first map view, the device detects a first gesture of a first gesture type at a first location on the touch-sensitive display. The first location corresponds to a respective map object. In response to detecting the first gesture at the first location, the device enters a map view selection mode. While in the map view selection mode, the device detects a second gesture of a second gesture type at a second location on the touch-sensitive display. The second location corresponds to a respective location on the map. In response to detecting the second gesture at the second location, the device replaces the first map view with a second map view that includes a view of the respective map object from the respective location.06-27-2013
20110035708MULTI-TOUCH WALLPAPER MANAGEMENT - A method and apparatus for multi-touch wallpaper management for a mobile computing device are described wherein a first wallpaper image is displayed on a multi-touch-sensitive display of the mobile computing device and a multi-touch gesture is received indicating a request to change the first wallpaper image. In response to the multi-touch gesture, at least a portion of a second wallpaper image is displayed. Other embodiments are described and claimed.02-10-2011
20110041100Method and Device for Touchless Signing and Recognition - A touchless sensor device (02-17-2011
20110314429APPLICATION PROGRAMMING INTERFACES FOR GESTURE OPERATIONS - At least certain embodiments of the present disclosure include an environment with user interface software interacting with a software application to provide gesture operations for a display of a device. A method for operating through an application programming interface (API) in this environment includes transferring a scaling transform call. The gesture operations include performing a scaling transform such as a zoom in or zoom out in response to a user input having two or more input points. The gesture operations also include performing a rotation transform to rotate an image or view in response to a user input having two or more input points.12-22-2011
20120272194METHODS AND APPARATUSES FOR FACILITATING GESTURE RECOGNITION - Methods and apparatuses are provided for facilitating gesture recognition. A method may include constructing a matrix based at least in part on an input gesture and a template gesture. The method may further include determining whether a relationship determined based at least in part on the constructed matrix satisfies a predefined threshold. In an instance in which the relationship does not satisfy the predefined threshold, the method may also include eliminating the template gesture from further consideration for recognition of the input gesture. In an instance in which the relationship satisfies the predefined threshold, the method may further include determining a rotation matrix based at least in part on the constructed matrix. Corresponding apparatuses are also provided.10-25-2012
20110265045ELECTRONIC SYSTEM AND METHOD FOR OPERATING TOUCH SCREEN THEREOF - The disclosure provides an unlock method of an electronic system with a touch screen. The unlock method includes the steps below: receiving a triggering event when the system is locked; activating the touch screen in response to the event; receiving an input gesture; comparing the gesture with a customized gesture; and unlocking the system if the input gesture matches the customized gesture, which is defined by the user and is not a default unlock gesture built into the electronic system.10-27-2011
20110283242REPORT OR APPLICATION SCREEN SEARCHING - Search results may be graphically displayed on a client device as thumbnail images. A search for one or more files in the form of a search term may be received from a client device. The search may be executed based on the search term by searching one or more databases corresponding to applications associated with the client device. One or more files may be identified that satisfy the search term. Metadata associated with the identified files may be processed to generate a thumbnail image of the file based at least in part on the metadata for each of the one or more identified files. The thumbnail images of at least a subset of the identified files may be provided to and displayed on the client device. The associated files may be accessed by the client device.11-17-2011
20120192121BREATH-SENSITIVE DIGITAL INTERFACE - A breath-sensitive digital interface that enables use of a person's breath or other fluid for purposes of navigating digital media, and a method for using such an interface.07-26-2012
20120192120IMAGE FORMING APPARATUS AND TERMINAL DEVICE EACH HAVING TOUCH PANEL - An image forming apparatus includes a touch panel, a controller connected to the touch panel, and a memory. When a pinch-in gesture on the touch panel is detected during execution of an application, the controller stores, in the memory, information showing a state of processing of the application when the pinch-in gesture is detected, and when a pinch-out gesture on the touch panel is detected, the controller reads the stored information showing the state of processing of the application from the memory, and resumes processing of the application from the state shown by the information.07-26-2012
20120192119USB HID DEVICE ABSTRACTION FOR HDTP USER INTERFACES - A method for implementing USB communications providing user interface measurement and detection of at least one gesture and one angle of finger position for a touch-based user interface is disclosed. The method comprises receiving real-time tactile-image information from a tactile sensor array; processing the tactile-image information to detect and measure the variation of one angle of a finger position and to detect at least one gesture producing at least one of a parameter value responsive to the variation in the finger angle and a symbol responsive to a detected gesture. These are mapped to a Universal Serial Bus (USB) Human Interface Device message which is transmitted to a host device over USB hardware for use by an application executing on the host device. The method provides for the incorporation of various configurations, tactile grammars, use with a touch screen, and numerous other features.07-26-2012
20090144668SENSING APPARATUS AND OPERATING METHOD THEREOF - A sensing apparatus is disclosed. The sensing apparatus comprises a first image capturing module, a second image capturing module, a calculating module, and a controlling module. The first image capturing module and the second image capturing module capture a first image and a second image related to a plurality of objects respectively at a specific time. The calculating module obtains a 3-D position of an object according to the first image and the second image and obtains a 3-D displacement of the object according to the 3-D position and a former 3-D position of the object. If any one of the 3-D displacements corresponding to the objects is approximately vertical, the controlling module controls an electrical apparatus to perform a first function. If a weighted average of approximately horizontal vector components of all 3-D displacements meets a condition, the controlling module controls the electrical apparatus to perform a second function.06-04-2009
20110302538SYSTEM AND METHOD FOR DISTINGUISHING MULTIMODAL COMMANDS DIRECTED AT A MACHINE FROM AMBIENT HUMAN COMMUNICATIONS - A method and system of distinguishing multimodal HCI from ambient human interactions using wake up commands is disclosed. In one embodiment, in a method of distinguishing multimodal HCI from ambient human interactions, a wake up command is detected by a computing system. The computing system is then woken up to receive a valid user command from a user upon detecting the wake up command. A countdown timer is substantially simultaneously turned on upon waking up the computing system to receive valid user commands. The countdown timer is set based on application usage parameters such as semantics of the valid user command and context of an application associated with the valid user command.12-08-2011
20110289462Computing Device Magnification Gesture - Computing device magnification gesture techniques are described. In implementations, a first input is recognized as a magnification gesture to initiate magnification of at least a portion of a user interface displayed by a display device of a computing device. The magnified portion is displayed in the user interface as at least partially encompassed by an unmagnified portion of the user interface. A second input is recognized as specifying a modification to be made to data included in the magnified portion of the user interface, the second input recognized as occurring during provision of the first input. Responsive to recognition that the first input is no longer being provided, the display of the magnified portion ceases in the user interface.11-24-2011
20110296357Method For Providing A User Interface Using Three-Dimensional Gestures And An Apparatus Using The Same - Provided is a method capable of making various modifications to widgets, graphic objects, or images, which are displayed on a display device, according to motions of a plurality of input units such as a finger or a stylus pen, with the use of a three-dimensional multi-sensor configured to detect the motions of the input units in a space, without touching the display device.12-01-2011
20110296356Unlocking a Device by Performing Gestures on an Unlock Image - A device with a touch-sensitive display may be unlocked via gestures performed on the touch-sensitive display. The device is unlocked if contact with the display corresponds to a predefined gesture for unlocking the device. The device displays one or more unlock images with respect to which the predefined gesture is to be performed in order to unlock the device. The performance of the predefined gesture with respect to the unlock image may include moving the unlock image to a predefined location and/or moving the unlock image along a predefined path. The device may also display visual cues of the predefined gesture on the touch screen to remind a user of the gesture.12-01-2011
20110296355Techniques for self adjusting kiosk display information - Techniques for self adjusting kiosk display information are provided. Presentation information is centered within a display of the kiosk. A center location for the presentation information is custom recalibrated within the display based on the direction of a user.12-01-2011
20100005427Systems and Methods of Touchless Interaction - A contactless display system enables a user to interact with a displayed image by moving a finger, or pointer, toward a selected portion of the image. Images can be enlarged, or translated dynamically in response to detected movement. Operational methodology can be manually switched between contact-type and contactless operation to enhance flexibility.01-07-2010
20130219345APPARATUS AND ASSOCIATED METHODS - An apparatus including: at least one processor; and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to perform at least the following: in response to user input, disassociate the linking of the orientation of content with respect to the orientation of a display of a portable electronic device to retain the particular orientation of the content with respect to the orientation of the display during the user input.08-22-2013
20110265046THROWING GESTURES FOR MOBILE DEVICES - At least one tilt sensor generates a sensor value. A context information server, receives the sensor value and sets at least one context attribute. An application uses at least one context attribute to determine that a flinging gesture has been made and to change an image on a display in response to the flinging gesture.10-27-2011
20100169843INPUT APPARATUS, HANDHELD APPARATUS, AND CONTROL METHOD - An input apparatus includes: a motion signal output section to detect a movement of an object for controlling a movement of an image displayed on a screen and output a motion signal corresponding to the movement of the object; a control command generation section to generate a control command corresponding to the motion signal for controlling the movement of the image; an operation signal output section to detect a user operation unintended for the control of the movement of the image and output an operation signal based on the operation; an operation command generation section to generate an operation command based on the operation signal; and a control section to control the control command generation section to generate, in temporal relation with a generation timing of the operation command, the control command with a sensitivity of the movement of the image with respect to the movement of the object changed.07-01-2010
20100169842Control Function Gestures - Techniques involving control function gestures are described. In an implementation, a control function is identified in response to gesture input at a touch screen of the remote control device. Execution of the identified control function is initiated at a client device that is communicatively coupled to the remote control device and that is configured to alter an output of content broadcast to the client device.07-01-2010
20100169841HANDWRITING MANIPULATION FOR CONDUCTING A SEARCH OVER MULTIPLE DATABASES - A method and system are provided for allowing a user of a multifunctional device to search information without going through several menu/button manipulations. More specifically, a multifunctional device comprises a user screen area including a touch sensitive layer to receive user input in order to initiate a search over several local and/or remote databases. The user is allowed to input a free-style handwriting query on a screen of a device to look up information, such as contact information, available applications, wallpapers, ringtones, photos, call logs, etc. After conducting a search, the device presents the search results to the user in such a way that the user can start an intended operation by selecting a search result, such as making a call, starting an application, etc.07-01-2010
20100169840Method For Recognizing And Tracing Gesture - A method for recognizing and tracing a gesture fetches a gesture image by an image sensor. The gesture image is processed for recognizing and tracing, and a corresponding action is performed according to the processed result. The gesture image is pre-processed and then a moved image is detected. The moved image is analyzed to obtain a gesture feature. When the gesture feature corresponds to a moved gesture, a center coordinate of the moved gesture is detected and outputted to control a cursor. When the gesture feature corresponds to a command gesture, a relevant action command is outputted. Therefore, the method provides cursor movement and command input by user gesture.07-01-2010
20100031201PROJECTION OF A USER INTERFACE OF A DEVICE - A method may include projecting, by a device, content on a surface, determining an orientation of the device, detecting a movement of the device, determining an operation that corresponds to the movement and interacts with the content, and performing the operation.02-04-2010
20100146462INFORMATION PROCESSING APPARATUS AND METHOD - An information processing apparatus having a touch-sensitive panel and processing a gesture input performed via the touch-sensitive panel accepts an instruction from a user for transitioning from a first processing state to a second processing state; sets a number of gesture-input-based operations in accordance with the instruction accepted; and executes corresponding processing as a gesture input in the second processing state with regard to gesture inputs of the number of operations set. The information processing apparatus executes corresponding processing as a gesture input in the first processing state with regard to a gesture input after the gesture inputs of the number of operations have been performed.06-10-2010
201001258173D INTERFACE APPARATUS AND INTERFACING METHOD USING THE SAME - A 3D interface apparatus which is operated based on motion and an interfacing method using the same are provided. The interface apparatus includes a motion sensor, a controller which determines a wind property, and a wind generation module which generates a wind. Accordingly, a user is allowed to manipulate a GUI more easily, conveniently, and intuitively.05-20-2010
20120036485Motion Driven User Interface - A motion driven user interface for a mobile device is described which provides a user with the ability to cause execution of user interface input commands by physically moving the mobile device in space. The mobile device uses embedded sensors to identify its motion which causes execution of a corresponding user interface input command. Further, the command to be executed can vary depending upon the operating context of the mobile device.02-09-2012
20100281435SYSTEM AND METHOD FOR MULTIMODAL INTERACTION USING ROBUST GESTURE PROCESSING - Disclosed herein are systems, computer-implemented methods, and tangible computer-readable media for multimodal interaction. The method includes receiving a plurality of multimodal inputs associated with a query, the plurality of multimodal inputs including at least one gesture input, editing the at least one gesture input with a gesture edit machine. The method further includes responding to the query based on the edited gesture input and remaining multimodal inputs. The gesture inputs can be from a stylus, finger, mouse, and other pointing/gesture device. The gesture input can be unexpected or errorful. The gesture edit machine can perform actions such as deletion, substitution, insertion, and aggregation. The gesture edit machine can be modeled as a finite-state transducer. In one aspect, the method further includes generating a lattice for each input, generating an integrated lattice of combined meaning of the generated lattices, and responding to the query further based on the integrated lattice.11-04-2010
20100281436BINDING USERS TO A GESTURE BASED SYSTEM AND PROVIDING FEEDBACK TO THE USERS - Techniques for managing a set of states associated with a capture device are disclosed herein. The capture device may detect and bind to users, and may provide feedback about whether the capture device is bound to, or detecting a user. Techniques are also disclosed wherein virtual ports may be associated with users bound to a capture device and feedback about the state of virtual ports may be provided.11-04-2010
20090094562MENU DISPLAY METHOD FOR A MOBILE COMMUNICATION TERMINAL - A mobile terminal comprising a display module to display a tag and to display a menu screen image related to the tag at one portion of a background image as the tag is dragged, the menu screen image being displayed according to a dragging direction and a dragging distance; an input unit to detect a touch input with respect to the display module or the tag to determine the dragging direction and the dragging distance; and a controller to expose or hide the menu screen image according to the dragging direction and the dragging distance of the inputted touch.04-09-2009
20090094560HANDLE FLAGS - The claimed subject matter provides techniques to effectuate and facilitate efficient and flexible selection of display objects. The system can include devices and components that acquire gestures from pointing instrumentalities and thereafter ascertain velocities and proximities in relation to the displayed objects. Based at least upon these ascertained velocities and proximities falling below or within threshold levels, the system displays flags associated with the display object.04-09-2009
20100083189METHOD AND APPARATUS FOR SPATIAL CONTEXT BASED COORDINATION OF INFORMATION AMONG MULTIPLE DEVICES - The invention includes a method and apparatus for coordinating transfer of information between ones of a plurality of devices including a coordinating device and at least one other device. In one embodiment, a method includes detecting selection of an item available at a first one of the devices, detecting a gesture-based command for the selected item, identifying a second one of the devices based on the gesture-based command and a spatial relationship between the coordinating device and the second one of the devices, and initiating a control message adapted for enabling the first one of the devices to propagate the selected item toward the second one of the devices. The first one of the devices on which the item is available may be the coordinating device or another device.04-01-2010
20100088654ELECTRONIC DEVICE HAVING A STATE AWARE TOUCHSCREEN - An electronic device having a touchscreen display. A graphical user interface (GUI) is displayed on the touchscreen display that includes a user interface element displayed in a default state at a location, the user interface element being associated with a function. The user interface element is changed from the default state to a first state upon detecting a first input event at the location. The user interface element is changed from the first state to a second state upon detecting a second input event at the location.04-08-2010
20100088653PORTABLE ELECTRONIC DEVICE AND METHOD OF CONTROLLING SAME - A method of controlling a portable electronic device that has a touch screen display includes rendering a graphical user interface including a plurality of selectable features, detecting a first touch event at a first location on the touch screen display, detecting a second touch event at a second location on the touch screen display during the first touch event, and selecting ones of the plurality of selectable features located in an area having boundaries defined by the first location and the second location to define a group of selected features.04-08-2010
20090307634User Interface, Device and Method for Displaying a Stable Screen View - A user interface configured to display a screen view representing an application and to receive motion data representing a detected movement, said user interface being further configured to update said displayed screen view to visually counteract said detected movement.12-10-2009
20100100855HANDHELD TERMINAL AND METHOD FOR CONTROLLING THE HANDHELD TERMINAL USING TOUCH INPUT - A handheld terminal includes a coordinate recognizer to recognize a first coordinate on a screen where a touch starts and to recognize a second coordinate on the screen where the touch ends, a function identifier to identify a function corresponding to the pair of coordinates, and a function performer to perform the identified function. The first and second coordinates may respectively correspond to a service icon displayed at or near the first coordinate and a process area displayed at or near the second coordinate and associated with the identified function. A method for controlling a handheld terminal includes recognizing a first coordinate on a screen where a touch starts and a second coordinate on the screen where the touch ends, identifying a function corresponding to the first coordinate and the second coordinate, and performing the identified function.04-22-2010
20100083191METHOD AND APPARATUS FOR DISPLAYING CONTENT AT A MOBILE DEVICE - The invention relates to displaying content at a mobile device. In some embodiments, a method and apparatus are arranged to display content at a mobile device in association with a graphical device indicating a method of user interaction associated with the content. The mobile device is arranged to detect, at a detector such as a motion detector or on a touch screen of the mobile device, a user interaction and to determine whether the user interaction corresponds to the indicated method of interaction. The method and apparatus are arranged to perform an action relating to the content in response to the detection of the indicated method of interaction.04-01-2010
20100083188COMPUTER USER INTERFACE SYSTEM AND METHODS - Systems and methods may provide user control of a computer system via one or more sensors. Also, systems and methods may provide automated response of a computer system to information acquired via one or more sensors. The sensor(s) may be configured to measure distance, depth proximity and/or presence. In particular, the sensor(s) may be configured to measure a relative location, distance, presence, movements and/or gestures of one or more users of the computer system. Thus, the systems and methods may provide a computer user interface based on measurements of distance, depth, proximity, presence and/or movements by one or more sensors. For example, various contexts and/or operations of the computer system, at the operating system level and/or the application level, may be controlled, automatically and/or at a user's direction, based on information acquired by the sensor(s).04-01-2010
20100083190TOUCH GESTURE INTERFACE APPARATUSES, SYSTEMS, AND METHODS - In certain embodiments, an object touch is detected on a touch screen display, a touch gesture interface is displayed on the touch screen display in response to the object touch, a touch gesture is detected on the touch screen display, and an action is performed based on the touch gesture. In certain embodiments, the touch gesture includes a directional touch gesture in a direction away from a position of the object touch on a surface of the touch screen display. In certain embodiments, the touch gesture interface includes a plurality of selectable options, and the action includes one of navigating through the selectable options and selecting one of the selectable options.04-01-2010
20090089716Automatic communication notification and answering method in communication correspondance - A communication correspondence notification and reply method is provided. The method is implemented as a software program with the objective of being less distracting and increasing work productivity compared to prior methods. In particular, a notification format for incoming communication correspondences is determined, without any guidance/input from the user, taking into account (i) monitored user activity and (ii) the type of incoming correspondence, i.e. the notification format is a function of tracked/monitored user activity and the message type with the objective to minimize distraction to the user. To further minimize user distraction, the software program determines an area on the display of the computer system where the incoming correspondence can be presented to the user. Once presented, the user then has the ability to reply with minimal effort by making a pointer-device gesture movement in reply to the presented notification.04-02-2009
20090089717MOBILE TERMINAL AND METHOD OF CONTROLLING THE MOBILE TERMINAL - A method of controlling a mobile terminal and which includes displaying a first screen image on a touch screen of the mobile terminal as an idle background screen, receiving a touch and drag input operation being performed on the touch screen including the displayed first screen image, and displaying a second screen image corresponding to a direction of the touch and drag input operation, said second screen image corresponding to a new idle background screen.04-02-2009
20090276734Projection of Images onto Tangible User Interfaces - A surface computing device is described which has a surface which can be switched between transparent and diffuse states. When the surface is in its diffuse state, an image can be projected onto the surface and when the surface is in its transparent state, an image can be projected through the surface and onto an object. In an embodiment, the image projected onto the object is redirected onto a different face of the object, so as to provide an additional display surface or to augment the appearance of the object. In another embodiment, the image may be redirected onto another object.11-05-2009
20090288044ACCESSING A MENU UTILIZING A DRAG-OPERATION - Computer-readable media, computerized methods, and computer systems for intuitively invoking a presentation action (e.g., rendering a menu) by applying a drag-operation at a top-level control button rendered at a touchscreen display are provided. Initially, aspects of a user-initiated input applied at the top-level control button are detected. These aspects may include an actuation location and a distance of a drag-movement therefrom. If a distance of the drag-movement at the touchscreen display is greater than a threshold distance in a particular radial direction from the actuation location, the user-initiated input is considered a drag-operation. Typically, a set of trigger boundaries are constructed based on system metrics to assist in disambiguating the drag-operation from a tap-type operation. If a drag-operation is identified, the presentation action is invoked; otherwise, a principle action associated with the top-level control button (e.g., manipulating content of an application) may be invoked.11-19-2009
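The drag-versus-tap disambiguation in the entry above can be sketched in a few lines: measure how far the touch travels from the actuation location and compare against a threshold distance. The threshold value and function names here are illustrative assumptions, not details from the patent.

```python
import math

# Hypothetical threshold distance (in pixels); the patent derives its
# trigger boundaries from system metrics rather than a fixed constant.
THRESHOLD_PX = 16.0

def classify_input(actuation, release, threshold=THRESHOLD_PX):
    """Classify a touch as a 'drag' if it moved farther than the
    threshold distance from the actuation location, else a 'tap'."""
    dx = release[0] - actuation[0]
    dy = release[1] - actuation[1]
    distance = math.hypot(dx, dy)
    return "drag" if distance > threshold else "tap"
```

A caller would invoke the presentation action (e.g., render the menu) on `"drag"` and the button's principal action on `"tap"`.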
20110173576USER INTERFACE FOR AUGMENTED REALITY - The disclosed embodiments are directed to a method, an apparatus, and a user interface. The disclosed embodiments include acquiring an image, identifying one or more objects of interest in the image, and providing an indication that additional information is available for the one or more of the objects of interest without obscuring the one or more objects of interest.07-14-2011
20110173575METHOD AND DEVICE FOR INPUTTING TEXTS - There is disclosed a method for the detection of the selection of a character of a character string to be input from a character set on an input surface, wherein the selection of at least one character of the character string is detected by evaluating a direction vector of a gesture which is input on the input surface. There is also disclosed an input device for carrying out the method, especially a mobile terminal with a touch-sensitive input surface for selecting characters of a character string to be input, in which on the touch-sensitive input surface an input pattern with a number of characters from a character set can be displayed, whereby the input device comprises an evaluation unit, which detects the selection of at least one character of the character string by evaluating a direction vector of a gesture input with an input medium on the input surface.07-14-2011
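The direction-vector evaluation described in the text-input entry above can be sketched by mapping the angle of a stroke to one of several sectors, each associated with a character. The eight-character layout and sector count below are purely illustrative assumptions.

```python
import math

# Illustrative character layout: one character per 45-degree sector.
SECTOR_CHARS = ["e", "t", "a", "o", "i", "n", "s", "r"]

def select_char(start, end, chars=SECTOR_CHARS):
    """Select a character from the direction vector of a gesture
    drawn from `start` to `end` on the input surface."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    angle = math.atan2(dy, dx) % (2 * math.pi)
    # Offset by half a sector so each character's sector is centered
    # on its nominal direction.
    sector_width = 2 * math.pi / len(chars)
    sector = int((angle + sector_width / 2) / sector_width) % len(chars)
    return chars[sector]
```

A real input device would combine several such direction vectors into a character string rather than selecting one character per stroke.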
20110173574IN APPLICATION GESTURE INTERPRETATION - In a gesture-based system, gestures may control aspects of a computing environment or application, where the gestures may be derived from a user's position or movement in a physical space. A gesture-based system may have a plurality of modes, each mode a hardware configuration, a software configuration, or a combination thereof. Techniques for transitioning a user's control, via the user's gestures, between different modes enables a system to coordinate controls between multiple modes. For example, while a first mode is active, the user's gestures may control aspects of the first mode. The system may transition the user's control from a control of the first mode to a control of a second mode. The transition may be between hardware, software, or a combination thereof. In another embodiment, reserved gestures that correspond to a first mode that may be executed whether or not a second mode is present.07-14-2011
20110209104MULTI-SCREEN SYNCHRONOUS SLIDE GESTURE - Embodiments of a multi-screen synchronous slide gesture are described. In various embodiments, a first motion input is recognized at a first screen of a multi-screen system, and the first motion input is recognized when moving in a particular direction across the first screen. A second motion input is recognized at a second screen of the multi-screen system, where the second motion input is recognized when moving in the particular direction across the second screen and approximately when the first motion input is recognized. A synchronous slide gesture can then be determined from the recognized first and second motion inputs.08-25-2011
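The recognition logic above can be sketched as a predicate over two motion inputs: same direction, different screens, and timestamps within a small skew. The time window and input representation are assumptions for illustration only.

```python
# Assumed maximum skew between the two motion inputs, in seconds.
MAX_TIME_SKEW = 0.25

def is_synchronous_slide(input_a, input_b, max_skew=MAX_TIME_SKEW):
    """Determine a synchronous slide gesture from two motion inputs,
    each a dict with 'screen', 'direction', and 'timestamp' keys."""
    return (
        input_a["screen"] != input_b["screen"]
        and input_a["direction"] == input_b["direction"]
        and abs(input_a["timestamp"] - input_b["timestamp"]) <= max_skew
    )
```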
20110209103MULTI-SCREEN HOLD AND DRAG GESTURE - Embodiments of a multi-screen hold and drag gesture are described. In various embodiments, a hold input is recognized at a first screen of a multi-screen system when the hold input is held in place. A motion input is recognized at a second screen of the multi-screen system, and the motion input is recognized to select a displayed object while the hold input remains held in place. A hold and drag gesture can then be determined from the recognized hold and motion inputs.08-25-2011
20110209102MULTI-SCREEN DUAL TAP GESTURE - Embodiments of a multi-screen dual tap gesture are described. In various embodiments, a first tap input to a displayed object is recognized at a first screen of a multi-screen system. A second tap input to the displayed object is recognized at a second screen of the multi-screen system, and the second tap input is recognized approximately when the first tap input is recognized. A dual tap gesture can then be determined from the recognized first and second tap inputs.08-25-2011
20110209101MULTI-SCREEN PINCH-TO-POCKET GESTURE - Embodiments of a multi-screen pinch-to-pocket gesture are described. In various embodiments, a first motion input to a first screen region is recognized at a first screen of a multi-screen system, and the first motion input is recognized to select a displayed object. A second motion input to a second screen region is recognized at a second screen of the multi-screen system, and the second motion input is recognized to select the displayed object. A pinch-to-pocket gesture can then be determined from the recognized first and second motion inputs within the respective first and second screen regions, the pinch-to-pocket gesture effective to pocket the displayed object.08-25-2011
20110209099Page Manipulations Using On and Off-Screen Gestures - Bezel gestures for touch displays are described. In at least some embodiments, the bezel of a device is used to extend functionality that is accessible through the use of so-called bezel gestures. In at least some embodiments, off-screen motion can be used, by virtue of the bezel, to create screen input through a bezel gesture. Bezel gestures can include single-finger bezel gestures, multiple-finger/same-hand bezel gestures, and/or multiple-finger, different-hand bezel gestures.08-25-2011
20110209098On and Off-Screen Gesture Combinations - Bezel gestures for touch displays are described. In at least some embodiments, the bezel of a device is used to extend functionality that is accessible through the use of so-called bezel gestures. In at least some embodiments, off-screen motion can be used, by virtue of the bezel, to create screen input through a bezel gesture. Bezel gestures can include single-finger bezel gestures, multiple-finger/same-hand bezel gestures, and/or multiple-finger, different-hand bezel gestures.08-25-2011
20110209097Use of Bezel as an Input Mechanism - Bezel gestures for touch displays are described. In at least some embodiments, the bezel of a device is used to extend functionality that is accessible through the use of so-called bezel gestures. In at least some embodiments, off-screen motion can be used, by virtue of the bezel, to create screen input through a bezel gesture. Bezel gestures can include single-finger bezel gestures, multiple-finger/same-hand bezel gestures, and/or multiple-finger, different-hand bezel gestures.08-25-2011
20100131905Methods, Systems, and Products for Gesture-Activation - Methods, systems, and products are disclosed for operating home appliances using gesture recognition. A sequence of video images is received and compared to a stored sequence of gesture images. A gesture image is associated to an operation of an appliance.05-27-2010
20100005428INFORMATION PROCESSING APPARATUS AND METHOD FOR DISPLAYING AUXILIARY INFORMATION - There is provided an information processing apparatus, including a direction detection unit that detects a drawing direction of a locus drawn in an input process of a gesture when the gesture is input, a gesture search unit that searches for the gesture matching the drawing direction of the locus detected by the direction detection unit from among a plurality of predetermined gestures, and an auxiliary information display unit that displays a search result by the gesture search unit in a screen as auxiliary information each time the drawing direction of the locus is detected by the direction detection unit.01-07-2010
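The incremental gesture search above — narrowing candidates each time a new drawing direction is detected — can be sketched as prefix matching over direction sequences. The gesture table below is an illustrative assumption.

```python
# Hypothetical table of predetermined gestures, each defined by its
# sequence of drawing directions.
GESTURES = {
    "copy": ["down", "right"],
    "cut": ["down", "left"],
    "paste": ["up", "right"],
}

def matching_gestures(prefix, gestures=GESTURES):
    """Return the names of gestures whose direction sequence starts
    with the directions drawn so far, for display as auxiliary info."""
    return sorted(
        name for name, dirs in gestures.items()
        if dirs[: len(prefix)] == prefix
    )
```

Each time the direction detection unit reports a new direction, the narrowed result could be rendered on screen as the auxiliary information the entry describes.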
20100146458Operating System Providing Multi-Touch Support For Applications In A Mobile Device - An operating system providing multi-touch support for (user) applications in a mobile device. In one embodiment, a check of whether the touch screen (in the mobile device) has multi-touch capability is performed. A first interface with multi-touch capability is provided to the (user) applications if the touch screen has multi-touch capability and a second interface with single touch capability being provided if the touch screen does not have multi-touch capability. The first and second interfaces may be provided by corresponding device drivers loaded when the mobile device is initialized with the operating system. A device driver (providing the second interface) is also designed to perform the check and execute another device driver (providing the first interface) if the touch screen has multi-touch capability.06-10-2010
20100146459Apparatus and Method for Influencing Application Window Functionality Based on Characteristics of Touch Initiated User Interface Manipulations - Apparatuses and methods for presenting an application window on a touch sensitive screen of a mobile device, the application window configured to facilitate user interaction with an application and with a plurality of touch activatable items displayable in a predetermined manner within the application window. A first long tap is detected having a first predetermined duration within the application window and invoking, in response to detecting the first long tap, a first mode of the application window that enables a first type of behavior of one or more of the touch activatable items. At a touch sensitive screen location other than a location within the application window, a second long tap having a second predetermined duration is detected and, in response to the second long tap, a second mode is invoked that influences behavior of one or more of the touch activatable items within the application window.06-10-2010
20120297347GESTURE-BASED NAVIGATION CONTROL - A user interface may be provided by: displaying a graphical user interface including at least one graphical user interface element; receiving at least one gesture-based user input; displaying a graphical user interface including the at least one graphical user interface element and one or more graphical user interface elements that are hierarchically dependent from the at least one graphical user interface element in response to the at least one gesture-based user input.11-22-2012
20080244466SYSTEM AND METHOD FOR INTERFACING WITH INFORMATION ON A DISPLAY SCREEN - A technique for interfacing with graphical information on a display screen involves using a hand-held controller unit to collect image information that includes at least a portion of the display screen and using the image of the display screen to generate position data that is indicative of the position of the hand-held controller unit relative to the display screen. An action in a computer program related to the graphical information is then triggered in response to the position data and in response to a user input at the hand-held controller unit. Using this technique, a user can navigate a graphical user interface on a display screen with a hand-held controller unit without relying on beacon-based navigation.10-02-2008
20080244465COMMAND INPUT BY HAND GESTURES CAPTURED FROM CAMERA - A method and system for invoking an operation of a communication terminal in response to registering and interpreting a predetermined motion or pattern of an object. An input is received, the image data of the object is captured and the object in the image data is identified.10-02-2008
20080276203USER INTERFACE AND COOKING OVEN PROVIDED WITH SUCH USER INTERFACE - A user interface for domestic appliances, particularly for cooking ovens, comprises an input and a display for showing menus and/or items selected by the user through said input. The input comprises a selection zone where the user's finger can move, the display having at least a portion with a shape substantially corresponding to the shape of the selection zone and showing the result of the finger movement in terms of item or menu selection.11-06-2008
20090265671MOBILE DEVICES WITH MOTION GESTURE RECOGNITION - Mobile devices using motion gesture recognition. In one aspect, processing motion to control a portable electronic device includes receiving, on the device, sensed motion data derived from motion sensors of the device and based on device movement in space. The motion sensors include at least three rotational motion sensors and at least three accelerometers. A particular operating mode is determined to be active while the movement of the device occurs, the mode being one of multiple different operating modes of the device. Motion gesture(s) are recognized from the motion data from a set of motion gestures available for recognition in the active operating mode. Each of the different operating modes, when active, has a different set of gestures available. State(s) of the device are changed based on the recognized gestures, including changing output of a display screen on the device.10-22-2009
20090265670USER INTERFACE FOR A MOBILE DEVICE USING A USER'S GESTURE IN THE PROXIMITY OF AN ELECTRONIC DEVICE - An electronic device having a user interface on a display and method for controlling the device, the method including: detecting a proximity of an object to the display; detecting a two-dimensional motion pattern of the object; and controlling the user interface according to the detected two-dimensional motion pattern. Also, a method including: detecting an object in a space over a border between first and second zones of a plurality of touch-sensitive zones and outputting a detection signal; and simultaneously displaying first and second information elements corresponding to the first and second zones in response to the detection signal.10-22-2009
20090265669LANGUAGE INPUT INTERFACE ON A DEVICE - Methods, systems, devices, and apparatus, including computer program products, for inputting text. A user interface element is presented on a touch-sensitive display of a device. The user interface element is associated with a plurality of characters, at least a subset of which is associated with respective gestures. A user input performing a gesture with respect to the user interface element is received. The character from the subset that is associated with the gesture performed with respect to the user interface element is inputted.10-22-2009
20080282203GENERATING VECTOR GEOMETRY FROM RASTER INPUT FOR SEMI-AUTOMATIC LAND PLANNING - One embodiment of the invention includes a land planning tool that may be used to perform a variety of land planning tasks. The land planning tool may interpret global information systems (GIS) electronic data in conjunction with user-specified constraints to analyze and display a development site, visually indicating developable areas. The user may then use a pen-based device to sketch outlines of land planning objects. As the user sketches, the land planning tool may generate vector geometry stored in an electronic database for use by a variety of computer aided design tools.11-13-2008
20080282202GESTURED MOVEMENT OF OBJECT TO DISPLAY EDGE - The use of gestures to organize displayed objects on an interactive display. The gesture is used to move the displayed object to the edge of the interactive display so that the displayed object is only partially displayed after being moved. The size of the displayed object may be reduced and/or the displayed object may be rotated such that an identified portion of the displayed object remains in the display after moving. A gesture may also be used to move multiple displayed objects to the edge of the display.11-13-2008
20080288895Touch-Down Feed-Forward in 3D Touch Interaction - A 3-D display device in which zooming is controlled based on the distance that a user's finger is from the display screen, generates a virtual drop shadow of the user's finger at the detected X/Y position of the user's finger with respect to the display screen. The virtual drop shadow represents the center of the zooming of the display image. In addition, the size and darkness of the drop shadow are changed relative to the distance that the user's finger is from the display screen.11-20-2008
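The distance-to-shadow mapping above can be sketched as a simple function: a nearer finger yields a smaller, darker shadow. The linear mapping, the constants, and the maximum distance are assumptions for illustration, not values from the patent.

```python
# Assumed maximum sensing distance, in millimeters.
MAX_DISTANCE_MM = 100.0

def shadow_params(distance_mm, max_distance=MAX_DISTANCE_MM):
    """Map finger-to-screen distance to (radius_scale, opacity)
    for the virtual drop shadow rendered at the finger's X/Y position."""
    t = max(0.0, min(distance_mm / max_distance, 1.0))  # clamp to [0, 1]
    radius_scale = 0.5 + 0.5 * t   # shadow grows as the finger retreats
    opacity = 1.0 - 0.8 * t        # shadow fades as the finger retreats
    return radius_scale, opacity
```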
20120297348CONTROL OF A DEVICE USING GESTURES - In an operating system running on a processing device, detecting a gesture input via a user interface; identifying an operating system operation that corresponds to the gesture; performing the operating system operation; identifying an application running on the operating system that has subscribed to gesture input; and passing data corresponding to the gesture to the application for use by the application.11-22-2012
20120060127Automatic orientation of items on a touch screen display utilizing hand direction - Methods of orienting items on the display to a user based on the direction of the user's hand, or of the user, are disclosed. The methods rely on detection of the direction of the user's hand and orient the item at a selected orientation thereto. An aspect of the invention also includes methods of detecting a user's position about a touch screen display.03-08-2012
20080270950METHOD AND APPARATUS FOR IMPORTING DATA FROM AN APPLICATION INTO A SECOND APPLICATION - One embodiment of the present invention provides a system that automatically acquires data from an application and imports the data into a second application. During operation, the system receives at a data-acquisition tool a command from a user to acquire data from the application. In response to the command, the system overlays a semi-transparent layer over at least a portion of a display which is generated by the application, so that the data within the display is still visible to the user. Next, the system receives a drawing command from the user to draw a shape around an item of data within the display. In response to the drawing command, the system draws a shape around the item of data within the display, wherein the shape is drawn on the semi-transparent layer. The system then acquires the item of data bounded by the shape.10-30-2008
20100146464Architecture For Controlling A Computer Using Hand Gestures - Architecture for implementing a perceptual user interface. The architecture comprises alternative modalities for controlling computer application programs and manipulating on-screen objects through hand gestures or a combination of hand gestures and verbal commands. The perceptual user interface system includes a tracking component that detects object characteristics of at least one of a plurality of objects within a scene, and tracks the respective object. Detection of object characteristics is based at least in part upon image comparison of a plurality of images relative to a coarse mapping of the images. A seeding component iteratively seeds the tracking component with object hypotheses based upon the presence of the object characteristics and the image comparison. A filtering component selectively removes the tracked object from the object hypotheses and/or at least one object hypothesis from the set of object hypotheses based upon predetermined removal criteria.06-10-2010
20100146461ELECTRONIC APPARATUS AND DISPLAYING METHOD THEREOF - An electronic apparatus and a displaying method thereof are provided. The electronic apparatus includes a sensor which senses user information, and a controller which reads out item information based on the sensed user information, determines a display method of item content corresponding to the read-out item information, and controls the item content to be displayed in the determined display method. Accordingly, actual goods are displayed along with information regarding the actual goods so that the user can perceive a displayed image as actual goods and can easily obtain goods information.06-10-2010
20120198392ASSOCIATING DEVICES IN A MEDICAL ENVIRONMENT - A medical device includes a gesture detector for detecting a gesture of a second device with respect to the medical device. The gesture is detected within a small time window. The medical device also includes an association gesture determiner for determining that the gesture is an association gesture for initiating a request to associate the medical device with the second device, and a device associator for associating the medical device with the second device based on the association gesture.08-02-2012
20110271236DISPLAYING CONTENT ON A DISPLAY DEVICE11-03-2011
20100281440Detecting, Representing, and Interpreting Three-Space Input: Gestural Continuum Subsuming Freespace, Proximal, and Surface-Contact Modes - Systems and methods for detecting, representing, and interpreting three-space input are described. Embodiments of the system, in the context of an SOE, process low-level data from a plurality of sources of spatial tracking data and analyze these semantically uncorrelated spatiotemporal data and generate high-level gestural events according to dynamically configurable implicit and explicit gesture descriptions. The events produced are suitable for consumption by interactive systems, and the embodiments provide one or more mechanisms for controlling and effecting event distribution to these consumers. The embodiments further provide to the consumers of its events a facility for transforming gestural events among arbitrary spatial and semantic frames of reference.11-04-2010
20090164951Input architecture for devices with small input areas and executing multiple applications - A run time environment (e.g., operating system, device drivers, etc.) which translates a touch gesture representing one or more directions on a touch screen to a corresponding choice and indicates the same to a user application. As the choice depends merely on the direction(s) of movement of the touch, choices can be easily indicated for all applications executing in a device with small input areas.06-25-2009
20090138830Method, article, apparatus and computer system for inputting a graphical object - Method, article, apparatus and computer system facilitating the easy and intuitive inputting of a desired graphical object into an electronic system from a large plurality of predetermined graphical objects. In one example embodiment, this is achieved by assigning each of said graphical objects into one of a plurality of groups in accordance with a predetermined similarity criterion, associating respective base shapes to each of said groups, wherein said base shapes have a certain degree of similarity to the objects assigned to the associated group according to said similarity criterion, and associating in each of said groups at least one gesture to each of said graphical objects, so that the associated gestures are distinguishable from each other. In order to input the desired graphical object, one of the groups is selected by selecting its base shape and then the desired graphical object is identified by drawing the respective gesture associated thereto.05-28-2009
20090164952CONTROLLING AN OBJECT WITHIN AN ENVIRONMENT USING A POINTING DEVICE - The present invention is directed toward a system and process that controls a group of networked electronic components using a multimodal integration scheme in which inputs from a speech recognition subsystem, gesture recognition subsystem employing a wireless pointing device and pointing analysis subsystem also employing the pointing device, are combined to determine what component a user wants to control and what control action is desired. In this multimodal integration scheme, the desired action concerning an electronic component is decomposed into a command and a referent pair. The referent can be identified using the pointing device to identify the component by pointing at the component or an object associated with it, by using speech recognition, or both. The command may be specified by pressing a button on the pointing device, by a gesture performed with the pointing device, by a speech recognition event, or by any combination of these inputs.06-25-2009
20090144667Apparatus, method, computer program and user interface for enabling user input - An apparatus including a display for presenting text; a touch sensitive input device configured to enable a user to make a trace input via the display; and a processor, wherein the processor is configured to detect a first trace input that starts at a predetermined first location and extends across the touch sensitive input device to a second location wherein the processor is configured such that the detection of the first trace input actuates the deletion of the text presented between the predetermined first location and the second location.06-04-2009
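The trace-to-delete behavior above can be sketched by modeling the two trace endpoints as positions in the presented text and removing everything between them. Representing locations as character indices is an assumption made for illustration.

```python
def delete_traced(text, first_index, second_index):
    """Delete the text presented between the predetermined first
    location and the second location reached by the trace input."""
    lo, hi = sorted((first_index, second_index))
    return text[:lo] + text[hi:]
```

A real implementation would first hit-test the trace's start against the predetermined first location before actuating the deletion.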
20090178011GESTURE MOVIES - The display of gesture movies is disclosed to assist users in performing gestures. Gesture movies can be short, unintrusive, and available on demand. A list box can appear in a pop-up window or preference panel, containing a list of gestures that can be displayed. If a user clicks on a gesture in the list, a video, movie or animation of the gesture being performed appears in one box, and a video, movie or animation of the action being performed on a particular object is displayed in another box. Thus, a hand can be shown performing the selected gesture over a touch sensor panel, while at the same time, and synchronized with the gesture being displayed, an object being manipulated by the gesture is displayed.07-09-2009
20090178010Specifying Language and Other Preferences for Mobile Device Applications - A user interface for specifying a preference for content is displayed over the content on a display of a mobile device. Preferences (e.g., language preferences) can be specified for audio, closed captions, subtitles and any other features or operations associated with the mobile device. In one aspect, the user interface is a partially transparent sheet that at least partially overlies the content. The sheet can be navigated (e.g., scrolled) in response to input (e.g., touch input). In one aspect, the specified option is made a default option for at least some other applications running on the mobile device. In one aspect, the content is video which is automatically paused while the user interface is displayed.07-09-2009
20090007025USER-INTERFACE FEATURES FOR COMPUTERS WITH CONTACT-SENSITIVE DISPLAYS - Embodiments described herein provide for a portable computer with a contact-sensitive display having a user-interface that is configurable through user-contact with the display. An active input area may be provided that is configurable in appearance and functionality. The contents of the active input area, its functionality, and the manner in which it is oriented, particularly with respect to a left or right handedness, are described herein.01-01-2009
20090024965GRAPHICAL METHOD OF SEMANTIC ORIENTED MODEL ANALYSIS AND TRANSFORMATION DESIGN - The user is given a new modeling capability, a Semantic Lasso, which allows grouping of model elements so they can be mapped or transformed to the high-level concepts of a business ontology. Such grouping is done by drawing a line contour around the model elements relevant to the advertised high-level concept on one of the OMG Unified Modeling Language (UML) class diagrams. In addition, the user is given a capability to specify extension points of the high-level concepts and the projection of such extension points to the individual model. Another tooling capability, a Semantic Transformation Lens, allows dynamic graphical projection of the individual model fragments to the high-level concepts as those elements are being selected. The Semantic Transformation Lens provides a mechanism of reasoning-based smart selection.01-22-2009
20110145768Device, Method, and Graphical User Interface for Managing User Interface Content and User Interface Elements - Computing devices and methods for managing user interface content and user interface elements are disclosed. In one embodiment, after a plurality of user interface elements is selected from an ordered list, wherein a selection order is maintained for the selected plurality of user interface elements: a user gesture to perform an operation on the plurality of user interface elements is detected, and in response, a stack of temporarily displayed thumbnails corresponding to the selected plurality of user interface elements is displayed, wherein a display order of the stack of temporarily displayed thumbnails corresponds to the selection order of the selected plurality of user interface elements.06-16-2011
20120079435INTERACTIVE PRESENTATION CONTROL SYSTEM - An interactive presentation control system includes a computer, a projection screen, a computer-controlled projector, and a presentation control device. The projector projects images generated by the computer onto the projection screen. The presentation control device includes a storage unit, an image capturing unit, and a processing unit. The storage unit stores a control table recording the relationship between gesture shadow patterns and control commands. Each gesture shadow pattern corresponds to one control command. The image capturing unit captures an image of the projection screen periodically. The processing unit processes the captured image to determine whether the captured image includes a gesture shadow pattern, and determines the corresponding control command if the determined gesture shadow pattern is recorded in the control table. The processing unit further generates a control signal corresponding to the determined control command, and transmits the control signal to the computer through the communication unit.03-29-2012
20080263479Touchless Manipulation of an Image - The invention relates to a method of providing touchless manipulation of an image through a touchless input device.10-23-2008
20090100383PREDICTIVE GESTURING IN GRAPHICAL USER INTERFACE - A computing system. The computing system includes a display presenting a user interface, and a gesture input configured to translate a user gesture into a command for controlling the computing system. The computing system also includes a gesture-predicting engine to predict a plurality of possible commands based on the beginning of the user gesture, and a rendering engine to indicate the plurality of possible commands via the user interface.04-16-2009
20090100384VARIABLE DEVICE GRAPHICAL USER INTERFACE - Methods, systems, devices, and apparatus, including computer program products, for adjusting a graphical user interface. A motion of a device is detected. A graphical user interface of the device is adjusted in response to the detected motion.04-16-2009
20090064055Application Menu User Interface - Methods, systems, and apparatus, including computer program products, for presenting user interface elements. A first page of one or more user interface elements is presented on a touch-sensitive display. Each of the user interface elements corresponds to a respective application. A gesture performed on the touch-sensitive display is detected. In response to the detected gesture, a second page of one or more user interface elements is presented on the touch-sensitive display.03-05-2009
20110225553Use Of Standalone Mobile Devices To Extend HID Capabilities Of Computer Systems - A mobile device is adapted so that its HID functionality may be used to control an associated computer GUI. The computer may also be used to extend the HID capabilities of the mobile device.09-15-2011
20120144348Managing Virtual Ports - Techniques for managing virtual ports are disclosed herein. Each such virtual port may have different associated features such as, for example, privileges, rights or options. When one or more users are in a capture scene of a gesture based system, the system may associate virtual ports with the users and maintain the virtual ports. Also provided are techniques for disassociating virtual ports from users or swapping virtual ports between two or more users.06-07-2012
20120144346MOTION-BASED USER INTERFACE FEATURE SUBSETS - The present disclosure relates to motion adaptive user equipment (UE) in a wireless communications network environment adapted for selecting one or more subsets of accessible user interface (UI) functions based, at least in part, on the determined motion of a UE. By selectively employing the one or more subsets of UI functions, a UE can dynamically adapt to the motion of the UE and limit distracting interactions with the UE, creating a safer or more compliant wireless environment. Further disclosed are features related to override of UI limitations, auxiliary location sensing aspects, motion rule updating features, and voluntary and involuntary user preference aspects.06-07-2012
20120144345Methods and Systems for Radial Input Gestures - A computerized device can comprise a touch-enabled surface and a data processing hardware element configured by a gesture input engine to recognize an input gesture using data from the touch-enabled surface. A parameter value can be set based on determining a path traversed by the input gesture. The data processing element can comprise a processor and the gesture input engine can comprise program logic in a memory device and/or the engine may be implemented using hardware logic. Regardless, the radial input gesture can be used to set one or more parameter values without use of a direct mapping of interface coordinates to parameter values. A method can comprise tracking a plurality of input points, identifying a path defined by the plurality of input points, identifying a closed curve including the path, determining a percentage of the curve traversed by the path, and setting a parameter value based on the percentage.06-07-2012
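The path-percentage idea in the abstract above can be sketched by accumulating the signed angle the touch path sweeps around the curve's center; the fraction of a full turn, rather than any direct coordinate-to-value mapping, selects the parameter. A minimal sketch, assuming a circular closed curve, a known center, and a linear value range; all names are illustrative:

```python
import math

def radial_parameter(points, center, lo=0.0, hi=100.0):
    """Map an ordered list of (x, y) touch samples to a value in
    [lo, hi] according to the fraction of the circle the path sweeps
    around `center`. Sketch only; the real engine need not assume a
    circle or a linear mapping."""
    angles = [math.atan2(y - center[1], x - center[0]) for x, y in points]
    total = 0.0
    for a0, a1 in zip(angles, angles[1:]):
        d = a1 - a0
        # unwrap so crossing the -pi/pi seam never counts as ~2*pi
        if d > math.pi:
            d -= 2 * math.pi
        elif d < -math.pi:
            d += 2 * math.pi
        total += d
    fraction = min(abs(total) / (2 * math.pi), 1.0)
    return lo + fraction * (hi - lo)
```

Because only the swept fraction matters, the same gesture works at any radius from the center, which is the decoupling from interface coordinates the abstract describes.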
20090249258Simple Motion Based Input System - One embodiment is a programmable device embodying a program of executable instructions to perform steps including assigning multiple tasks or symbols to each of a number of motion groups; segmenting motion data from sensor(s); matching the segments to motion groups; and composing and then selecting task(s) or symbol sequence(s) from the task(s) and/or symbol(s) assigned to the matched motion groups.10-01-2009
20090254869MULTI-PARAMETER EXTRACTION ALGORITHMS FOR TACTILE IMAGES FROM USER INTERFACE TACTILE SENSOR ARRAYS - A user interface employs a tactile sensor array producing a rich flux of independently-adjustable interactive control parameters, rates of change, and symbols derived from these as well as tactile shapes, patterns, gestures, syntaxes, and phrases from each of one or more regions of contact or proximity. The tactile sensor array may comprise a pressure sensor array, proximity sensor array, or other sensor such as a video camera. The user interface derives up to six independently-adjustable interactive real-time control parameters plus rates and symbols from a single finger tip. Simple running sums are employed during scans so that individual sensor measurements need not be stored. The user interface supports multiple regions of contact or proximity wherein at least one of the regions has a non-convex shape. In addition, the tactile sensor array may be partitioned into sections or modules each with a separate scanning loop and/or processor.10-08-2009
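The running-sums idea in the abstract above can be sketched as follows. A minimal sketch, assuming a pressure-weighted centroid as one illustrative parameter (the filing derives up to six); the function name and array layout are assumptions, not taken from the patent:

```python
def scan_centroid(rows):
    """Scan a tactile sensor array row by row, keeping only running
    sums so that no individual sensor measurement has to be stored.
    Returns total activation and the pressure-weighted centroid.
    Illustrative sketch, not the patented parameter set."""
    total = sx = sy = 0.0
    for y, row in enumerate(rows):
        for x, p in enumerate(row):
            total += p       # running sum of pressure
            sx += p * x      # running sum weighted by column
            sy += p * y      # running sum weighted by row
    if total == 0:
        return 0.0, None
    return total, (sx / total, sy / total)
```

Because only three accumulators survive the scan, the per-cell readings can be discarded as they arrive, which is the point the abstract makes about not storing individual measurements.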
20130219346Input to Locked Computing Device - The subject matter of this specification can be embodied in, among other things, a method that includes receiving at a computing device that is in a locked state, one or more user inputs to unlock the device and to execute at least one command that is different from a command for unlocking the device. The method further includes executing in response to the user inputs to unlock the device an unlocking operation by the device to convert the device from a locked state to an unlocked state. The method further includes executing the at least one command in response to receiving the user inputs to execute the at least one command. The at least one command executes so that results of executing the at least one command are first displayed on the device to a user automatically after the device changes from the locked state to the unlocked state.08-22-2013
20090254868TRANSLATION OF GESTURE RESPONSES IN A VIRTUAL WORLD - Gestures made by a first avatar to a second avatar in a virtual world are translated by receiving an input from a first user representing an input gesture to be made by the first avatar to the second avatar. The input gesture is translated to generate at least one translated gesture for display. The translated gesture may be output for display as being made by the first avatar to the second avatar.10-08-2009
20080313575SYSTEM AND PROCESS FOR CONTROLLING ELECTRONIC COMPONENTS IN A UBIQUITOUS COMPUTING ENVIRONMENT USING MULTIMODAL INTEGRATION - The present invention is directed toward a system and process that controls a group of networked electronic components using a multimodal integration scheme in which inputs from a speech recognition subsystem, gesture recognition subsystem employing a wireless pointing device and pointing analysis subsystem also employing the pointing device, are combined to determine what component a user wants to control and what control action is desired. In this multimodal integration scheme, the desired action concerning an electronic component is decomposed into a command and a referent pair. The referent can be identified using the pointing device to identify the component by pointing at the component or an object associated with it, by using speech recognition, or both. The command may be specified by pressing a button on the pointing device, by a gesture performed with the pointing device, by a speech recognition event, or by any combination of these inputs.12-18-2008
20100162179Method and Apparatus for Adding or Deleting at Least One Item Based at Least in Part on a Movement - In accordance with an example embodiment of the present invention, an apparatus comprising a user interface configured to detect a first touch, detect a second touch, and detect a movement from the first touch or the second touch. The apparatus further comprises a processor configured to delete or add at least one item based at least in part on the movement.06-24-2010
20100162178Apparatus, method, computer program and user interface for enabling user input - An apparatus, method, computer program and user interface the apparatus including: a display configured to display a first item;06-24-2010
20100162181Interpreting Gesture Input Including Introduction Or Removal Of A Point Of Contact While A Gesture Is In Progress - A touch-sensitive device accepts single-touch and multi-touch input representing gestures, and is able to change a parameter of a gesture responsive to introduction or removal of a point of contact while the gesture is in progress. The operation associated with the gesture, such as a manipulation of an on-screen object, changes in a predictable manner if the user introduces or removes a contact point while the gesture is in progress. The overall nature of the operation being performed does not change, but a parameter of the operation can change. In various embodiments, each time a contact point is added or removed, the system and method of the present invention resets the relationship between the contact point locations and the operation being performed, in such a manner as to avoid or minimize discontinuities in the operation. In this manner, the invention avoids sudden or unpredictable changes to an object being manipulated.06-24-2010
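The reset-on-contact-change behavior described above can be sketched for a simple drag: the object follows the centroid of the contact points, and whenever a finger is added or removed the reference centroid is re-anchored so the object does not jump. A minimal sketch under those assumptions, not the patented implementation:

```python
class DragTracker:
    """Drag an object by the centroid of the current touch points.

    When the number of contacts changes, the reference centroid is
    reset to the new centroid, so the tracked position is continuous
    across finger lifts and touches (illustrative sketch)."""

    def __init__(self, object_pos):
        self.object_pos = object_pos
        self.ref = None
        self.count = 0

    @staticmethod
    def _centroid(contacts):
        xs = [x for x, _ in contacts]
        ys = [y for _, y in contacts]
        return (sum(xs) / len(xs), sum(ys) / len(ys))

    def update(self, contacts):
        c = self._centroid(contacts)
        if len(contacts) != self.count or self.ref is None:
            # contact set changed: re-anchor instead of jumping
            self.ref = c
            self.count = len(contacts)
        dx, dy = c[0] - self.ref[0], c[1] - self.ref[1]
        self.object_pos = (self.object_pos[0] + dx, self.object_pos[1] + dy)
        self.ref = c
        return self.object_pos
```

The same re-anchoring idea extends to scale and rotation parameters; only the translation case is shown here.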
20100162177INTERACTIVE ENTERTAINMENT SYSTEM AND METHOD OF OPERATION THEREOF - An interactive entertainment system comprises a plurality of devices providing an ambient environment, gesture detection means for detecting a gesture of a user, and control means for receiving an output from the gesture detection means and for communicating with at least one device. The control means is arranged to derive from the output a location in the ambient environment and to change the operation of one or more devices in the determined location, according to the output of the gesture detection means.06-24-2010
20100162182METHOD AND APPARATUS FOR UNLOCKING ELECTRONIC APPLIANCE - An unlocking method and apparatus for an electronic appliance are disclosed. The method and apparatus may enable a user to unlock the electronic appliance by identifying a gesture and to invoke a function mapped to the gesture. The unlocking method includes detecting a preset gesture input when an input means is locked. The method includes unlocking the input means in response to the input gesture. The method also includes invoking an application mapped to the input gesture in response to unlocking.06-24-2010
20100192109Detecting and Interpreting Real-World and Security Gestures on Touch and Hover Sensitive Devices - “Real-world” gestures such as hand or finger movements/orientations that are generally recognized to mean certain things (e.g., an “OK” hand signal generally indicates an affirmative response) can be interpreted by a touch or hover sensitive device to more efficiently and accurately effect intended operations. These gestures can include, but are not limited to, “OK gestures,” “grasp everything gestures,” “stamp of approval gestures,” “circle select gestures,” “X to delete gestures,” “knock to inquire gestures,” “hitchhiker directional gestures,” and “shape gestures.” In addition, gestures can be used to provide identification and allow or deny access to applications, files, and the like.07-29-2010
20100192108METHOD FOR RECOGNIZING GESTURES ON LIQUID CRYSTAL DISPLAY APPARATUS WITH TOUCH INPUT FUNCTION - One aspect of the present invention discloses a method for detecting gestures on a liquid crystal display apparatus with touch input functions, wherein the liquid crystal display apparatus includes a display region and a button region. The method comprises the steps of checking whether an object touches a first region of the liquid crystal display apparatus, checking whether the object slides into a second region of the liquid crystal display apparatus after touching the first region, and sending a gesture signal to perform a predetermined function if the object slides between the display region and button region.07-29-2010
20100153890Method, Apparatus and Computer Program Product for Providing a Predictive Model for Drawing Using Touch Screen Devices - An apparatus for providing a predictive model for use with touch screen devices may include a processor. The processor may be configured to identify a stroke event received at a touch screen display, evaluate an environmental parameter corresponding to the touch screen display to determine a scenario based on the environmental parameter, and generate a graphic output corresponding to the identified stroke event for the scenario determined. A corresponding method and computer program product are also provided.06-17-2010
20100180237FUNCTIONALITY SWITCHING IN POINTER INPUT DEVICES - Various embodiments for switching functionality of a graphical user interface (GUI) pointer input device are provided. A first gesture pattern is configured. The first gesture pattern, when performed, enables a predetermined function of the input device. The predetermined function substitutes for a default function of the input device. The enabling of the predetermined function is indicated to a user on the GUI. A second gesture pattern is configured. The second gesture pattern, when performed, cancels the predetermined function of the input device and enables the default function.07-15-2010
20100162180GESTURE-BASED NAVIGATION - A method includes detecting an area of a touch screen that is touched by an instrument and determining a gesture corresponding to the area touched. The method further includes performing a crossover operation when it is determined that the gesture corresponds to a crossover gesture and displaying on the touch screen a content that includes a first child content and a second child content that is associated with the crossover operation, where the crossover operation includes navigating from the first child content to the second child content in response to the crossover gesture. The first child content is accessible via a first parent content and the second child content is accessible via a second parent content, and when navigating from the first child content to the second child content, the first parent content and the second parent content are not displayed.06-24-2010
20100218144Method and Apparatus for Displaying Additional Information Items - An apparatus that may include a processor configured to receive a touch input associated with a first information item, determine a first set of at least one additional information item associated with said first information item, based at least in part on said touch input, generate a first visual representation based at least in part on said first set, and display said first visual representation is disclosed. A corresponding method and computer-readable medium are also disclosed.08-26-2010
20100251189Using gesture objects to replace menus for computer control - The present invention generally comprises a computer control environment that builds on the Blackspace™ software system to provide further functionality and flexibility in directing a computer. It employs graphic inputs drawn by a user and known as gestures to replace and supplant the pop-up and pull-down menus known in the prior art.09-30-2010
20100146463WATCH PHONE AND METHOD FOR HANDLING AN INCOMING CALL IN THE WATCH PHONE - A watch phone and a method for handling an incoming call using the watch phone are provided. In the watch phone, a display device includes a touch screen panel and a display, turns off the touch screen panel in a watch mode, turns on the touch screen panel in an idle mode or upon receipt of an incoming call, and displays at least two areas for call connection and call rejection, upon receipt of the incoming call. A single mode selection key selects one of the watch mode and the idle mode. A controller performs control operations so that the touch screen panel is turned off in the watch mode and is turned on in the idle mode or upon receipt of the incoming call, and connects or rejects the incoming call, when the at least two areas for call connection or call rejection, which are displayed upon receipt of the incoming call, are pointed to or dragged to.06-10-2010
20100235793Methods and Graphical User Interfaces for Editing on a Multifunction Device with a Touch Screen Display - In some embodiments, a device displays content on a touch screen display and detects input by finger gestures. In response to the finger gestures, the device selects content, visually distinguishes the selected content, and/or updates the selected content based on detected input. In some embodiments, the device displays a command display area that includes one or more command icons; detects activation of a command icon in the command display area; and, in response to detecting activation of the command icon in the command display area, performs a corresponding action with respect to the selected content. Exemplary actions include cutting, copying, and pasting content.09-16-2010
20100211920Detecting and Interpreting Real-World and Security Gestures on Touch and Hover Sensitive Devices - “Real-world” gestures such as hand or finger movements/orientations that are generally recognized to mean certain things (e.g., an “OK” hand signal generally indicates an affirmative response) can be interpreted by a touch or hover sensitive device to more efficiently and accurately effect intended operations. These gestures can include, but are not limited to, “OK gestures,” “grasp everything gestures,” “stamp of approval gestures,” “circle select gestures,” “X to delete gestures,” “knock to inquire gestures,” “hitchhiker directional gestures,” and “shape gestures.” In addition, gestures can be used to provide identification and allow or deny access to applications, files, and the like.08-19-2010
20100211919RENDERING OBJECT ICONS ASSOCIATED WITH A FIRST OBJECT ICON UPON DETECTING FINGERS MOVING APART - A method comprises detecting two fingers touching a first object icon on a touch sensitive display and then moving in generally opposing directions. The first object icon is associated with one or more constituent elements. In response to such detecting, the method causes additional object icons to appear on the display. Each additional object icon represents a constituent element of the first object icon.08-19-2010
20100241999Canvas Manipulation Using 3D Spatial Gestures - User interface manipulation using three-dimensional (3D) spatial gestures may be provided. A two-dimensional (2D) user interface (UI) representation may be displayed. A first gesture may be performed, and, in response to the first gesture's detection, the 2D UI representation may be converted into a 3D UI representation. A second gesture may then be performed, and, in response to the second gesture's detection, the 3D UI representation may be manipulated. Finally, a third gesture may be performed, and, in response to the third gesture's detection, the 3D UI representation may be converted back into the 2D UI representation.09-23-2010
20100235794Accelerated Scrolling for a Multifunction Device - A computer-implemented method is performed at a multifunction device with a display and a touch-sensitive surface. The method includes detecting multiple input gestures by a user, beginning with an initial input gesture. For each input gesture after the initial input gesture, the method scrolls information on the display at a respective scrolling speed. The respective scrolling speed is determined based on the respective input gesture movement speed in the input gesture and a movement multiplier. The method determines whether the respective input gesture meets one or more swipe gesture criteria, and determines whether the respective input gesture meets one or more successive gesture criteria. When the input gesture meets the one or more swipe gesture criteria and the one or more successive gesture criteria, the method updates the movement multiplier in accordance with one or more movement multiplier adjustment criteria.09-16-2010
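The multiplier mechanics described above can be sketched as follows; the step size, cap, and reset-to-one policy are illustrative guesses, not values from the filing:

```python
def scroll_speed(gesture_speed, multiplier):
    """Scrolling speed = gesture movement speed times the multiplier."""
    return gesture_speed * multiplier

def update_multiplier(multiplier, is_swipe, is_successive,
                      step=1.0, cap=10.0):
    """Grow the multiplier only when a gesture meets both the swipe
    criteria and the successive-gesture criteria (e.g. it came soon
    after the previous one); otherwise fall back to 1x. The step,
    cap, and reset policy are assumptions for this sketch."""
    if is_swipe and is_successive:
        return min(multiplier + step, cap)
    return 1.0
```

Repeated qualifying swipes thus compound into faster and faster scrolling, while a pause or a non-swipe gesture returns the device to normal speed.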
20100211918Web Cam Based User Interaction - This document describes tools for inputting data into a computer via the movement of features of a user as detected by a webcam or other input device. This is accomplished by a user moving his or her features in view of a webcam. The webcam or other input device then detects the presence and motion of the feature(s) and converts these motions into input signals to execute predetermined input instructions.08-19-2010
20100138798System and method for executing a game process - A 3-D imaging system for recognition and interpretation of gestures to control a computer. The system includes a 3-D imaging system that performs gesture recognition and interpretation based on a previous mapping of a plurality of hand poses and orientations to user commands for a given user. When the user is identified to the system, the imaging system images gestures presented by the user, performs a lookup for the user command associated with the captured image(s), and executes the user command(s) to effect control of the computer, programs, and connected devices.06-03-2010
20100100854GESTURE OPERATION INPUT SYSTEM - A gesture operation input system includes one or more subsystems to receive an input indicating a modifier input, receive a gesture input, wherein the gesture input indicates an action to be performed, and receive an indication that the modifier input is no longer being received. After receiving the gesture input, the gesture operation input system then determines the action to be performed using the gesture input and performs the action.04-22-2010
20090327978Hand-Held Device and Method for Operating a Single Pointer Touch Sensitive User Interface - A hand-held device and method for operating a single pointer touch sensitive user interface of a hand-held electronic device are provided. The method includes defining as being active a first one of a set of two or more controllable interface functions including at least a first controllable interface function and a second controllable interface function. A control gesture is then detected and the control gesture is associated with the active one of the set of two or more controllable interface functions, where the detected pattern adjusts the performance of the active controllable interface function. A transition gesture is then detected including a pointer pattern movement, which is not included as a control gesture for any of the two or more controllable interface functions, where upon detection of the transition gesture, the transition gesture defines a second one of the set of two or more controllable interface functions as being the active one of the set of two or more controllable interface functions. A further control gesture is then detected and the control gesture is associated with the active one of the set of two or more controllable interface functions, where the detected pattern adjusts the performance of the active controllable interface function.12-31-2009
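The control-versus-transition split above is essentially a small state machine: a transition gesture changes which interface function is active, and control gestures adjust only that function. A minimal sketch; the gesture names and the unit adjustment step are assumptions for illustration:

```python
class SinglePointerUI:
    """One active controllable function at a time; a transition
    gesture cycles to the next function, and control gestures adjust
    the active one. Illustrative sketch of the scheme, not the
    patented device."""

    def __init__(self, functions):
        self.values = {f: 0 for f in functions}
        self.order = list(functions)
        self.active = self.order[0]

    def gesture(self, kind):
        if kind == 'transition':   # e.g. a pointer pattern reserved for switching
            i = self.order.index(self.active)
            self.active = self.order[(i + 1) % len(self.order)]
        elif kind == 'increase':
            self.values[self.active] += 1
        elif kind == 'decrease':
            self.values[self.active] -= 1
        return self.active, self.values[self.active]
```

Because the transition pattern is excluded from every function's control-gesture set, the device can always tell a switch from an adjustment.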
20090327976Portable Device, Method, and Graphical User Interface for Displaying a Portion of an Electronic Document on a Touch Screen Display - A computer-implemented method for use in conjunction with a computing device with a touch screen display comprises displaying a portion of a web page in a web browser application without concurrently displaying a Uniform Resource Locator (URL) entry area for inputting URLs of web pages. A gesture is detected in a predefined area at the top of the touch screen display. In response to detecting the gesture in the predefined area at the top of the touch screen display, the URL entry area is displayed.12-31-2009
20100037185INPUT METHOD FOR COMMUNICATION DEVICE - An input method for a communication device includes the steps of displaying a result box and a dial ring comprising a plurality of buttons on a touch screen of the communication device; receiving a touch and slide operation on a button; recognizing the touch and slide operation and rotating the dial ring according to the slide operation; and displaying information represented by the touched button in the result box when the touched button rotates to a predetermined position. People who are used to rotary dial phones can use the input method without difficulty.02-11-2010
20100269072USER INTERFACE DEVICE, USER INTERFACE METHOD, AND RECORDING MEDIUM - A user interface device.10-21-2010
20090282371INTEGRATION SYSTEM FOR MEDICAL INSTRUMENTS WITH REMOTE CONTROL - An integration system for medical instruments is described. In various embodiments, the integration system is useful for managing information from, and controlling, multiple medical instruments in a medical facility, as well as providing high fidelity audio communications between members of a clinical team. The system can be operated remotely in a sterile environment using gesture-based control and/or voice-recognition control. The system can synchronously record combined data (instrument data, clinical data, system data, and video and audio signals) from a surgical procedure, as the data would be perceived during the procedure, in a central database. The recorded data can be retrieved and reviewed for instructional, diagnostic or analytical purposes.11-12-2009
20090327977INTERACTIVE CONTROL DEVICE AND METHOD FOR OPERATING THE INTERACTIVE CONTROL DEVICE - An interactive control device includes a display device, and a method is for operating the interactive control device. The method includes: displaying graphical information on the display device; receiving sensor information; activating a control action if on the basis of the sensor information it is ascertained that a body part of a user is located within an activation region that is spatially defined relative to a display region of a control element on the display device with which the control action is associated; the received sensor information including user information that is evaluated prior to an activation of the control action in order to ascertain a control intention for the at least one control element; and the information represented on the display device being adapted as a function of the ascertained control intention such that the at least one control element is represented in a manner optimized for the activation of the control action associated with the control element. The control device may be arranged as a component of a motor vehicle console so as to be able to implement the control method.12-31-2009
20090327975Multi-Touch Sorting Gesture - A method and apparatus are provided for recognizing multi-touch gestures on a touch sensitive display. A plurality of graphical objects is displayed within a user interface (UI) of a display screen operable to receive touch input. A first touch input exceeding a first time duration is detected over a first graphical object. A touch-and-hold gesture action is generated, which is then applied to the first graphical object. A second touch input is then detected over a second graphical object and a touch-select gesture action is generated, which is then applied to the second graphical object. The first and second gestures are processed to determine an associated operation, which is then performed on the second graphical object.12-31-2009
20090327974USER INTERFACE FOR GESTURAL CONTROL - A UI (user interface) for gestural control enhances the navigation experience for the user by preventing multiple gestures from being inadvertently invoked at the same time. This problem is overcome by establishing two or more categories of gestures. For instance, the first category of gestures may include gestures that are likely to be invoked before gestures that are included in the second category of gestures. That is, gestures in the second category will typically be invoked after a gesture in the first category has already been invoked. One example of a gesture that falls into the first category may be a gesture that initiates operation of a device, whereas a gesture that falls into the second category may be a change in volume. Gestures that fall into the second category require more criteria to be satisfied in order to be invoked than gestures that fall into the first category.12-31-2009
20110066985Systems, Methods, and Mobile Devices for Providing a User Interface to Facilitate Access to Prepaid Wireless Account Information - A method for operating a mobile device includes, in response to receiving a swipe gesture via a user interface of the mobile device, displaying a balance of a prepaid wireless service account on a display of the mobile device. The balance may be displayed in a currency such as monetary currency or a proprietary currency such as minutes or credits provided by a wireless service provider. The method may also include displaying an expiration time to identify the time remaining until the balance expires. The method may also include generating a balance request including a request for the balance of the prepaid wireless service account from a prepaid billing system in response to receiving the swipe gesture, transmitting the balance request to the prepaid billing system, and receiving the balance in response to the balance request.03-17-2011
20110066984Gesture Recognition on Computing Device - A computer-implemented user interface method is disclosed. The method includes displaying information on a touchscreen of a computing device, receiving from a user of the device an input drawn on the touchscreen, correlating the input to a template, where the correlating includes employing a closed-form solution to find a rotation that reduces angular distance between the input and the template, and providing output based on a result of the correlating.03-17-2011
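The closed-form rotation step above resembles template matchers of the "Protractor" family. A minimal sketch, assuming the input and template are resampled to equal-length point lists translated so their centroids sit at the origin; function names are illustrative:

```python
import math

def optimal_rotation(points, template):
    """Closed-form angle that best aligns `points` with `template`
    (both equal-length lists of origin-centered (x, y) tuples), i.e.
    the rotation reducing their angular distance."""
    a = sum(px * tx + py * ty for (px, py), (tx, ty) in zip(points, template))
    b = sum(px * ty - py * tx for (px, py), (tx, ty) in zip(points, template))
    return math.atan2(b, a)

def similarity(points, template):
    """Cosine similarity between the template and the optimally
    rotated input; 1.0 is a perfect match."""
    t = optimal_rotation(points, template)
    rot = [(x * math.cos(t) - y * math.sin(t),
            x * math.sin(t) + y * math.cos(t)) for x, y in points]
    dot = sum(px * tx + py * ty for (px, py), (tx, ty) in zip(rot, template))
    n_rot = math.sqrt(sum(x * x + y * y for x, y in rot))
    n_tpl = math.sqrt(sum(x * x + y * y for x, y in template))
    return dot / (n_rot * n_tpl)
```

The closed form avoids the iterative angle search used by earlier recognizers, which is what makes this style of matcher cheap enough for per-stroke use on a touchscreen device.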
20090113355METHOD AND APPARATUS FOR CONTROLLING MULTI-TASKING OPERATION FOR TERMINAL DEVICE PROVIDED WITH TOUCH SCREEN - A method of controlling a terminal device includes executing a first function on the terminal device, displaying at least one function icon for executing at least one second function that is different than the first function being executed, and selectively executing the second function simultaneously with the first function when said at least one function icon is selected.04-30-2009
20090113354BROADCAST RECEIVING APPARATUS AND CONTROL METHOD THEREOF - A broadcast receiving apparatus wherein a user is able to control a graphical user interface (GUI) using a pointing device, and a control method thereof are provided. The broadcast receiving apparatus receives a movement pattern from a pointing device, and operates a function corresponding to the pattern. The broadcast receiving apparatus conveniently switches a pointing device between a position mode and a step mode, without having a key for switching between the position mode and the step mode or direction keys.04-30-2009
20080288896Method And System For Attention-Free User Input On A Computing Device - A method and system for attention-free user input on a computing device is described that allows the recognition of a user input irrespective of the area of entry of the user input on a writing surface (such as a digitizer) without the user having to make a visual contact with the writing surface.11-20-2008
20100223582SYSTEM AND METHOD FOR ANALYZING MOVEMENTS OF AN ELECTRONIC DEVICE USING ROTATIONAL MOVEMENT DATA - The disclosure relates to a system and method for analyzing movements of a handheld electronic device. The system comprises: memory; a microprocessor; a first module to generate movement data responsive to movements of the device, such as rotational movements; a second module providing instructions to the microprocessor to map the movement data against symbols representing an input movement string and store the string representation in the memory; and a third module. The third module provides instructions to the microprocessor to analyze data relating to the string representation against data relating to a gesture string representing a gesture related to a command for the device to determine if the gesture has been imparted on the device; and if the string representation sufficiently matches the gesture string, executes a command associated with the gesture on the device.09-02-2010
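The symbol-mapping and string-matching pipeline above can be sketched in two steps: quantize rotational samples into a small alphabet, then test whether the resulting movement string "sufficiently matches" a stored gesture string. The alphabet, threshold, and tolerance rule here are assumptions for illustration, not taken from the filing:

```python
def quantize(samples, step=15.0):
    """Map rotational movement samples (e.g. degrees about one axis)
    to symbols: 'P' positive, 'N' negative, '.' below the threshold.
    Alphabet and threshold are illustrative."""
    return ''.join('P' if d >= step else 'N' if d <= -step else '.'
                   for d in samples)

def matches(movement, gesture, max_errors=1):
    """'Sufficiently matches' modeled as a same-length comparison
    tolerating a few symbol mismatches; a real device might instead
    use edit distance or another similarity measure."""
    if len(movement) != len(gesture):
        return False
    return sum(a != b for a, b in zip(movement, gesture)) <= max_errors
```

A matched gesture string would then be looked up in a table mapping gestures to device commands, completing the third module's job.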
20120144347DISPLAY DEVICE AND CONTROL METHOD THEREOF - A display device and a control method thereof are provided. The display device includes: a touch screen which displays a screen and senses a gesture of a user on the screen; a video processor which processes an image for displaying the screen; a communication unit which performs communication with at least one neighboring device; and a controller which performs control to display a miniature image of a screen of contents being displayed and a first user interface (UI) item showing a connected neighboring device of the at least one neighboring device if a first gesture of a user is made while displaying the screen corresponding to predetermined contents, and to transmit information for sharing the contents to the corresponding neighboring device in accordance with a second gesture of a user with regard to the miniature image and the first UI item.06-07-2012
20080244467METHOD FOR EXECUTING USER COMMAND ACCORDING TO SPATIAL MOVEMENT OF USER INPUT DEVICE AND IMAGE APPARATUS THEREOF - A method for executing a user command based on spatial movement of a user input device and an image apparatus having the same are provided. According to the method for executing a user command, a user command which is determined based on the spatial movement of the user input device is executed. Accordingly, a method for inputting a user command becomes more diverse and convenient to use, and a more compact user input device may be provided.10-02-2008
20110119641Call connection method and apparatus in mobile terminal - A call connection method and apparatus of a portable terminal capable of reducing errors in automatic call connection based on recognition of a user gesture are provided. The call connection method of a portable terminal includes providing a list according to a user request, sequentially sensing a first event and a second event according to a user gesture performed after a specific object in the list is selected, and performing automatic call connection based on the specific object when the first event and the second event are sensed.05-19-2011
20110119640DISTANCE SCALABLE NO TOUCH COMPUTING - Disclosed herein are techniques for scaling and translating gestures such that the applicable gestures for control may vary depending on the user's distance from a gesture-based system. The techniques for scaling and translation may take the varying distances from which a user interacts with components of the gesture-based system, such as a computing environment or capture device, into consideration with respect to defining and/or recognizing gestures. In an example embodiment, the physical space is divided into virtual zones of interaction, and the system may scale or translate a gesture based on the zones. A set of gesture data may be associated with each virtual zone such that gestures appropriate for controlling aspects of the gesture-based system may vary throughout the physical space.05-19-2011
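The zone-based scaling described in this abstract can be sketched as below: physical space is divided into virtual zones by distance, and raw gesture displacements are scaled by a zone-specific factor. The zone boundaries, scale factors, and function names are invented for illustration.

```python
# Zones as (upper distance limit in metres, name, scale factor).
ZONES = [
    (1.0, "near", 1.0),   # up to 1 m: no scaling needed
    (2.5, "mid", 1.5),    # 1-2.5 m: amplify small movements
    (4.0, "far", 2.5),    # 2.5-4 m: amplify further
]

def zone_for(distance_m):
    """Pick the virtual interaction zone for a user distance."""
    for limit, name, scale in ZONES:
        if distance_m <= limit:
            return name, scale
    return "out_of_range", 0.0

def scale_gesture(displacement, distance_m):
    """Translate a raw hand displacement into zone-adjusted input."""
    _, scale = zone_for(distance_m)
    return tuple(axis * scale for axis in displacement)

print(zone_for(0.8))                    # ('near', 1.0)
print(scale_gesture((0.1, 0.2), 3.0))   # amplified for the 'far' zone
```

A fuller system would associate a distinct gesture vocabulary with each zone, as the abstract notes; the sketch shows only the scaling half.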
20080229255Apparatus, method and system for gesture detection - Apparatuses, methods, and computer program products are provided to sense orientations or sequence of orientations, i.e. gestures, of mobile devices. The orientation or sequence of orientations control components and/or functions of the mobile device. Indications may be provided to a user to inform the user that the mobile device is in a particular orientation, or that the user has successfully performed a sequence of orientations corresponding to a functionality of the mobile device. The orientation or sequence of orientations may be performed while the mobile device is in a locked or idle state in order to control components and/or functions of the mobile device. A low energy sensor may activate the mobile device after a particular orientation is achieved.09-18-2008
20100299641PORTABLE ELECTRONIC DEVICE AND METHOD OF CONTROLLING SAME - A method of controlling a portable electronic device having a touch-sensitive display includes rendering content on the touch-sensitive display, detecting a first touch at a first location on the touch-sensitive display, detecting a second touch at a second location on the touch-sensitive display during the first touch, determining an area having a boundary defined by the first location and the second location, and, when a zoom selection is detected, performing a zooming operation by expanding rendered content in at least the area.11-25-2010
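The area determination in this abstract, a rectangle bounded by two touch locations that is then expanded to fill the display, can be sketched as follows. Pixel coordinates and the zoom-factor rule are assumptions for illustration.

```python
def zoom_area(first_touch, second_touch):
    """Rectangle whose boundary is defined by the two touch locations."""
    (x1, y1), (x2, y2) = first_touch, second_touch
    left, right = min(x1, x2), max(x1, x2)
    top, bottom = min(y1, y2), max(y1, y2)
    return left, top, right, bottom

def zoom_factor(area, viewport):
    """Scale needed to expand the area to fill the viewport
    without exceeding it on either axis."""
    left, top, right, bottom = area
    vw, vh = viewport
    return min(vw / max(right - left, 1), vh / max(bottom - top, 1))

area = zoom_area((40, 30), (10, 90))    # (10, 30, 40, 90)
print(zoom_factor(area, (300, 600)))    # limited by the narrower axis
```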
20100306711Method and Apparatus for a Motion State Aware Device - A device comprising motion context logic that receives data from at least one motion sensor is described. The motion context logic determines a user's motion context. Context-based action logic manages the device based on the user's motion context.12-02-2010
20100306716EXTENDING STANDARD GESTURES - In a system that utilizes gestures for controlling aspects of an application, strict requirements for success may limit approachability or accessibility for different types of people. The system may receive data reflecting movement of a user and remap a standard gesture to correspond to the received data. Following the remapping, the system may receive data reflecting skeletal movement of a user, and determine from that data whether the user has performed one or more standard and/or remapped gestures. In an exemplary embodiment, a gesture library comprises a plurality of gestures. Where these gestures are complementary with each other, they may be grouped into gesture packages. A gesture package may include gestures that are packaged as remapped gestures or a gesture package may include options for remapping standard gestures to new data.12-02-2010
20100306715Gestures Beyond Skeletal - Systems, methods and computer readable media are disclosed for gesture input beyond skeletal. A user's movement or body position is captured by a capture device of a system. Further, non-user-position data is received by the system, such as controller input by the user, an item that the user is wearing, a prop under the control of the user, or a second user's movement or body position. The system incorporates both the user-position data and the non-user-position data to determine one or more inputs the user made to the system.12-02-2010
20100306714Gesture Shortcuts - Systems, methods and computer readable media are disclosed for gesture shortcuts. A user's movement or body position is captured by a capture device of a system, and is used as input to control the system. For a system-recognized gesture, there may be a full version of the gesture and a shortcut of the gesture. Where the system recognizes that either the full version of the gesture or the shortcut of the gesture has been performed, it sends an indication that the system-recognized gesture was observed to a corresponding application. Where the shortcut comprises a subset of the full version of the gesture, and both the shortcut and the full version of the gesture are recognized as the user performs the full version of the gesture, the system recognizes that only a single performance of the gesture has occurred, and indicates to the application as such.12-02-2010
20100306713Gesture Tool - Systems, methods and computer readable media are disclosed for a gesture tool. A capture device captures user movement and provides corresponding data to a gesture recognizer engine and an application. From that, the data is parsed to determine whether it satisfies one or more gesture filters, each filter corresponding to user-performed gesture. The data and the information about the filters is also sent to a gesture tool, which displays aspects of the data and filters. In response to user input corresponding to a change in a filter, the gesture tool sends an indication of such to the gesture recognizer engine and application, where that change occurs.12-02-2010
20100306712Gesture Coach - A capture device may capture a user's motion and a display device may display a model that maps to the user's motion, including gestures that are applicable for control. A user may be unfamiliar with a system that maps the user's motions or not know what gestures are applicable for an executing application. A user may not understand or know how to perform gestures that are applicable for the executing application. User motion data and/or outputs of filters corresponding to gestures may be analyzed to determine those cases where assistance to the user on performing the gesture is appropriate.12-02-2010
20100306717STORAGE MEDIUM HAVING GAME PROGRAM STORED THEREIN AND GAME APPARATUS - A game apparatus detects a path inputted by a player, and moves an object placed in a virtual game space along the path. Moreover, the game apparatus controls the object, which is moving along the path, to perform a predetermined action, and determines a return position when the predetermined action is finished. The return position is a position at which the object having finished the predetermined action returns to the path, and is determined from among positions along the path. The game apparatus resumes the movement of the object along the path after returning the object, having finished the predetermined action, to the return position.12-02-2010
20100306718APPARATUS AND METHOD FOR UNLOCKING A LOCKING MODE OF PORTABLE TERMINAL - A method of unlocking a locking mode of a portable terminal, which includes sensing a user's gesture input which is set in a locking mode of the portable terminal. The locking mode is unlocked in response to the user's gesture input, and a function mapped to the user's gesture can be executed when unlocking the locking mode. A portable terminal compares gestures among predefined sets of gesture information in order to check whether there is a gesture that coincides with the analyzed gesture.12-02-2010
20120272193I/O DEVICE FOR A VEHICLE AND METHOD FOR INTERACTING WITH AN I/O DEVICE - An I/O device for a vehicle includes at least one touch-sensitive I/O display unit DT, at least one output display unit DI and a control unit CU connecting the I/O display unit DT and the output display unit DI with an information exchange unit IEU. The touch-sensitive I/O display unit DT is located in a readily reachable position for a driver and the output display unit DI is located in a readily discernible position for a driver, and the control unit CU communicates output data DD-I/O related to an interactive I/O communication to the I/O display unit DT, receives touchscreen input data TI from the I/O display unit DT and communicates output data DD-O to the output display unit DI in relation with the input data TI. An I/O method using the above mentioned I/O device for a vehicle is also provided.10-25-2012
20100281438ALTERING A VIEW PERSPECTIVE WITHIN A DISPLAY ENVIRONMENT - Disclosed herein are systems and methods for altering a view perspective within a display environment. For example, gesture data corresponding to a plurality of inputs may be stored. The inputs may be input into a game or application implemented by a computing device. Images of a user of the game or application may be captured. For example, a suitable capture device may capture several images of the user over a period of time. The images may be analyzed and processed for detecting a user's gesture. Aspects of the user's gesture may be compared to the stored gesture data for determining an intended gesture input for the user. The comparison may be part of an analysis for determining inputs corresponding to the gesture data, where one or more of the inputs are input into the game or application and cause a view perspective within the display environment to be altered.11-04-2010
20100281437MANAGING VIRTUAL PORTS - Techniques for managing virtual ports are disclosed herein. Each such virtual port may have different associated features such as, for example, privileges, rights or options. When one or more users are in a capture scene of a gesture based system, the system may associate virtual ports with the users and maintain the virtual ports. Also provided are techniques for disassociating virtual ports with users or swapping virtual ports between two or more users.11-04-2010
20130139115RECORDING TOUCH INFORMATION - A method of recording user-driven events within a computing system includes receiving, at a motion-sensitive display surface that recognizes user interaction therewith, at least one user-performed gesture, which includes user movement of an object over the surface. Touch information is generated corresponding to the at least one user-performed gesture. The touch information is configured to be provided to an application. The touch information is intercepted and recorded before it is provided to the application. The intercepted touch information is grouped into at least one chunk, and the at least one chunk is output to the application.05-30-2013
20100333043Terminating a Communication Session by Performing a Gesture on a User Interface - There is disclosed a wireless communication device for communicating with one or more remote devices. The device comprises a touch-sensitive surface, a user interface, and a transceiver. The user interface produces an input signal in response to detecting a predetermined gesture at the touch-sensitive surface. The transceiver communicates wirelessly with a remote device and terminates communication with the remote device in response to the input signal from the user interface. The device determines that it is communicating with the remote device, detects the predetermined gesture at the touch-sensitive surface, and terminates communication with the remote device in response to detecting the predetermined gesture while communicating with the remote device. The predetermined gesture includes continuous contact at the touch-sensitive surface between discrete locations of the surface.12-30-2010
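The gesture check in this abstract, continuous contact travelling between discrete locations of the surface, can be sketched roughly as follows. The region coordinates, the use of None to mark a broken contact, and the function names are assumptions.

```python
# Two discrete regions of the touch surface as (x0, y0, x1, y1) boxes.
START = (0, 0, 100, 100)      # left-edge region
END = (380, 0, 480, 100)      # right-edge region

def inside(point, region):
    x, y = point
    x0, y0, x1, y1 = region
    return x0 <= x <= x1 and y0 <= y <= y1

def is_hangup_gesture(path):
    """path: list of (x, y) contact points; None means contact broke."""
    if not path or any(p is None for p in path):
        return False          # contact must be continuous
    return inside(path[0], START) and inside(path[-1], END)

swipe = [(50, 40), (180, 45), (300, 50), (420, 55)]
print(is_hangup_gesture(swipe))                        # True
print(is_hangup_gesture([(50, 40), None, (420, 55)]))  # False: contact broke
```

In the device described, a True result while a call is active would trigger the transceiver to terminate the session.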
20110010676SYSTEM AND METHOD FOR ALLOCATING DIGITAL GRAFFITI OBJECTS AND CANVASSES - The subject specification provides a system, method, and computer readable storage medium directed towards allocating digital canvasses for digital graffiti. The specification discloses receiving data corresponding to digital graffiti formed from a gesture undergone by a device. The specification also discloses identifying a digital canvas corresponding to the digital graffiti as a function of the received data.01-13-2011
20110029934Finger Touch Gesture for Joining and Unjoining Discrete Touch Objects - An approach is provided to join graphical user interface objects into a group. A request is received at a touch-enabled display screen to join a first graphical user interface object with a second graphical user interface object. The request is from a user of the system. The first and second graphical user interface objects are then associated with each other. The first and second graphical user interface objects are displayed on the touch-enabled display screen adjacent to each other and a visual indicator is also displayed near the objects that indicates that the objects have been joined in a group.02-03-2011
20110041101MOBILE TERMINAL AND CONTROLLING METHOD THEREOF - A mobile terminal and controlling method thereof are disclosed. The present invention includes displaying a plurality of objects on a touchscreen and, if a first user command is inputted, controlling the objects pertaining to a category corresponding to the first user command among the plurality of objects displayed on the touchscreen to move into a specific region on the touchscreen. According to at least one embodiment of the present invention, even if numerous icons for executing diverse functions are displayed in a touchscreen type mobile terminal, the present invention facilitates a terminal user's discovery of a specific icon from among the numerous icons.02-17-2011
20110041102MOBILE TERMINAL AND METHOD FOR CONTROLLING THE SAME - A mobile terminal and a method for controlling the same are disclosed, which allow diverse functions of the terminal to be registered to a touch gesture, and which arrange and display the functions, information, and menu icons registered to the touch gesture adjacent to the input touch gesture when the touch gesture is inputted on the touchscreen.02-17-2011
20090222770METHOD OF INPUTTING CONTROL INSTRUCTION AND HANDHELD DEVICE THEREOF - A method of inputting a control instruction and a handheld device thereof are provided. The handheld device includes a memory unit, a touch module, and a recognition module. The method includes receiving a writing track input by the user from a touch module, analyzing the writing track by the recognition module to convert the writing track into a track data, and comparing the track data with a feature data stored in the memory unit to judge whether the two are consistent with each other, so as to determine whether to execute a program instruction corresponding to the feature data. Through the handheld device and method, when a user inputs a writing track, the handheld device activates a corresponding application program and specific actions thereof, so as to reduce the time of searching for the application program, thereby enhancing the practicability of the handheld device to the user.09-03-2009
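The recognition pipeline in this abstract, converting a writing track into track data and comparing it with stored feature data to pick a program instruction, can be sketched roughly as below. The 8-way chain-code encoding, the feature table, and the instruction names are invented for illustration.

```python
import math

def to_track_data(points):
    """Convert a writing track (touch points) into a chain of
    8-way direction codes: 0=E, 2=N, 4=W, 6=S (math convention)."""
    codes = []
    for (x1, y1), (x2, y2) in zip(points, points[1:]):
        angle = math.degrees(math.atan2(y2 - y1, x2 - x1)) % 360
        codes.append(int((angle + 22.5) // 45) % 8)
    return codes

# Hypothetical stored feature data mapping track shapes to instructions.
FEATURES = {
    (0, 0): "open_browser",   # straight eastward stroke
    (6, 0): "open_mail",      # down then east (an "L" shape)
}

def instruction_for(points):
    """Return the program instruction whose feature data matches,
    or None when the track is not consistent with any feature."""
    return FEATURES.get(tuple(to_track_data(points)))

print(instruction_for([(0, 0), (10, 0), (20, 1)]))   # open_browser
```

A production recognizer would tolerate noise rather than require an exact code match; the exact-match table keeps the sketch short.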
20110214092System and Method for Management of User Interactions Using Configurable Listeners in a Data Processing System - A system, method, and computer program product for management of user interactions with a data processing system. A method includes loading a listener dependency definition for a user interaction listener in a data processing system, and initializing listener lookup information for the user interaction listener. The method includes detecting a defined user interaction event by the user interaction listener, and handling the detected defined user interaction event by performing a corresponding defined action.09-01-2011
20110119639SYSTEM AND METHOD OF HAPTIC COMMUNICATION AT A PORTABLE COMPUTING DEVICE - A method of haptic communication at a wireless device is disclosed. The method may include receiving an input gesture and generating an input gesture message from the input gesture. The input gesture message may be operable for transmission to a receiving wireless device.05-19-2011
20110119637METHOD AND APPARATUS FOR INTERACTING WITH A CONTENT OBJECT - An approach is provided for interacting with an embedded content object. A request is received, from a device, to access an embedded content object, wherein the content object is related to a content playlist. On receipt of the request, the content object determines whether its content is available. The content object then causes, at least in part, actions that result in an interaction behavior based on the determination.05-19-2011
20100146460SYSTEM AND METHOD FOR MODIFYING A PLURALITY OF KEY INPUT REGIONS BASED ON DETECTED TILT AND/OR RATE OF TILT OF AN ELECTRONIC DEVICE - A system, method and computer program that utilizes motion detection circuitry to dynamically update displayed labels on one or more key input regions. In one aspect of the invention, the number of key input regions is substantially less than the number of keys on a conventional QWERTY keypad and the labels on the key input regions dynamically change based on the detected motion of the motion detection circuitry.06-10-2010
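The dynamic relabelling described in this abstract, a small set of key input regions whose labels change with detected tilt, can be sketched as follows. The four-region layout, the label pages, and the 15-degree step are assumptions.

```python
# One hypothetical label per key input region, grouped into pages.
LABEL_PAGES = ["abcd", "efgh", "ijkl", "mnop"]

def labels_for_tilt(tilt_deg, step=15):
    """Pick a label page by how far the device is tilted, clamping
    to the first and last pages at the extremes."""
    page = max(0, min(len(LABEL_PAGES) - 1, int(tilt_deg // step)))
    return list(LABEL_PAGES[page])

print(labels_for_tilt(5))    # ['a', 'b', 'c', 'd']
print(labels_for_tilt(35))   # ['i', 'j', 'k', 'l']
```

The patent also mentions rate of tilt as an input; that could be folded in by making `step` a function of angular velocity.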
20110083111USER INTERFACE GESTURES AND METHODS FOR PROVIDING FILE SHARING FUNCTIONALITY - Methods and devices provide a gesture activated file sharing functionality enabling users to share files with other nearby computing devices. The file sharing functionality may include establishing wireless links with nearby devices and determining their relative locations. The computing device may detect a file sharing gesture and transmit files to or request files from a nearby device in response to the gesture. Based on gesture parameters, e.g., direction, speed and shape, and computing device attitude parameters, e.g., tilt angle and pointing direction, the computing device may identify a targeted device to which a file may be transmitted. The computing device may request user verification of the identified device and send a request to transmit files to the targeted device. The computing devices may transmit files using networks and addresses provided over the device-to-device communication links.04-07-2011
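The targeting step this abstract describes, identifying which nearby device a sharing gesture points at, can be sketched as a bearing comparison. The device names, bearings, and tolerance are invented for illustration.

```python
# Hypothetical nearby devices with their bearings (degrees) relative
# to the sharing device, as established over device-to-device links.
NEARBY = {"laptop": 10.0, "tablet": 95.0, "printer": 200.0}

def angular_gap(a, b):
    """Smallest angle between two bearings, in degrees."""
    d = abs(a - b) % 360
    return min(d, 360 - d)

def targeted_device(pointing_deg, tolerance=30.0):
    """Pick the device whose bearing best matches the gesture's
    pointing direction, or None if nothing is close enough."""
    name, bearing = min(NEARBY.items(),
                        key=lambda kv: angular_gap(pointing_deg, kv[1]))
    return name if angular_gap(pointing_deg, bearing) <= tolerance else None

print(targeted_device(100.0))   # tablet
print(targeted_device(300.0))   # None: nothing within tolerance
```

The abstract's gesture speed and shape parameters could refine the tolerance or gate whether a transfer is attempted at all.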
20110083112INPUT APPARATUS - An input apparatus including an input unit to which a predetermined motion image signal is input, a motion detection unit for detecting a motion from the motion image signal which is input to the input unit, a video signal processing unit for outputting a predetermined video signal, and a control unit, wherein if a hand revolving motion of an operator is detected, the control unit controls the video signal processing unit to output a predetermined first video signal in synchronism with the hand revolving motion in order to inform the operator of a detection situation of the revolving motion and to output a predetermined second video signal in synchronism with the first video signal in order to inform the operator of a progress situation of a manipulation until a predetermined manipulation is definitely fixed.04-07-2011
20100138797PORTABLE ELECTRONIC DEVICE WITH SPLIT VISION CONTENT SHARING CONTROL AND METHOD - A portable electronic device, such as a mobile phone, has a main camera and a video call camera that receive optical input representative of motion of a user's hand(s) or hand gestures. The motion or gestures are decoded and used as a remote control input to control the displaying of content by a display device, such as a television or a projector, which receives the content for display from the mobile phone. A method of displaying content from a portable electronic device on a separate display or projector, and of controlling such displaying by remote control based on hand movement or gestures, is also provided.06-03-2010
20090077504Processing of Gesture-Based User Interactions - Systems and methods for processing gesture-based user interactions with an interactive display are provided.03-19-2009
20110083110TOUCH-SENSITIVE DISPLAY AND METHOD OF CONTROL - A method includes displaying one or more selection options on a touch-sensitive display and detecting a hovering touch associated with a first option of the one or more selection options. Information associated with the first option is previewed in a first format in an information field in response to detecting the hovering touch. A selection of one of the one or more selection options is detected, and a function associated with the selected option is performed.04-07-2011
20090031258GESTURE ACTIVATED CLOSE-PROXIMITY COMMUNICATION - A system for establishing a link from a wireless communication device (WCD) to at least one target device that is a member of a particular user group. The process of both locating the target device and establishing a link may incorporate the orientation and/or movement of the WCD into the procedure in lieu of the extensive use of traditional menu interfaces. For example, a WCD may recognize a combination of orientation and/or movement changes as a pattern for triggering activities, such as scanning for other devices. Various movement patterns may also be employed to establish a wireless link and for further interaction between users/devices in a user group.01-29-2009
20100037184PORTABLE ELECTRONIC DEVICE AND METHOD FOR SELECTING MENU ITEMS - A portable electronic device includes a motion detection module and a storage system. The motion detection module is configured for determining a direction of movement of the portable electronic device when orientation of the portable electronic device has been changed. The motion detection module is further configured for generating an input signal associated with the movement and providing the input signal to an application of the portable electronic device to initiate an operation performed by the application, wherein the input signal includes menu position information of a menu item of the application. The motion detection module is further configured for selecting a desired menu item according to the menu position. The storage system is used for storing the application and movement data of the portable electronic device.02-11-2010
20090217211ENHANCED INPUT USING RECOGNIZED GESTURES - Enhanced input using recognized gestures, in which a user's gesture is recognized from first and second images, and a representation of the user is displayed in a central region of a control that further includes interaction elements disposed radially in relation to the central region. The enhanced input also includes interacting with the control based on the recognized user's gesture, and controlling an application based on interacting with the control.08-27-2009
20090217210SYSTEM AND METHOD FOR TELEVISION CONTROL USING HAND GESTURES - Systems and methods that allow for control of televisions and other media devices are disclosed. A television set is provided with a gesture capture device configured to receive a gesture input directed to at least one of a plurality of predefined areas related to the television set. The television set further includes a user interaction interface configured to generate data indicative of the gesture input directed toward the at least one of the predefined areas and control the television based at least in part on the generated data.08-27-2009
20100058252GESTURE GUIDE SYSTEM AND A METHOD FOR CONTROLLING A COMPUTER SYSTEM BY A GESTURE - A gesture guide system and a method for controlling a computer system by a gesture are provided. The system includes a sensor element and a computer system. The method includes the steps of: establishing communication between the sensor element and the computer system; showing, by the computer system, at least one gesture option and the corresponding function instruction; detecting, by the sensor element, a gesture of the user; and executing, by the computer system, the corresponding function instruction in response to the detected gesture.03-04-2010
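The guide-and-execute loop in this abstract reduces to a table that both drives the on-screen guide and dispatches the detected gesture. The gesture and instruction names below are invented for illustration.

```python
# Hypothetical mapping from gesture options to function instructions.
GESTURE_GUIDE = {
    "swipe_left": "previous_page",
    "swipe_right": "next_page",
    "circle": "open_menu",
}

def show_guide():
    """Lines the computer system could display: each gesture option
    alongside its corresponding function instruction."""
    return [f"{g} -> {f}" for g, f in GESTURE_GUIDE.items()]

def execute(detected_gesture):
    """Run the function instruction for the detected gesture."""
    instruction = GESTURE_GUIDE.get(detected_gesture)
    return instruction if instruction else "no_op"

print(show_guide()[0])        # swipe_left -> previous_page
print(execute("circle"))      # open_menu
print(execute("shake"))       # no_op: gesture not in the guide
```

Keeping the guide and the dispatcher on one table guarantees the help display never drifts out of sync with what the system actually executes.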
20100058251OMNIDIRECTIONAL GESTURE DETECTION - An omnidirectional electronic device is disclosed. The electronic device can perform operations associated with a combination of inputs that can, in some cases, be recognized irrespective of the position or orientation in which they are applied to the electronic device. The inputs can include, for example, single or multi-touch taps, presses, swipes, rotations, characters and symbols. The inputs can be provided one or more times in succession and can be held for an amount of time. In one embodiment, an omnidirectional media player can perform media operations associated with a combination of inputs that can be recognized irrespective of the position or orientation in which they are applied to an input area of the media player.03-04-2010
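The orientation-independent recognition this abstract describes can be sketched by correcting the raw input angle with the device's current rotation, so the same physical motion maps to the same command in any orientation. The rotation handling and command names are assumptions.

```python
import math

def swipe_command(dx, dy, device_rotation_deg):
    """Classify a swipe irrespective of device orientation by
    subtracting the device's rotation from the raw swipe angle."""
    angle = math.degrees(math.atan2(dy, dx)) % 360
    corrected = (angle - device_rotation_deg) % 360
    for name, centre in [("right", 0), ("up", 90), ("left", 180), ("down", 270)]:
        if min(abs(corrected - centre), 360 - abs(corrected - centre)) <= 45:
            return name
    return "unknown"

# The same physical rightward motion, with the device upright and
# rotated a quarter turn, yields the same command:
print(swipe_command(10, 0, 0))     # right
print(swipe_command(0, 10, 90))    # right (device rotated 90 degrees)
```

Taps, presses, and rotations would need analogous normalization; only the directional case is shown here.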
20100064262Optical multi-touch method of window interface - An optical multi-touch method of a window interface is adapted to control an object in the window interface. The method includes providing a first optical sensing window to obtain a first tracking signal and providing a second optical sensing window to obtain a second tracking signal; resolving the first tracking signal to determine a first displacement direction and resolving the second tracking signal to determine a second displacement direction; and controlling a motion of the object in the window interface according to a relative relation between the first displacement direction and the second displacement direction.03-11-2010
20100064261PORTABLE ELECTRONIC DEVICE WITH RELATIVE GESTURE RECOGNITION MODE - A computer program executable on a portable electronic device having a touch screen sensor is provided. The computer program may include an input mode switching module configured to receive a mode switch user input to switch between a direct input mode and a relative gesture recognition mode. The computer program may further include a gesture-based control module configured, in the relative gesture recognition mode, to recognize a contact point on the touch screen sensor between a digit of a user and a surface of the touch screen sensor in a defined region in which the graphical user interface elements are unselectable, and to identify a detected gesture based on user touch input originating from the contact point, and to send a message to an application program to adjust an operation of the portable electronic device based on the detected gesture.03-11-2010
20100058254Information Processing Apparatus and Information Processing Method - To provide an information processing apparatus and information processing method capable of rapidly and easily zooming in or out of an image displayed on a display unit. The apparatus includes a display unit.03-04-2010
20100058253MOBILE TERMINAL AND METHOD FOR CONTROLLING MUSIC PLAY THEREOF - A mobile terminal is provided including a display unit, a sensing unit, and a controller. The display unit is configured as a touch screen for displaying album art of a song currently being played. The sensing unit is for sensing a touch applied to the touch screen. The controller is for controlling play of the song based on a touch sensed at a certain region of the album art displayed on the display unit.03-04-2010
20110252383INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM - There is provided an information processing apparatus including a display section which displays, as a first layout state, an object group including a plurality of objects arranged in a first direction, a detection section which detects an operation input that is input to the display section, and a control section which, when the detection section detects an operation input in a second direction that is perpendicular to the first direction, changes the first layout state into a second layout state in which the respective objects constituting the object group which has been selected are spread and pieces of information associated with the plurality of objects, respectively, are displayed.10-13-2011
20110088002METHOD AND PLATFORM FOR GESTURAL TRANSFER OF DIGITAL CONTENT FOR MOBILE DEVICES - A platform is provided which allows for gesture-initiated transfer of digital content from a mobile device to at least one other device which may also be a mobile device. The platform includes an application that leverages components which help in determining the pose of the mobile device. Upon detecting a gesturing motion (e.g., a throwing or casting motion), the system begins transfer of digital content (such as the current application or a set of pre-packaged information) to the at least one other device. The throwing or casting direction is analyzed to determine the appropriate device or devices to receive the content.04-14-2011
20110154266CAMERA NAVIGATION FOR PRESENTATIONS - Techniques for managing a presentation of information in a gesture-based system, where gestures are derived from a user's body position or motion in the physical space, may enable a user to use gestures to control the manner in which the information is presented or to otherwise interact with the gesture-based system. A user may present information to an audience using gestures that control aspects of the system, or multiple users may work together using gestures to control aspects of the system. Thus, in an example embodiment, a single user can control the presentation of information to the audience via gestures. In another example embodiment, multiple participants can share control of the presentation via gestures captured by a capture device or otherwise interact with the system to control aspects of the presentation.06-23-2011
20110154268METHOD AND APPARATUS FOR OPERATING IN POINTING AND ENHANCED GESTURING MODES - Methods and apparatuses for implementing gesture command recognition functionality are disclosed. The apparatuses may operate in a pointing mode and operate in an enhanced gesturing mode. While in the enhanced gesturing mode, the apparatuses may cause associated actions in response to recognizing sliding inputs as gesture commands. The gesture commands may be selectively associated with actions based on localities. The apparatuses may present overlays with information content independent of gesture command recognition. The apparatuses may change appearances of visual representations of sliding inputs in response to recognizing the sliding inputs as gesture commands.06-23-2011
20110154267Method and Apparatus for Determining an Operation Associated with a Continuous Stroke Input - An apparatus, comprising a processor and memory including computer program code, the memory and the computer program code configured to, working with the processor, cause the apparatus to perform at least the following: receiving indication of a first input associated with a first touch display, receiving indication of a second input relating to an exiting touch display boundary input associated with the first touch display, receiving indication of a third input relating to an entering touch display boundary input associated with a second touch display, receiving indication of a fourth input associated with the second display, determining that a continuous stroke input comprises the first input, second input, third input, and fourth input, and determining an operation based, at least in part, on the continuous stroke input is disclosed.06-23-2011
20120304133EDGE GESTURE - This document describes techniques and apparatuses enabling an edge gesture. In some embodiments, these techniques and apparatuses enable selection of a user interface not currently exposed on a display through an edge gesture that is easy to use and remember.11-29-2012
20120304132SWITCHING BACK TO A PREVIOUSLY-INTERACTED-WITH APPLICATION - This document describes techniques and apparatuses for switching back to a previously-interacted-with application. In some embodiments, these techniques and apparatuses enable selection of a user interface not currently exposed on a display through a simple gesture that is both easy to use and easy to remember.11-29-2012
20120304131EDGE GESTURE - This document describes techniques and apparatuses enabling an edge gesture. In some embodiments, these techniques and apparatuses enable selection of a user interface not currently exposed on a display through an edge gesture that is easy to use and remember.11-29-2012
20110072400METHOD OF PROVIDING USER INTERFACE OF MOBILE TERMINAL EQUIPPED WITH TOUCH SCREEN AND MOBILE TERMINAL THEREOF - A method of providing an interface in a terminal equipped with a touch screen, including displaying a preset first screen when an unlock input for displaying the first screen is received. A screen change input for changing a part or all of the first screen into a preset second screen is received through the touch screen displaying the first screen, and a part of the screen displayed on the touch screen is changed from the first screen into the second screen according to the progress of the screen change input.03-24-2011
20110061029GESTURE DETECTING METHOD FOR TOUCH PANEL - A gesture detecting method for a touch panel is provided. First, a command mode of the touch panel is established based on a hop touch, with fingers sequentially touching the touch panel. Then, a gesture is determined according to the subsequently detected touch result of a single touch or multipoint touch, i.e., the detected moving track of the touch points, so as to generate and transmit a gesture instruction.03-10-2011
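The moving-track classification described in this abstract can be sketched as follows. This is an illustrative reading, not the patented algorithm; the function name and the dominant-displacement rule are assumptions:

```python
def classify_track(points):
    """Classify a detected moving track of (x, y) touch points
    into a simple swipe gesture by its dominant displacement."""
    if len(points) < 2:
        return "tap"
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    if abs(dx) >= abs(dy):
        return "swipe-right" if dx > 0 else "swipe-left"
    # Screen coordinates: y grows downward.
    return "swipe-down" if dy > 0 else "swipe-up"
```

For example, `classify_track([(0, 0), (40, 3)])` yields `"swipe-right"`; a multipoint gesture would run one track per finger and combine the results.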
20110055773DIRECT MANIPULATION GESTURES - The present disclosure describes various techniques that may be implemented to execute and/or interpret manipulation gestures performed by a user on a multipoint touch input interface of a computing device. An example method includes receiving a multipoint touch gesture at a multipoint touch input interface of a computing device, wherein the multipoint touch gesture comprises a gesture that is performed with multiple touches on the multipoint touch input interface, and resolving the multipoint touch gesture into a command. The example method further includes determining at least one physical simulation effect to associate with the resolved multipoint touch gesture, and rendering a unified feedback output action in a graphical user interface of the computing device by executing the command, wherein the unified feedback output action includes at least a graphical output action incorporated with the at least one physical simulation effect in the graphical user interface.03-03-2011
20130159941ELECTRONIC DEVICE AND METHOD OF DISPLAYING INFORMATION IN RESPONSE TO A GESTURE - A method includes displaying, on a display of an electronic device, first information and detecting a gesture on the touch-sensitive display, which gesture indicates a request to display an inbox associated with a plurality of applications. In response to detecting the gesture, when a message is received for a first application that is not one of the plurality of applications, a plurality of visual notification icons is displayed and at least part of the inbox is gradually displayed while reducing display of the first information along with movement of the gesture, wherein a first visual notification icon of the plurality of visual notification icons is associated with the first application.06-20-2013
20100325590OPERATION CONTROL DEVICE, OPERATION CONTROL METHOD, AND COMPUTER-READABLE RECORDING MEDIUM - An apparatus and method provide logic for controlling a controllable device by distinguishing between an intended motion of a user and an unintended motion of the user. In one implementation, a computer-implemented method is provided to control a controllable device by distinguishing between a control movement and a non-control movement. The method receives spatial positions of a joint of a human appendage and a reference point disposed along the appendage and distal to the joint. The method determines whether a movement of the reference point about the joint is a control movement or a non-control movement, based on a comparison of direction of movement of the reference point and a direction of displacement between the reference point and the upper joint. A control instruction is executed when the movement is a control movement.12-23-2010
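The direction comparison this abstract describes — movement of the reference point versus its displacement from the joint — can be sketched in 2D as follows. The angle threshold, the "perpendicular means control" reading, and all names are illustrative assumptions, not the patent's published criterion:

```python
import math

def is_control_movement(joint, ref_prev, ref_curr, angle_thresh_deg=45.0):
    """Treat a movement as a control movement when the reference point
    moves roughly perpendicular to its displacement from the joint,
    i.e., it swings about the joint rather than along the limb axis."""
    move = (ref_curr[0] - ref_prev[0], ref_curr[1] - ref_prev[1])
    disp = (ref_prev[0] - joint[0], ref_prev[1] - joint[1])
    nm = math.hypot(*move)
    nd = math.hypot(*disp)
    if nm == 0 or nd == 0:
        return False  # no movement, or reference point at the joint
    cos_a = (move[0] * disp[0] + move[1] * disp[1]) / (nm * nd)
    # Perpendicular motion gives cos near 0, i.e., an angle near 90 degrees.
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
    return abs(angle - 90.0) < angle_thresh_deg
```

With the joint at the origin and the reference point at (1, 0), a small upward motion (a swing about the joint) passes the test, while motion straight along the limb axis does not.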
20100281439Method to Control Perspective for a Camera-Controlled Computer - Systems, methods and computer readable media are disclosed for controlling perspective of a camera-controlled computer. A capture device captures user gestures and sends corresponding data to a recognizer engine. The recognizer engine analyzes the data with a plurality of filters, each filter corresponding to a gesture. Based on the output of those filters, a perspective control is determined, and a display device displays a new perspective corresponding to the perspective control.11-04-2010
20110161891INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD - According to one embodiment, the relationship between respective fingers of a user and corresponding screens is registered, a screen and command are determined according to the motion (motion trajectory) of a finger that touches a touch input device and the type of the finger, and a multi-screen is operated.06-30-2011
20120204133Gesture-Based User Interface - A user interface method, including capturing, by a computer, a sequence of images over time of at least a part of a body of a human subject, and processing the images in order to detect a gesture, selected from a group of gestures consisting of a grab gesture, a push gesture, a pull gesture, and a circular hand motion. A software application is controlled responsively to the detected gesture.08-09-2012
20100070931METHOD AND APPARATUS FOR SELECTING AN OBJECT - A device and method for selecting objects on a touch-sensitive display are described. The device includes a touch-sensitive display for displaying at least one object, the touch-sensitive display responsive to a user input, and a selection detection section operatively coupled to the touch-sensitive display, the selection detection section configured to detect a back-and-forth movement of the user input while the input is in contact with the touch-sensitive display and to select an object on the touch-sensitive display for further operation when the back-and-forth motion is in proximity to the object.03-18-2010
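A back-and-forth detector of the kind this abstract describes might count direction reversals along one axis. The thresholds below are illustrative assumptions, not the patent's criteria:

```python
def is_back_and_forth(xs, min_reversals=2, min_travel=5):
    """Detect a back-and-forth (scrubbing) movement from a sequence of
    one-axis coordinates by counting direction reversals whose travel
    exceeds a small jitter threshold."""
    reversals = 0
    last_dir = 0
    for a, b in zip(xs, xs[1:]):
        d = b - a
        if abs(d) < min_travel:
            continue  # ignore jitter below the travel threshold
        direction = 1 if d > 0 else -1
        if last_dir and direction != last_dir:
            reversals += 1
        last_dir = direction
    return reversals >= min_reversals
```

A full implementation would also check that the scrubbed region overlaps an object's bounds before selecting it.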
20090282370GRAPHICAL USER INTERFACE FOR DATA ENTRY - A graphical user interface is provided for facilitating entry of data into a telephone, personal digital assistant or other computing device having a touch-sensitive input component (e.g., a touch screen). The interface includes multiple initial contact areas associated with different input (e.g., characters, numerical values, commands), a home area and spokes positioned between the initial contact areas and the home area. The interface is manipulated using gestures. A data input gesture begins by touching in or near an initial contact area and moving to or toward the home area, generally in proximity to the corresponding spoke. Other illustrative gestures include tracing directly from one initial contact area to another (e.g., to add the corresponding data values), performing a “throwing” gesture out of the home area (e.g., to delete the last input), gesturing backward/forward in the home area (e.g., to move backward/forward through a series of fields), etc.11-12-2009
20080320419Touch Screen Device, Method, and Graphical User Interface for Providing Maps, Directions, and Location-Based Information - A device, method, and graphical user interface for providing maps, directions, and location-based information on a touch screen display are disclosed.12-25-2008
20090138831Apparatus and method of determining a user selection in a user interface - A user interface …05-28-2009
20110161889User Interface for Electronic Devices - An electronic device having a user interface and a display unit on which an object is selected from a source screen in response to a first input at the user interface. The selected object is then tunneled to a target screen, via a virtual tunnel, in response to a second input at the user interface. The source screen and the target screen may be a part of the display unit in the electronic device. The tunneled object is then edited or modified to create an object desired by the user.06-30-2011
20110047517METADATA TAGGING SYSTEM, IMAGE SEARCHING METHOD AND DEVICE, AND METHOD FOR TAGGING A GESTURE THEREOF - A metadata tagging system, an image searching method, a device, and a gesture tagging method are provided. The metadata tagging system includes a first device which tags metadata to an image and transmits the image tagged with the metadata and a second device which allows at least one image from among stored images to be searched. Accordingly, generated data may be searched and used more easily and conveniently.02-24-2011
20120311507Devices, Methods, and Graphical User Interfaces for Navigating and Editing Text - An electronic device displays text of an electronic document on a display; displays an insertion marker at a first position in the text of the electronic document; detects a first horizontal gesture on a touch-sensitive surface; in response to a determination that the first horizontal gesture satisfies a first set of one or more predefined conditions: translates the electronic document on the display in accordance with a direction of the first horizontal gesture, and maintains the insertion marker at the first position in the text; and, in response to a determination that the first horizontal gesture satisfies a second set of one or more predefined conditions, moves the insertion marker by one character in the text from the first position to a second position in the text in accordance with the direction of the first horizontal gesture.12-06-2012
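The two-way dispatch this abstract describes (translate the document versus move the insertion marker one character) can be sketched as follows, using gesture speed as a stand-in for the patent's unpublished predefined conditions; all names and thresholds are assumptions:

```python
def handle_horizontal_gesture(dx, speed, marker_pos, fast_speed=500.0):
    """Return (new_marker_pos, scroll_dx) for a horizontal gesture.

    dx: horizontal displacement in pixels (sign gives direction);
    speed: gesture speed in pixels per second.
    """
    if dx == 0:
        return marker_pos, 0
    if speed >= fast_speed:
        # First condition set: translate the document, keep the marker put.
        return marker_pos, dx
    # Second condition set: move the marker one character in the
    # gesture's direction, without scrolling.
    step = 1 if dx > 0 else -1
    return marker_pos + step, 0
```

A slow, short drag nudges the marker; a fast swipe pans the text.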
20120311509READER WITH ENHANCED USER FUNCTIONALITY - A bookmarking system including a reader interface configured to cause content to be displayed on an electronic device having a touch-sensitive screen. The system includes a bookmark module configured to add a bookmark when a user swipes downwardly on the screen, wherein the bookmark is a record relating to the content displayed on the screen during the downward swipe.12-06-2012
20120311508Devices, Methods, and Graphical User Interfaces for Providing Accessibility Using a Touch-Sensitive Surface - An electronic device presents a first user interface element of a first type and a second user interface element of a second type. In a sighted mode, the device detects a first interaction with the first user interface element, and performs an operation in accordance with sighted-mode gesture responses for the first user interface element. The device detects a second interaction with the second user interface element, and performs an operation in accordance with sighted-mode gesture responses for the second user interface element. In an accessible mode, the device detects a third interaction with the first user interface element, and performs an operation in accordance with accessible-mode gesture responses for the first user interface element. The device detects a series of interactions with the second user interface element; and, for each interaction, performs an operation in accordance with the sighted-mode gesture responses for the second user interface element.12-06-2012
20110055775INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD AND INFORMATION PROCESSING PROGRAM - The operating state of a tracing operation is presented when a tracing operation is performed to scroll displayed objects.03-03-2011
20110055774SYSTEM AND METHOD FOR CONTROLLING INTERACTION BETWEEN A MOBILE TERMINAL AND A DIGITAL PICTURE FRAME - A mobile terminal includes a wireless communication unit, a memory, a touch screen, and a controller. The wireless communication unit establishes a connection to an external digital picture frame. The memory stores a plurality of images including one or more characters and information mapped to the characters. The touch screen displays a first image stored in the memory. And, the controller transmits the first image and first information mapped to the first image to the digital picture frame via the wireless communication unit.03-03-2011
20110055772SYSTEM AND METHOD FOR ENHANCED COMMAND INPUT - A portable electronic device having an input device for receiving a gesture based input from a user is used to control a navigation operation of an appliance. The portable electronic device receives via the input device the gesture based input and uses one or more parameters stored in a memory of the portable electronic device and one or more characteristics associated with the gesture based input to cause the portable electronic device to transmit a navigation step command to thereby control the navigation operation of the appliance.03-03-2011
20100333045Gesture Based Interaction with Traffic Data - Gesture based interaction with traffic data is disclosed. A virtual broadcast presentation may be generated based on dynamic information such as traffic information, weather information, or other information that may be featured on a virtual broadcast presentation. A gesture made by a user is detected and processed to determine an input command associated with the detected gesture. The virtual broadcast presentation may be manipulated based on the input command.12-30-2010
20080244468Gesture Recognition Interface System with Vertical Display - One embodiment of the invention includes a gesture recognition interface system. The system may comprise a substantially vertical surface configured to define a gesture recognition environment based on physical space in a foreground of the substantially vertical surface. The system may also comprise at least one light source positioned to provide illumination of the gesture recognition environment. The system also comprises at least two cameras configured to generate a plurality of image sets based on the illumination being reflected from an input object in the gesture recognition environment. The system further comprises a controller configured to determine a given input gesture based on changes in relative locations of the input object in each of the plurality of image sets. The controller may further be configured to initiate a device input associated with the given input gesture.10-02-2008
20110258586USER CONTROL - An apparatus including: at least one processor; and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus at least to: resolve a user input trace into a first displacement in a first direction and a second displacement in a second direction, orthogonal to the first direction; and control a position within a range in dependence upon both the first displacement and the second displacement.10-20-2011
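One plausible reading of this abstract is that the first displacement gives coarse control and the orthogonal second displacement gives fine control over a position within a range. A minimal sketch under that assumption (the weights and names are not from the patent):

```python
def position_from_trace(x0, y0, x1, y1, lo, hi, coarse=1.0, fine=0.1):
    """Resolve a user input trace into two orthogonal displacements and
    combine them (coarse horizontal + fine vertical) into one position,
    clamped to the range [lo, hi]."""
    dx = x1 - x0  # first displacement, first direction
    dy = y1 - y0  # second displacement, orthogonal direction
    pos = coarse * dx + fine * dy
    return max(lo, min(hi, pos))
```

For instance, dragging 10 px across and 20 px up-range yields position 12.0 in [0, 100]: the vertical component only trims the coarse value.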
20120124526METHOD FOR CONTINUING A FUNCTION INDUCED BY A MULTI-TOUCH GESTURE ON A TOUCHPAD - In a method for continuing a function induced by a multi-touch gesture on a touchpad, the number of objects in the multi-touch gesture is monitored while the function is performed. If the number of objects is detected to change such that one or more objects remain on the touchpad, the objects left on the touchpad are examined to identify whether one or more of them move clockwise or anticlockwise; if a clockwise or anticlockwise movement is detected, the function is continued.05-17-2012
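The clockwise/anticlockwise test this abstract relies on can be sketched with the sign of the accumulated cross product of consecutive track segments; this is an illustrative implementation, not the patented one:

```python
def rotation_direction(points):
    """Return 'cw', 'ccw', or None for a track of (x, y) touch points,
    from the sign of the accumulated z cross product of consecutive
    segments, in screen coordinates (y axis pointing down)."""
    total = 0.0
    for (x0, y0), (x1, y1), (x2, y2) in zip(points, points[1:], points[2:]):
        total += (x1 - x0) * (y2 - y1) - (y1 - y0) * (x2 - x1)
    if total > 0:
        return "cw"   # positive cross product turns clockwise when y is down
    if total < 0:
        return "ccw"
    return None       # straight or degenerate track
```

A track that goes right then down (in y-down screen coordinates) registers as clockwise; a straight drag registers as neither, so the function would stop.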
20120124525METHOD FOR PROVIDING DISPLAY IMAGE IN MULTIMEDIA DEVICE AND THEREOF - A display device includes a sensor to track movement of at least one body part of a person and a processor to compare an amount of the tracked movement to a reference value, recognize a position shift when the amount of tracked movement exceeds the reference value, and perform a predetermined function of the display device based on the position shift.05-17-2012
20110119638USER INTERFACE METHODS AND SYSTEMS FOR PROVIDING GESTURING ON PROJECTED IMAGES - Methods and systems enable a user to interact with a computing device by tracing a gesture on a surface with a laser beam. The computing device may be equipped with or coupled to a projector and a digital camera. The projector may project an image generated on the computing device on a projection surface which the camera images. Location and movement of a laser spot on the projection surface may be detected within received camera images. The projected image and the received camera image may be correlated so that the computing device can determine the location of a laser spot within the projected image. Movements of the laser spot may be correlated to predefined laser gestures which may be associated to particular functions that the computing device may implement. The functions may be similar to other user interface functionality. The function results may be displayed and projected.05-19-2011
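The correlation between the camera image and the projected image reduces, in the simplest case, to a coordinate mapping. The sketch below assumes the projection appears as an axis-aligned rectangle in the camera frame; a real system would calibrate a full homography to handle perspective, and all names here are illustrative:

```python
def camera_to_image(cam_pt, cam_rect, img_size):
    """Map a detected laser-spot location in the camera frame to
    projected-image coordinates.

    cam_pt:   (x, y) of the spot in the camera image
    cam_rect: (x, y, w, h) of the projection as seen by the camera
    img_size: (width, height) of the projected image in pixels
    """
    cx, cy = cam_pt
    x, y, w, h = cam_rect
    iw, ih = img_size
    # Normalize within the projection rectangle, then scale to image pixels.
    u = (cx - x) / w * iw
    v = (cy - y) / h * ih
    return u, v
```

The resulting (u, v) can then be fed to the same hit-testing used for touch or mouse input, and successive spot positions compared against predefined laser-gesture templates.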
20110029935METHOD AND APPARATUS FOR DETECTING UNDESIRED USERS USING SOCIALLY COLLABORATIVE FILTERING - In one embodiment, a method includes identifying at least one socially relevant gesture associated with a user and identifying at least one gesture graph that identifies content associated with at least one undesirable entity. The at least one socially relevant gesture is identified while the user is interacting with a system. The content includes a plurality of socially relevant gestures associated with the at least one undesirable entity. The method also includes determining when a distance between the at least one socially relevant gesture associated with the user and the content indicates that the user is undesirable, and processing the user as being undesirable when the distance indicates that the user is undesirable.02-03-2011
20110088003APPARATUS, METHODS AND COMPUTER-READABLE STORAGE MEDIA FOR SECURITY PROVISIONING AT A COMMUNICATION DEVICE - Apparatus, methods and computer-readable storage medium are provided for security provisioning at a communication device. In some embodiments, a method can include: executing a high security application on a communication device based, at least, on detecting that high security is enabled for the communication device and detecting execution of a low security application; outputting, via a user interface (UI), information configured to detect an entry to the communication device; detecting an entry at the UI of the communication device; determining whether the entry corresponds to security access information stored in the communication device; and providing access to the communication device based, at least, on determining that the entry corresponds to the security access information.04-14-2011
20110093822Image Navigation for Touchscreen User Interface - Various embodiments relate to a local computing device that includes a display and a touchscreen interface. The device is operable to establish a remote network computing session with a host computer system, transmit touch event information associated with touch events, receive graphical display information corresponding to a host image associated with the host computer system, translate the graphical display information from host coordinates to local coordinates, update the local image based on the graphical display information, the local image comprising a selected portion of the host image, and, in response to mouse movement events caused by associated touch events, change the selected portion of the host image while keeping a cursor in the center of the display, except when the center of the selected portion is within a predetermined limit of an edge of the host image, thereafter move the cursor relative to the local display.04-21-2011
20110093821DISPLAYING GUI ELEMENTS ON NATURAL USER INTERFACES - A computing system for displaying a GUI element on a natural user interface is described herein. The computing system includes a display configured to display a natural user interface of a program executed on the computing system, and a gesture sensor configured to detect a gesture input directed at the natural user interface by a user. The computing system also includes a processor configured to execute a gesture-recognizing module for recognizing a registration phase, an operation phase, and a termination phase of the gesture input, and a gesture assist module configured to first display a GUI element overlaid upon the natural user interface in response to recognition of the registration phase. The GUI element includes a visual or audio operation cue to prompt the user to carry out the operation phase of the gesture input, and a selector manipulatable by the user via the operation phase of the gesture.04-21-2011
20110093820GESTURE PERSONALIZATION AND PROFILE ROAMING - A gesture-based system may have default or pre-packaged gesture information, where a gesture is derived from a user's position or motion in a physical space. In other words, no controllers or devices are necessary. Depending on how a user uses his or her gesture to accomplish the task, the system may refine the properties and the gesture may become personalized. The personalized gesture information may be stored in a gesture profile and can be further updated with the latest data. The gesture-based system may use the gesture profile information for gesture recognition techniques. Further, the gesture profile may be roaming such that the gesture profile is available in a second location without requiring the system to relearn gestures that have already been personalized on behalf of the user.04-21-2011
20090300554Gesture Recognition for Display Zoom Feature - A method, apparatus, and system are disclosed that provide a computing user with an ability to engage in a multitude of operations via the entry of gestures. Computer operations may be mapped to shapes, and a comparison may take place between a user-entered gesture and the shapes to determine whether the gesture approximates at least one of the shapes. Responsive to determining that the gesture approximates at least one of the shapes, an operation associated with the shape may be executed. The operation may include a zoom operation (e.g., a zoom-in or a zoom-out operation), wherein the dimensions of the gesture may influence content to be included in an updated display. Additional adjustments may be performed to improve a resolution associated with the content included in the updated display.12-03-2009
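A minimal version of matching a gesture against a shape and deriving the zoom region from the gesture's dimensions might look like this; the closed-shape test and the threshold are assumptions, not the patent's comparison method:

```python
import math

def match_zoom_gesture(points, close_ratio=0.2):
    """If a gesture track approximates a closed shape (start near end,
    relative to its size), return the bounding box (x, y, w, h) its
    dimensions imply as the region to zoom into; otherwise None."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    w, h = max(xs) - min(xs), max(ys) - min(ys)
    diag = math.hypot(w, h)
    if diag == 0:
        return None  # degenerate (single-point) track
    gap = math.hypot(points[-1][0] - points[0][0],
                     points[-1][1] - points[0][1])
    if gap / diag > close_ratio:
        return None  # not closed enough to count as the shape
    return (min(xs), min(ys), w, h)
```

A roughly closed loop yields the rectangle to magnify; an open stroke is rejected and could fall through to other mapped operations.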
20100185990Movable display apparatus, robot having movable display apparatus and display method thereof - A movable display apparatus, a robot having the movable display apparatus, and a display method thereof are provided, and, more particularly, a display method of a robot having an apparatus to display an image according to the visual point of a user. It is possible to provide a convenient extended image service by movably mounting the apparatus that displays the image according to the visual point of the user, and by mounting the movable display apparatus in the robot so as to accurately display the image according to the visual point of the user using the mobility and motion of the robot. In addition, it is possible to provide an image that is viewed like a three-dimensional image via a two-dimensional display apparatus, by changing the displayed image according to a variation in the sightline of the user.07-22-2010
20110191724DEVICE FOR ITEM CONTROL, SYSTEM FOR ITEM CONTROL, AND METHOD - A first device classifies and displays an item, identifies a suitable class matched to approach information of a second device out of the entire area of the classified item as the second device approaches the first device, and provides the second device with the identified class or executes a service linked to the class. The second device approaches a portion where a desired class is displayed by the first device, receives the class from the first device, and provides a linked service using the same.08-04-2011
20100031202USER-DEFINED GESTURE SET FOR SURFACE COMPUTING - The claimed subject matter provides a system and/or a method that facilitates generating an intuitive set of gestures for employment with surface computing. A gesture set creator can prompt two or more users with a potential effect for a portion of displayed data. An interface component can receive at least one surface input from the user in response to the prompted potential effect. A surface detection component can track the surface input utilizing a computer vision-based sensing technique. The gesture set creator collects the surface input from the two or more users in order to identify a user-defined gesture based upon a correlation between the respective surface inputs, wherein the user-defined gesture is defined as an input that initiates the potential effect for the portion of displayed data.02-04-2010
20100031203USER-DEFINED GESTURE SET FOR SURFACE COMPUTING - The claimed subject matter provides a system and/or a method that facilitates generating an intuitive set of gestures for employment with surface computing. A gesture set creator can prompt two or more users with a potential effect for a portion of displayed data. An interface component can receive at least one surface input from the user in response to the prompted potential effect. A surface detection component can track the surface input utilizing a computer vision-based sensing technique. The gesture set creator collects the surface input from the two or more users in order to identify a user-defined gesture based upon a correlation between the respective surface inputs, wherein the user-defined gesture is defined as an input that initiates the potential effect for the portion of displayed data.02-04-2010
20100031200METHOD OF INPUTTING A HAND-DRAWN PATTERN PASSWORD - A method of inputting a hand-drawn pattern password includes the steps of providing a plurality of input keys on a touch panel; corresponding each of the input keys to a unique character, numeral, or symbol; and causing a user to sequentially touch a sequence of some of the input keys on the touch panel to thereby draw a user-remembered pattern password, so that the sequence of characters, numerals, and/or symbols corresponding to the touched input keys constitutes an effective password and is input.02-04-2010
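The scheme in this abstract is essentially a substitution from touched keys to password characters. A minimal sketch, with a hypothetical key layout (the actual mapping and key count are not specified in the abstract):

```python
# Hypothetical layout: each input key index maps to a unique character,
# numeral, or symbol.
KEYMAP = {0: "A", 1: "B", 2: "C", 3: "7", 4: "#", 5: "D", 6: "E", 7: "F", 8: "9"}

def password_from_touches(key_sequence):
    """Translate the sequence of touched key indices into the
    effective password string."""
    return "".join(KEYMAP[k] for k in key_sequence)

def verify(key_sequence, stored_password):
    """Check a drawn pattern against the stored effective password."""
    return password_from_touches(key_sequence) == stored_password
```

Because the user remembers the drawn pattern rather than the character string, the same ordered touches always reproduce the same effective password.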
20100023895Touch Interaction with a Curved Display - Touch interaction with a curved display (e.g., a sphere, a hemisphere, a cylinder, etc.) is facilitated by preserving a predetermined orientation for objects. In an example embodiment, a curved display is monitored to detect a touch input on an object. If a touch input on an object is detected based on the monitoring, then one or more locations of the touch input are determined. The object may be manipulated responsive to the determined one or more locations of the touch input. While manipulation of the object is permitted, a predetermined orientation is preserved.01-28-2010
20100017759Systems and Methods For Physics-Based Tactile Messaging - Systems and methods for physics-based tactile messaging are disclosed. For example, one disclosed method includes the steps of receiving a sensor signal from a sensor configured to sense a physical interaction with a messaging device; determining an interaction between one or more virtual message objects and a virtual message environment, the interaction based at least in part on the sensor signal and a virtual physical parameter of at least one of the one or more virtual message objects; and determining a haptic effect based at least in part on the interaction. The method additionally includes the step of generating a haptic signal configured to cause an actuator to output the haptic effect.01-21-2010
20110307843Information Processing Apparatus, Operation Method, and Information Processing Program - An information processing apparatus includes an operation unit; and a control unit performing a process in response to an operation executed through the operation unit. Different gesture operations are able to be assigned to an operation corresponding to copy of information and an operation corresponding to cut of information, respectively. The control unit selects a portion designated by a user in information displayed on a display unit, and then copies the selected portion when the user executes the gesture operation corresponding to the copy through the operation unit, whereas the control unit cuts the selected portion when the user executes the gesture operation corresponding to the cut through the operation unit.12-15-2011
20110307842ELECTRONIC READING DEVICE - This invention provides an electronic reading device which comprises an eye glass frame and a camera-projection component mounted on the eye glass frame, comprising a projection unit to project an image onto a projection surface and an optical sensor unit to perform a scan of a region near the projection surface, wherein the optical sensor unit is configured to operate as a user interface by detecting a user input based on the scan. The electronic reading device can recreate the page-reading and page-turning experience of a conventional book on any surface, such as walls, tables, or other kinds of panels, i.e., simulating “page turning like a real paper book”.12-15-2011
20110307841METHOD AND APPARATUS FOR BINDING USER INTERFACE ELEMENTS AND GRANULAR REFLECTIVE PROCESSING - An approach is provided for binding user interface elements and granular reflective processing. An information management infrastructure determines to detect an event, from a first device, for specifying one or more user interface elements for transfer to a second device. The information management infrastructure further identifies one or more processes bound to the user interface elements. The information management infrastructure also determines at least one of a user context, an execution context within the user context, and one or more other execution contexts for the processes, wherein the one or more other execution contexts are from at least one of the user context and one or more other user contexts. The information management infrastructure further causes, at least in part, serialization of at least one of the user context, the execution context, and the one or more other execution contexts. The information management infrastructure further determines to transmit the serialization to the second device to initiate reconstruction of the at least one of the user context, the execution context, and the one or more other execution contexts.12-15-2011
20090172606METHOD AND APPARATUS FOR TWO-HANDED COMPUTER USER INTERFACE WITH GESTURE RECOGNITION - A method and apparatus for manipulating displayed content using first and second types of human-machine interface in combination are disclosed. Machine operations are divided into two sets and the first type of user interface controls a first set and a second set of operations, while the second type of user interface controls only the second set. In a preferred method embodiment, one hand controls the first set via a mouse interface and the other hand controls the second set via a stereo camera based hand gesture recognition interface. In a preferred apparatus embodiment, the apparatus has a manipulable input device capable of interacting with displayed content and visualization of the displayed content. Additionally, the apparatus has a gesture based input device capable of interacting only with the visualization of the displayed content.07-02-2009
20110314428DISPLAY APPARATUS AND CONTROL METHOD THEREOF - A display apparatus includes: a display unit; a user interface (UI) generator which generates UI information to be displayed on the display unit; a user input unit which includes a touch pad; and a controller which controls the UI generator to display a plurality of selective items per item page on the display unit, to move a focus in accordance with a touch motion of the user, and to stop moving the focus if the focus is located at an outermost selective item positioned at an edge of the item page.12-22-2011
20110314426RISK-BASED ALERTS - Some embodiments provide a system that facilitates use of a computer system. During operation, the system obtains notification of a risk associated with a user action on the computer system. Next, the system generates an alert within a user interface based at least on a severity of the risk. The alert may include a set of user-interface elements representing an effect of the user action. The system then receives a response to the alert from a user of the computer system. The response may include a dragging of a first of the user-interface elements in one or more directions to a second of the user-interface elements. Finally, the system processes the user action based at least on the response.12-22-2011
20120042288SYSTEMS AND METHODS FOR INTERACTIONS WITH DOCUMENTS ACROSS PAPER AND COMPUTERS - Systems and methods provide for mixed use of physical documents and a computer, and more specifically provide for detailed interactions with fine-grained content of physical documents that are integrated with operations on a computer to provide for improved user interactions between the physical documents and the computer. The system includes a camera which processes the physical documents and detects gestures made by a user with respect to the physical documents, a projector which provides visual feedback on the physical document, and a computer with a display to coordinate the interactions of the user with the computer and the interactions of the user with the physical document. The system, which can be portable, is capable of detecting interactions with fine-grained content of the physical document and translating interactions at the physical document to the computer display, and vice versa.02-16-2012
20120072873TRANSPARENT DISPLAY DEVICE AND METHOD FOR PROVIDING OBJECT INFORMATION - According to an embodiment of the present invention, a method for providing object information includes determining an eye direction of a person toward a first region of a transparent display, selecting at least one object seen via the transparent display in the determined eye direction, acquiring information on the selected object, and displaying the acquired information on the transparent display.03-22-2012
20100125816MOVEMENT RECOGNITION AS INPUT MECHANISM - The detection of relative motion or orientation between a user and a computing device can be used to control aspects of the device. For example, the computing device can include an imaging element and software for locating positions, shapes, separations, and/or other aspects of a user's facial features relative to the device, such that an orientation of the device relative to the user can be determined. A user then can provide input to the device by performing actions such as tilting the device, moving the user's head, making a facial expression, or otherwise altering an orientation of at least one aspect of the user with respect to the device. Such an approach can be used in addition to, or as an alternative to, conventional input devices such as keypads and touch screens.05-20-2010
20120047470METHOD AND APPARATUS FOR BROWSING AN ELECTRONIC BOOK ON A TOUCH SCREEN DISPLAY - A method is disclosed for navigating in an electronic book comprising a plurality of pages, the method comprising displaying in an interface a first given page of the electronic book with a fore edge section representative of a fore edge of the electronic book; detecting a finger motion in the fore edge section and displaying a second given page of the electronic book wherein the second given page is displayed depending on characteristics of the detected finger motion in the fore edge section.02-23-2012
20120047469METHOD AND APPARATUS FOR ADAPTING A CONTENT PACKAGE COMPRISING A FIRST CONTENT SEGMENT FROM A FIRST CONTENT SOURCE TO DISPLAY A SECOND CONTENT SEGMENT FROM A SECOND CONTENT SOURCE - An apparatus may include a user interface configured to display a content package including a first content segment from a first content source. A gesture interpreter may be configured to receive a gesture input in a positional relationship to the first content segment. The apparatus may further include a content relationship manager which may be configured to determine relationships between content segments such that a content segment selector may select a second content segment relating to the first content segment from a second content source. Further, the apparatus may include a content package adaptor configured to adapt the content package to provide for display of the second content segment. In some instances the content package adaptor may adapt the content package by providing for display of a second content package, for example from a different application than an application which the first content segment is from.02-23-2012
20120047468Translating User Motion Into Multiple Object Responses - A system for translating user motion into multiple object responses of an on-screen object based on user interaction of an application executing on a computing device is provided. User motion data is received from a capture device from one or more users. The user motion data corresponds to user interaction with an on-screen object presented in the application. The on-screen object corresponds to an object other than an on-screen representation of a user that is displayed by the computing device. The user motion data is automatically translated into multiple object responses of the on-screen object. The multiple object responses of the on-screen object are simultaneously displayed to the users.02-23-2012
20120005632EXECUTE A COMMAND - A method for executing a command includes detecting a gesture from a user with a sensor, identifying the gesture and a command associated with the gesture, identifying at least one corresponding device on which to execute the command, and configuring a device to execute the command on at least one of the corresponding devices.01-05-2012
20110167391USER INTERFACE METHODS AND SYSTEMS FOR PROVIDING FORCE-SENSITIVE INPUT - Methods and systems implement touch sensors or force sensitive materials disposed on the case of a computing device in order to enable user input gestures to be performed on portions of the device case. The force sensitive elements may generate an electrical signal in response to a gesture, such as a tap, squeeze, swipe or twist. The properties of the generated electrical signal may be compared to various reference templates to recognize particular input gestures. The force sensitive elements may operate in conjunction with more traditional input methods, such as a touch-screen display and electromechanical buttons. By enabling user input gestures on the case of computing devices, the various aspects permit one-handed operation of the devices, including intuitive gestures that do not require the user's focused attention to accomplish. Thus, the various aspects may enable users to utilize their computing devices in situations not suited to conventional user input technologies.07-07-2011
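The template comparison described in the abstract above can be sketched in a few lines (an illustrative approximation, not the patented implementation; the signal representation, template values, and error cutoff are all assumptions):

```python
# Illustrative sketch: a gesture on the device case produces a sampled
# force signal, which is compared against stored reference templates;
# the closest template under a cutoff names the gesture.

def mean_abs_error(a, b):
    """Average per-sample absolute difference between two signals."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def recognize(signal, templates, cutoff=0.3):
    """Return the name of the best-matching reference template, or None."""
    best_name, best_err = None, None
    for name, template in templates.items():
        if len(template) != len(signal):
            continue  # a real system would resample to a common length
        err = mean_abs_error(signal, template)
        if best_err is None or err < best_err:
            best_name, best_err = name, err
    return best_name if best_err is not None and best_err <= cutoff else None

# Hypothetical reference templates for two case gestures.
templates = {
    'tap': [0.0, 1.0, 0.0, 0.0],      # short force spike
    'squeeze': [0.5, 0.8, 0.8, 0.5],  # sustained pressure
}
print(recognize([0.1, 0.9, 0.1, 0.0], templates))  # 'tap'
print(recognize([0.9, 0.0, 0.9, 0.0], templates))  # None: no close template
```

A production recognizer would likely normalize amplitude and use a more robust distance (e.g., correlation), but the structure — compare signal properties to reference templates, accept the closest match under a threshold — is the same.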
20120011476ELECTRONIC DEVICE AND METHOD FOR SEARCHING MULTIMEDIA FILE - An electronic device includes a touch screen, a memory module, a dividing module, an identifying module, a sorting module, and a driving module. The memory module saves multimedia files (MFs). Each MF has four tags, and each tag corresponds to a category. The dividing module drives the touch screen to display areas that make up a grid. The grid has reference lines: a horizontal line, a vertical line, a first diagonal line, and a second diagonal line. The identifying module identifies a vector of a user's slide on the touch screen that is substantially parallel to a reference line. The sorting module receives the vector identified by the identifying module and generates a list made up of the tags of the same category that corresponds to the vector. The driving module reads the list and drives the areas corresponding to the vector to display the tags of the list.01-12-2012
20120023461APPLICATION PROGRAMMING INTERFACES FOR GESTURE OPERATIONS - At least certain embodiments of the present disclosure include an environment with user interface software interacting with a software application to provide gesture operations for a display of a device. A method for operating through an application programming interface (API) in this environment includes transferring a scaling transform call. The gesture operations include performing a scaling transform such as a zoom in or zoom out in response to a user input having two or more input points. The gesture operations also include performing a rotation transform to rotate an image or view in response to a user input having two or more input points.01-26-2012
20120023459SELECTIVE REJECTION OF TOUCH CONTACTS IN AN EDGE REGION OF A TOUCH SURFACE - The selective rejection of touch contacts in an edge region of a touch sensor panel is disclosed. In addition, by providing certain exceptions to the rejection of edge contacts, the functionality of the touch sensor panel can be maximized. Contacts in edge bands around the perimeter of a touch sensor panel can be ignored. However, if a contact in the edge band moves beyond a threshold distance or speed, it can be recognized as part of a gesture. To accommodate different finger sizes, the size of the edge band can be modified based on the identification of the finger or thumb. Furthermore, if contacts in the center region of a touch sensor panel track the movement of contacts in the edge band, the contacts in the edge band can be recognized as part of a gesture.01-26-2012
20120023457PRESENTATION OF ADVERTISEMENTS BASED ON USER INTERACTIVITY WITH A WEB PAGE - Methods and systems for presenting advertisements based on user interactivity with a web page are provided. According to embodiments of the invention, a web page is rendered on a client device. Gesture interactivity with the web page is monitored on the client device. A trigger is executed which defines an interactive event. When the interactive event occurs, as determined based on the monitored gesture interactivity with the web page, secondary content, such as an advertisement, is downloaded and displayed on the client device.01-26-2012
20120023456INTERACTIVE IMAGE MATTING - A user interface enables interactive image matting to be performed on an image. The user interface may provide results including an alpha matte as feedback in real time. The user interface may provide interactive tools for selecting a portion of the image, and an unknown region for alpha matte processing may be automatically generated adjacent to the selected region. The user may interactively refine the alpha matte as desired to obtain a satisfactory result.01-26-2012
20120060129MOBILE TERMINAL HAVING TOUCH SCREEN AND METHOD FOR DISPLAYING CONTENTS THEREIN - A mobile terminal having a touch screen and a method for displaying contents therein are provided. The method for displaying contents in a mobile terminal having a touch screen includes determining whether a touch action moves when the touch action is sensed on displayed contents, calculating a physical display change amount for changing and displaying the contents according to the touch action when the touch moves, and continuously changing and displaying the contents according to the calculated physical display change amount when the touch action stops.03-08-2012
20120060128DIRECT, GESTURE-BASED ACTIONS FROM DEVICE'S LOCK SCREEN - Embodiments enable a mobile device to execute an action analogous to a user-defined action in response to receipt of a gesture analogous to a user-defined gesture. In a first embodiment, a computer-implemented method executes an action on a mobile device. A lock screen view is displayed on the mobile device to prevent unauthorized and inadvertent access to the mobile device's data. While the mobile device is locked, a touch gesture having a pre-defined shape is detected on a touch screen of the mobile device, independently of the initial position of the touch gesture on the touch screen. In response to detection of the touch gesture, a particular action is executed on the mobile device while the mobile device stays locked. The particular action is determined according to the pre-defined shape. In this way, detection of the touch gesture causes the particular action to execute while keeping the mobile device locked.03-08-2012
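The position-independent shape detection described above can be sketched as follows (an illustrative approximation, not the patented method; the sampling, tolerance value, and shape templates are assumptions):

```python
# Illustrative sketch: recognizing a touch gesture by its shape alone,
# independent of where on the screen it starts, by translating the
# touch path so that it begins at the origin before comparison.

def normalize(path):
    """Translate a path of (x, y) points so it starts at the origin."""
    x0, y0 = path[0]
    return [(x - x0, y - y0) for x, y in path]

def matches_shape(path, template, tolerance=10.0):
    """True if the path's shape is within tolerance of the template.

    Assumes both have the same number of sample points; a real
    implementation would resample and scale-normalize first.
    """
    if len(path) != len(template):
        return False
    p, t = normalize(path), normalize(template)
    dist = sum(((px - tx) ** 2 + (py - ty) ** 2) ** 0.5
               for (px, py), (tx, ty) in zip(p, t)) / len(p)
    return dist <= tolerance

# An "L"-shaped template, and the same shape drawn elsewhere on the screen.
template = [(0, 0), (0, 50), (0, 100), (50, 100), (100, 100)]
drawn = [(200, 300), (200, 350), (200, 400), (250, 400), (300, 400)]
print(matches_shape(drawn, template))  # True: same shape, different position
```

Because both paths are translated to a common origin before comparison, the gesture is detected regardless of its initial position on the touch screen, which is the property the abstract emphasizes.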
20120159404DETECTING VISUAL GESTURAL PATTERNS - A processing device and method are provided for capturing images, via an image-capturing component of a processing device, and determining a motion of the processing device. An adaptive search center technique may be employed to determine a search center with respect to multiple equal-sized regions of an image frame, based on previously estimated motion vectors. One of several fast block matching methods may be used, based on one or more conditions, to match a block of pixels of one image frame with a second block of pixels of a second image. Upon matching blocks of pixels, motion vectors of the multiple equal-sized regions may be estimated. The motion may be determined, based on the estimated motion vectors, and an associated action may be performed. Various embodiments may implement techniques to distinguish motion blur from de-focus blur and to determine a change in lighting condition.06-21-2012
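The block-matching step described in the abstract above can be sketched with a plain sum-of-absolute-differences search (an illustrative baseline, not one of the patent's fast matching methods; frame data and window size are invented for the example):

```python
# Illustrative sketch: matching a block of pixels between two frames with
# an exhaustive sum-of-absolute-differences (SAD) search over a small
# window, yielding a motion vector for that block.

def sad(frame, ref_block, top, left):
    """Sum of absolute differences between ref_block and the same-sized
    block of `frame` whose top-left corner is (top, left)."""
    h, w = len(ref_block), len(ref_block[0])
    return sum(abs(frame[top + i][left + j] - ref_block[i][j])
               for i in range(h) for j in range(w))

def match_block(frame, ref_block, center, radius):
    """Search a (2*radius+1)^2 window around `center` (an adaptive search
    center could come from previously estimated motion vectors) and
    return the motion vector (dy, dx) with the lowest SAD."""
    cy, cx = center
    h, w = len(ref_block), len(ref_block[0])
    best = None
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            top, left = cy + dy, cx + dx
            if 0 <= top <= len(frame) - h and 0 <= left <= len(frame[0]) - w:
                cost = sad(frame, ref_block, top, left)
                if best is None or cost < best[0]:
                    best = (cost, (dy, dx))
    return best[1]

# A 2x2 block that has shifted one pixel to the right between frames.
frame = [[0, 0, 0, 0],
         [0, 9, 8, 0],
         [0, 7, 6, 0],
         [0, 0, 0, 0]]
ref_block = [[9, 8], [7, 6]]
print(match_block(frame, ref_block, center=(1, 0), radius=1))  # (0, 1)
```

The fast methods the abstract refers to (e.g., early termination or coarse-to-fine search) reduce how many candidate positions are scored, but each candidate is still compared in essentially this way.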
20120159401Workspace Manipulation Using Mobile Device Gestures - Workspaces are manipulated on a mobile device having a display screen. A set of two or more discrete workspaces is established. A default discrete workspace is then displayed on the screen, where the default discrete workspace is one of the discrete workspaces in the set. Whenever a user gestures with the mobile device, the gesture is used to select one of the discrete workspaces from the set, and the selected discrete workspace will be displayed on the screen.06-21-2012
20120159403SYSTEM AND METHOD FOR GAUGING AND SCORING AUDIENCE INTEREST OF PRESENTATION CONTENT - A system and method for managing content displayed in media presentations. A client computing device includes a client media presentation application that is responsive to feedback input received during the display of one or more slides of a media presentation to assign a score to each of the one or more slides. The media presentation application stores the assigned scores in a client database as media presentation data. A server computing device includes a server database that stores media presentation data. The server computing device also includes a server media presentation application that communicates with the client media presentation application during a synchronization process to synchronize media presentation data between the client database and the server database.06-21-2012
20120159402METHOD AND APPARATUS FOR PROVIDING DIFFERENT USER INTERFACE EFFECTS FOR DIFFERENT IMPLEMENTATION CHARACTERISTICS OF A TOUCH EVENT - A method for providing a mechanism by which different user interface effects may be performed for different classifications of gestures may include receiving an indication of a touch event at a touch screen display, determining a gesture classification and an implementation characteristic classification for the touch event, and enabling provision of substantively different user interface effects for respective different implementation characteristic classifications of the gesture classification. A corresponding apparatus and computer program product are also provided.06-21-2012
20120110520DEVICE FOR USING USER GESTURE TO REPLACE EXIT KEY AND ENTER KEY OF TERMINAL EQUIPMENT - A device for using a user gesture to replace the exit key and the enter key of terminal equipment comprises a CPU module, a gesture input module, a gesture processing module, a terminal application module, a memory module and a terminal function module. The CPU module can be connected with the gesture input module, the gesture processing module, the terminal application module, the memory module and the terminal function module, and can receive the user gesture input information sent by the gesture input module, the setting content information sent by the terminal application module, and the gesture identifying information sent by the gesture processing module. The CPU module can exit from the received setting content information, with or without saving, based on the gesture identifying information. The device increases the user's viewable area and simplifies the human-machine interaction process.05-03-2012
20120110518TRANSLATION OF DIRECTIONAL INPUT TO GESTURE - A user device is disclosed which includes a touch input and a keypad input. The user device is configured to operate in a gesture capture mode as well as a navigation mode. In the navigation mode, the user interfaces with the touch input to move a cursor or similar selection tool within the user output. In the gesture capture mode, the user interfaces with the touch input to provide gesture data that is translated into key code output having a similar or identical format to outputs of the keypad.05-03-2012
20120110517METHOD AND APPARATUS FOR GESTURE RECOGNITION - A touchscreen device is configured to display a number of user interface elements in accordance with a menu hierarchy. Upon receipt of a predetermined touchscreen gesture (e.g., the circular motion of a manipulator) the menu hierarchy is bypassed and the user is given immediate control over a selected function, for example, a tuning function such as audio volume, screen contrast, and the like.05-03-2012
20120110519GRAPHICAL MANIPULATION OF DATA OBJECTS - In an embodiment, a user input defining an enclosed, graphical shape on a video display is received. A number of graphical items are identified as being included within the enclosed, graphical shape. Here, each graphical item is displayed on the video display and represents a data object that has a number of properties. A property is extracted from the number of properties that the data objects have in common based on the identification. A number of other manipulation techniques are also described.05-03-2012
20120110516POSITION AWARE GESTURES WITH VISUAL FEEDBACK AS INPUT METHOD - A gesture based user interface is provided for a user to interact with a device in order to operate and control the device through detection of gestures and movements of the user. Visual feedback of the user gestures is provided to the user to aid in the user's operational and control decisions of a device. An image capturing device such as a video camera may be employed to capture a user's image, and an integrated application on a computing device may process continuous images from the capturing device to recognize and track user gestures. The gestures may correlate to an object and/or location on the display and the user's image may be projected on the display to provide visual feedback of the user's interaction.05-03-2012
20120254808HOVER-OVER GESTURING ON MOBILE DEVICES - Aspects of the disclosure may relate to detecting, by a computing device, a first user input comprising a first gesture to interact with a touch-sensitive screen of the computing device. Aspects may also include detecting a second user input comprising a second gesture using the touch-sensitive screen of the computing device. Aspects may also include, responsive to detecting the first user input, initiating a hover mode of interaction in a graphical user interface.10-04-2012
20120254809METHOD AND APPARATUS FOR MOTION GESTURE RECOGNITION - Various methods for motion gesture recognition are provided. One example method may include receiving motion gesture test data that was captured in response to a user's performance of a motion gesture. The motion gesture test data may include acceleration values in each of three dimensions of space that have directional components that are defined relative to an orientation of a device. The example method may further include transforming the acceleration values to derive transformed values that are independent of the orientation of the device, and performing a comparison between the transformed values and a gesture template to recognize the motion gesture performed by the user. Similar and related example methods, example apparatuses, and example computer program products are also provided.10-04-2012
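One simple orientation-independent transform consistent with the abstract above is to reduce each (x, y, z) acceleration sample to its magnitude, which is unchanged by any rotation of the device. The sketch below illustrates that idea only; it is not the claimed algorithm, and the template comparison and tolerance are assumptions:

```python
# Illustrative sketch: transform accelerometer samples into rotation-
# invariant magnitudes, then compare them against a gesture template.

import math

def to_magnitudes(samples):
    """Map each (x, y, z) acceleration sample to its magnitude."""
    return [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]

def matches_template(samples, template_mags, tolerance=0.5):
    """Compare transformed values against a stored gesture template."""
    mags = to_magnitudes(samples)
    if len(mags) != len(template_mags):
        return False  # a real system would time-warp to align lengths
    err = sum(abs(m - t) for m, t in zip(mags, template_mags)) / len(mags)
    return err <= tolerance

# The same shake recorded with the device held in two different orientations:
# upright (motion along z) and sideways (motion along x).
upright = [(0.0, 0.0, 3.0), (0.0, 0.0, -3.0), (0.0, 0.0, 5.0)]
sideways = [(3.0, 0.0, 0.0), (-3.0, 0.0, 0.0), (5.0, 0.0, 0.0)]
template = to_magnitudes(upright)
print(matches_template(sideways, template))  # True: orientation-independent
```

Magnitudes discard direction entirely; richer transforms (e.g., projecting onto a gravity-derived frame) keep more of the signal while still removing the dependence on device orientation.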
20110107276Icon/text interface remote controller - An icon/text interface remote controller, particularly a remote controller operating with a display device that displays an icon/text menu interface to allow users to make selections, mainly includes a touch panel, a button switch, a power supply, a power control circuit and a wireless emission circuit that operate with the display device. Users can see options of the icon/text menu on the display device to perform remote control operations in a more user-friendly fashion.05-05-2011
20120317521General User Interface Gesture Lexicon and Grammar Frameworks for Multi-Touch, High Dimensional Touch Pad (HDTP), Free-Space Camera, and Other User Interfaces - A method for a multi-touch gesture-based user interface wherein a plurality of gestemes are defined as functions of abstract space and time, each being a primitive gesture segment that can be concatenated over time and space to construct gestures. Various distinct subsets of the gestemes can be concatenated in space and time to construct distinct gestures. Real-time multi-touch gesture-based information provided by the user interface is processed into at least a recognized sequence of specific gestemes, which indicates that the user's execution of a gesture has been completed. The specific gesture rendered by the user is recognized according to the sequence of gestemes. Many additional features are then provided from this foundation, including gesture grammars, a structured-meaning gesture lexicon, context, and the use of gesture prosody.12-13-2012
20090037849Apparatus, methods, and computer program products providing context-dependent gesture recognition - At least some exemplary embodiments of the invention enable the use of context-dependent gestures, for example, in order to assist in the automation of one or more tasks. In one exemplary embodiment, an apparatus senses a predefined gesture and, in conjunction with context information (e.g., location information), performs a predefined action in response to the gesture. As non-limiting examples, the gesture may involve movement of the apparatus (e.g., shaking, tapping) or movement relative to the apparatus (e.g., using a touch screen). In one exemplary embodiment of the invention, a method includes: obtaining context information for an apparatus, wherein the context information includes a predefined context; and in response to sensing a predefined movement associated with the predefined context, performing, by the apparatus, a predefined action, wherein the predefined movement includes a movement of or in relation to the apparatus.02-05-2009
20120167017SYSTEMS AND METHODS FOR ADAPTIVE GESTURE RECOGNITION - Systems and methods are described for adaptively recognizing gestures indicated by user inputs received from a touchpad, touchscreen, directional pad, mouse or other multi-directional input device. If a user's movement does not indicate a gesture using current gesture recognition parameters, additional processing can be performed to recognize the gesture using other factors. The gesture recognition parameters can then be adapted so that subsequent user inputs that are similar to the previously-rejected inputs will appropriately trigger gesture commands as desired by the user. Gestural data or parameters may be locally or remotely stored for further processing.06-28-2012
20120131519Surfacing Off-Screen Visible Objects - A computer-implemented user input process for a computing device includes receiving, on a touch pad surface over a graphical display, a user input motion dragging across the touch pad surface, identifying the dragging input motion as originating off an edge of the touch pad by identifying a sensed first location for the input motion at a peripheral edge of the touch pad surface, and displaying on the graphical display a sliding graphical element that is animated to move from the edge of the display into a body of the display, over a nonmoving element on the display, in response to identifying the dragging input motion.05-24-2012
20120131514Gesture Recognition - Gesture recognition is described. In one example, gestures performed by a user of an input device having a touch-sensitive portion are detected using a definition of a number of regions corresponding to zones on the touch-sensitive portion, each region being associated with a distinct set of gestures. Data describing movement of the user's digits on the touch-sensitive portion is received, and an associated region for the data determined. The data is compared to the associated region's set of gestures, and a gesture applicable to the data selected. A command associated with the selected gesture can then be executed. In an example, comparing the data to the set of gestures comprises positioning a threshold for each gesture relative to the start of the digit's movement. The digit's location is compared to each threshold to determine whether a threshold has been crossed, and, if so, selecting the gesture associated with that threshold.05-24-2012
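The threshold-crossing selection described in the abstract above can be sketched as follows (an illustrative approximation only; the gesture names, axes, and distances are hypothetical):

```python
# Illustrative sketch: each candidate gesture in a region gets a spatial
# threshold positioned relative to where the digit's movement started;
# the first threshold the digit crosses selects that gesture.

def select_gesture(start, current, gestures):
    """Return the gesture whose threshold the digit has crossed, if any.

    `gestures` maps a name to (axis, distance): the threshold sits
    `distance` away from `start` along `axis` ('x' or 'y'), with the
    sign of `distance` encoding the direction of the swipe.
    """
    dx = current[0] - start[0]
    dy = current[1] - start[1]
    for name, (axis, dist) in gestures.items():
        moved = dx if axis == 'x' else dy
        if (dist > 0 and moved >= dist) or (dist < 0 and moved <= dist):
            return name
    return None

# Hypothetical gesture set for one region of the touch-sensitive portion.
region_gestures = {
    'swipe_right': ('x', 40),
    'swipe_left': ('x', -40),
    'swipe_down': ('y', 40),
}
print(select_gesture((100, 100), (150, 105), region_gestures))  # 'swipe_right'
print(select_gesture((100, 100), (110, 105), region_gestures))  # None (no threshold crossed yet)
```

Because the thresholds are positioned relative to the movement's start rather than at fixed screen coordinates, the same gesture set works anywhere inside its region, matching the scheme the abstract describes.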
20120131513Gesture Recognition Training - Gesture recognition training is described. In an example, a gesture recognizer is trained to detect gestures performed by a user on an input device. Example gesture records, each showing data describing movement of a finger on the input device when performing an identified gesture are retrieved. A parameter set that defines spatial triggers used to detect gestures from data describing movement on the input device is also retrieved. A processor determines a value for each parameter in the parameter set by selecting a number of trial values, applying the example gesture records to the gesture recognizer with each trial value to determine a score for each trial value, using the score for each trial value to estimate a range of values over which the score is a maximum, and selecting the value from the range of values.05-24-2012
20120131518APPARATUS AND METHOD FOR SELECTING ITEM USING MOVEMENT OF OBJECT - An item selecting apparatus includes a movement detecting unit detecting a movement of a user, a screen displaying image information, a display image storage unit storing data to generate an image to be displayed on the screen, a display image generating unit generating an image to be displayed on the screen, and a control unit controlling so that a plurality of items is displayed on the screen in one of one-, two- and three-dimensional arrangements, in which the control unit receives a signal from the movement detecting unit to measure a movement of the object in at least one of x-, y- and z-axis directions and issues a command to select at least one from among the plurality of items or provides visual feedback thereto, in response to the measured movement of the user and in accordance with the arrangement of the plurality of items on the screen.05-24-2012
20120131517INFORMATION PROCESSING APPARATUS AND OPERATION METHOD THEREOF - An information processing apparatus with an interface that is highly convenient for the user is provided. A reference speed is set according to an amount of movement or a movement time period of a pointer such as a stylus or a finger. It is determined, based on a movement speed of the pointer and the reference speed, that a flick operation with the pointer has occurred.05-24-2012
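The flick determination described above can be sketched as a comparison of release speed against a reference derived from the stroke itself (an illustrative guess at the mechanism; the formula and the scaling factor are assumptions, not the patent's):

```python
# Illustrative sketch: set a reference speed from the pointer's movement
# distance and time, then report a flick when the speed at release
# exceeds that reference.

def is_flick(distance_px, duration_s, release_speed_px_s, factor=2.0):
    """True when the release speed exceeds the reference speed.

    The reference is the average speed over the whole movement scaled by
    `factor` (a hypothetical tuning constant), so a stroke that ends much
    faster than it moved on average counts as a flick.
    """
    if duration_s <= 0:
        return False
    reference = (distance_px / duration_s) * factor
    return release_speed_px_s > reference

# A 200 px drag over 0.5 s (average 400 px/s, so the reference is 800 px/s):
print(is_flick(200, 0.5, 1200))  # True: released at 1200 px/s
print(is_flick(200, 0.5, 500))   # False: released at 500 px/s
```

Deriving the reference from the stroke's own length or duration, rather than using one fixed cutoff, lets short deliberate drags and long fast flicks be told apart for the same release speed.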
20120131516METHOD, SYSTEM AND COMPUTER READABLE MEDIUM FOR DOCUMENT VISUALIZATION WITH INTERACTIVE FOLDING GESTURE TECHNIQUE ON A MULTI-TOUCH DISPLAY - A method, system and computer readable medium for folding a document page object are provided. A method for folding a document page object in a graphical user interface using multi-touch gestures includes establishing at least two contact points on a display; moving at least one of the two contact points to create a fold on the document page object; and displaying the folded document page object.05-24-2012
20120131520Gesture-based Text Identification and Selection in Images - A device with a touch-sensitive screen supports tapping gestures for identifying, selecting or working with initially unrecognized text. A single tap gesture can cause a portion of a character string to be selected. A double tap gesture can cause the entire character string to be selected. A tap and hold gesture can cause the device to enter a cursor mode wherein a placement of a cursor relative to the characters in a character string can be adjusted. In a text selection mode, a finger can be used to move the cursor from a cursor start position to a cursor end position and to select text between the positions. Selected or identified text can populate fields, control the device, etc. Recognition of text can be performed upon access of an image or upon the device detecting a tapping gesture in association with display of the image on the screen.05-24-2012
20120131515METHOD AND APPARATUS OF ERROR CORRECTION IN RESISTIVE TOUCH PANELS - A method and apparatus of generating display data based on at least one input display value received from a touch screen device is disclosed. According to one example method of operation the method may include receiving an input display value in a predefined enclosed area of an input domain, and calculating a parametric representation of the received input display value based on the boundaries of the predefined enclosed area in the input domain. The predefined enclosed area may be a triangle. The operations may also include mapping the parametric representation of the input display value to a corresponding output display value in an output domain. The operations may also include displaying the at least one output display value via the display device.05-24-2012
20120216154TOUCH GESTURES FOR REMOTE CONTROL OPERATIONS - In general, this disclosure describes techniques for providing a user of a first computing device (e.g., a mobile device) with the ability to utilize the first computing device to control a second computing device (e.g., a television). Specifically, the techniques of this disclosure may, in some examples, allow the user to use drawing gestures on a mobile computing device to remotely control and operate the second computing device. Using a presence-sensitive user interface device (e.g., a touch screen), the user may use drawing gestures to indicate characters associated with operations and commands to control the second computing device.08-23-2012
20120137258MOBILE ELECTRONIC DEVICE, SCREEN CONTROL METHOD, AND STORAGE MEDIUM STORING SCREEN CONTROL PROGRAM - According to an aspect, a mobile electronic device includes a touch panel and a control unit. The touch panel displays a screen thereon and detects a gesture performed on a surface thereof. When a sweep gesture is detected by the touch panel, the control unit causes an object corresponding to a process, which is executable while the screen is displayed on the touch panel, to be displayed near a position where the sweep gesture is detected on the touch panel.05-31-2012
20110185320Cross-reference Gestures - Techniques involving gestures and other functionality are described. In one or more implementations, the techniques describe gestures that are usable to provide inputs to a computing device. A variety of different gestures are contemplated, including bimodal gestures (e.g., using more than one type of input) and single modal gestures. Additionally, the gesture techniques may be configured to leverage these different input types to increase the amount of gestures that are made available to initiate operations of a computing device.07-28-2011
20110185319VIRTUAL PIN PAD FOR FUEL PAYMENT SYSTEMS - A method and system for displaying a virtual PIN pad in varying locations on a touch screen in order to prevent fraud or the interception of personal identification numbers.07-28-2011
20110185318EDGE GESTURES - Techniques involving gestures and other functionality are described. In one or more implementations, the techniques describe gestures that are usable to provide inputs to a computing device. A variety of different gestures are contemplated, including bimodal gestures (e.g., using more than one type of input) and single modal gestures. Additionally, the gesture techniques may be configured to leverage these different input types to increase the amount of gestures that are made available to initiate operations of a computing device.07-28-2011
20110185316Device, Method, and Graphical User Interface for Managing User Interface Content and User Interface Elements - Alignment guides configured for velocity-sensitive behavior are disclosed. In one embodiment, during a user interface element move gesture, the gesture velocity is determined, and while moving the user interface element during the gesture, the user interface operates in a first or a second state with respect to displaying alignment guides. When the velocity of the user gesture exceeds a predefined velocity threshold, the display of the user interface is maintained in the first state, which does not include visibly displaying alignment guides. When the velocity of the user gesture is less than the predefined velocity threshold, the user interface is displayed in a second state that includes visibly displaying one or more alignment guides. In some embodiments, gesture velocity is used to set alignment guide attraction strength.07-28-2011
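The velocity-threshold behavior this abstract describes (show alignment guides only below a speed threshold, optionally scaling "attraction strength" by velocity) can be sketched roughly as follows; the threshold value, scaling formula, and function names are illustrative assumptions, not taken from the patent:

```python
def should_show_alignment_guides(gesture_velocity, threshold=500.0):
    """Return True when alignment guides should be visibly displayed.

    Guides are shown only while the move gesture is slow enough for
    precise positioning; fast gestures suppress them (assumed behavior).
    """
    return gesture_velocity < threshold


def guide_attraction_strength(gesture_velocity, threshold=500.0, max_strength=1.0):
    """Scale guide 'attraction' down linearly as the gesture speeds up
    (one plausible reading of velocity-dependent attraction strength)."""
    if gesture_velocity >= threshold:
        return 0.0
    return max_strength * (1.0 - gesture_velocity / threshold)
```

A slow drag (e.g. 100 px/s) would keep the guides visible at high attraction, while a fast flick suppresses them entirely.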
20120137259ASSOCIATED FILE - A method for accessing a file on a computing machine including configuring a sensor to detect an object and a user interacting with the object, associating the object with at least one file on the computing machine, and configuring a display device to render an associated file being accessed in response to the user interacting with the object.05-31-2012
20120216153HANDHELD DEVICES, ELECTRONIC DEVICES, AND DATA TRANSMISSION METHODS AND COMPUTER PROGRAM PRODUCTS THEREOF - Data transmission methods for handheld devices are provided. The data transmission method includes the steps of: receiving a gesture input; determining whether the gesture input matches a predetermined gesture; and if so, obtaining directional information corresponding to the gesture input and transmitting a file and the directional information to at least one electronic device such that display of a user interface of the at least one electronic device generates a display effect corresponding to the gesture according to the directional information.08-23-2012
20120216152TOUCH GESTURES FOR REMOTE CONTROL OPERATIONS - In general, this disclosure describes techniques for providing a user of a first computing device (e.g., a mobile device) with the ability to utilize the first computing device to control a second computing device (e.g., a television). Specifically, the techniques of this disclosure may, in some examples, allow the user to use drawing gestures on a mobile computing device to remotely control and operate the second computing device. Using a presence-sensitive user interface device (e.g., a touch screen), the user may use drawing gestures to indicate characters associated with operations and commands to control the second computing device.08-23-2012
20120254810Combined Activation for Natural User Interface Systems - A user interaction activation may be provided. A plurality of signals received from a user may be evaluated to determine whether the plurality of signals are associated with a visual display. If so, the plurality of signals may be translated into an agent action and a context associated with the visual display may be retrieved. The agent action may be performed according to the retrieved context and a result associated with the performed agent action may be displayed to the user.10-04-2012
20120174043GESTURE-BASED SELECTION - Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for gesture-based selection. In one aspect, a method includes displaying, on a user interface, information referencing a first set of data items and at least one control that references multiple values. A first gesture input by a user using the control is identified, and a first value is selected based on the first gesture. A second gesture input by the user using the control is identified, and a second value is selected based on the second gesture. A second set of data items is selected based on the first and second values, and information referencing the second set of data items is displayed.07-05-2012
20120174041GESTURE-BASED SELECTION - Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for gesture-based selection. In one aspect, a method includes displaying, on a user interface, information referencing a first set of data items, and at least one control that references multiple values, identifying a first gesture input by a user using the control, selecting a first value based on the first gesture, identifying a second gesture input by the user using the control, selecting a second value based on the second gesture, selecting a second set of data items to display based on the first and second values, and displaying information referencing the second set of data items.07-05-2012
20100050134ENHANCED DETECTION OF CIRCULAR ENGAGEMENT GESTURE - The enhanced detection of a circular engagement gesture, in which a shape is defined within motion data, and the motion data is sampled at points that are aligned with the defined shape. It is determined whether a moving object is performing a gesture correlating to the defined shape based on a pattern exhibited by the sampled motion data. An application is controlled if determining that the moving object is performing the gesture.02-25-2010
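A rough sketch of this kind of circular-engagement test — define a circle from the motion data, check that samples stay near it, and check that the swept angle is large enough — might look like the following. All thresholds and the function name are illustrative assumptions, not the patent's method:

```python
import math

def is_circular_gesture(points, min_sweep=1.5 * math.pi, tolerance=0.3):
    """Heuristic sketch: decide whether (x, y) motion samples trace a circle.

    A circle is defined from the data itself (centroid + mean radius);
    samples are then checked against that shape: relative radial deviation
    must stay under `tolerance`, and the total swept angle must exceed
    `min_sweep`.
    """
    if len(points) < 8:
        return False
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    radii = [math.hypot(x - cx, y - cy) for x, y in points]
    mean_r = sum(radii) / len(radii)
    if mean_r == 0:
        return False
    # All samples must lie near the defined circle.
    if any(abs(r - mean_r) / mean_r > tolerance for r in radii):
        return False
    # Accumulate the signed angle swept around the center.
    angles = [math.atan2(y - cy, x - cx) for x, y in points]
    sweep = 0.0
    for a0, a1 in zip(angles, angles[1:]):
        d = a1 - a0
        # Unwrap across the -pi/pi boundary.
        if d > math.pi:
            d -= 2 * math.pi
        elif d < -math.pi:
            d += 2 * math.pi
        sweep += d
    return abs(sweep) >= min_sweep
```

Points sampled around a full circle pass; a straight swipe fails on the radial-deviation check.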
20100050133Compound Gesture Recognition - One embodiment of the invention includes a method for executing and interpreting gesture inputs in a gesture recognition interface system. The method includes detecting and translating a first sub-gesture into a first device input that defines a given reference associated with a portion of displayed visual content. The method also includes detecting and translating a second sub-gesture into a second device input that defines an execution command for the portion of the displayed visual content to which the given reference refers.02-25-2010
20120180001ELECTRONIC DEVICE AND METHOD OF CONTROLLING SAME - A method includes detecting a gesture associated with an edge of a display, determining an element associated with the edge, and opening the element.07-12-2012
20120180002NATURAL INPUT FOR SPREADSHEET ACTIONS - Different gestures and actions are used to interact with spreadsheets. The gestures are used in manipulating the spreadsheet and performing other actions in the spreadsheet. For example, gestures may be used to move within the spreadsheet, select data, filter, sort, drill down/up, zoom, split rows/columns, perform undo/redo actions, and the like. Sensors that are associated with a device may also be used in interacting with spreadsheets. For example, an accelerometer may be used for moving and performing operations within the spreadsheet.07-12-2012
20100275166USER ADAPTIVE GESTURE RECOGNITION METHOD AND USER ADAPTIVE GESTURE RECOGNITION SYSTEM - The present invention relates to a user adaptive gesture recognition method and a user adaptive gesture recognition system that, by using a terminal equipped with an acceleration sensor, can drive mobile application software in the terminal or can process a function of an application program for browsing to be displayed on the terminal based on acceleration information. Accordingly, the user gesture can be recognized and processed by using an acceleration sensor installed in a mobile apparatus. In addition, the user adaptive gesture can be stored in the mobile apparatus by using the acceleration sensor, and thus a mobile application can be easily utilized with a simple gesture.10-28-2010
20120180003IMAGE FORMING APPARATUS AND TERMINAL DEVICE EACH HAVING TOUCH PANEL - An MFP as an image forming apparatus includes a touch panel, a controller connected to the touch panel, a memory for storing a data file, and a communication device for communicating with another device. Continuously after two contacts are made on the touch panel, when a gesture of moving the two contacts in a direction in which the spacing therebetween decreases, and then releasing the two contacts after being moved, is detected, the controller identifies a file represented by an icon displayed in an area defined by the two contacts, at least either before or after being moved, as a file to be transferred, and transfers that file to the other device in response to a transfer request received from the other device by the communication device.07-12-2012
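The pinch-in detection this abstract relies on — two contact tracks whose spacing decreases before release — can be sketched as follows; the shrink threshold and function name are illustrative assumptions:

```python
import math

def is_pinch_in(track_a, track_b, min_shrink=0.5):
    """Sketch of pinch-in detection: two contact tracks (equal-length lists
    of (x, y) samples) count as a pinch-in when the spacing between them
    decreases to at most `min_shrink` of its initial value.
    """
    if not track_a or not track_b or len(track_a) != len(track_b):
        return False

    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    start = dist(track_a[0], track_b[0])
    end = dist(track_a[-1], track_b[-1])
    return start > 0 and end <= start * min_shrink
```

Two fingers converging from 10 units apart to 2 would register as a pinch-in; stationary contacts would not.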
20120124527PORTABLE ELECTRONIC DEVICE, AND CONTROL METHOD AND CONTROL PROGRAM FOR THE SAME - An object of the present invention is to provide a portable electronic device capable of performing character inputs of which usability is improved for users, and to provide a control method and a control program for the portable electronic device. Depending on a predetermined operation detected by a detecting unit, a character input control unit rotates a around area around a touched character “na”, and depending on rotation of the around area, the character input control unit cancels display of some characters of related characters based on a predetermined order, and displays new related characters related to the touched character “na” in the around area.05-17-2012
20120317522COMPUTING DEVICE AND METHOD FOR SELECTING DISPLAY REGIONS RESPONSIVE TO NON-DISCRETE DIRECTIONAL INPUT ACTIONS AND INTELLIGENT CONTENT ANALYSIS - A computing device includes a display surface, a human interface feature, and processing resources. The human interface feature enables a user of the computing device to enter a non-discrete directional input action. The processing resources execute to: (i) provide content on the display surface; (ii) detect the user performing the input action; (iii) determine a vector from the input action; and (iv) select a region of the display surface based on the vector.12-13-2012
20120317520APPARATUS AND METHOD FOR PROVIDING A DYNAMIC USER INTERFACE IN CONSIDERATION OF PHYSICAL CHARACTERISTICS OF A USER - Provided are an apparatus and method for providing a user interface and a terminal employing the same. The apparatus is capable of dynamically changing a graphical object according to the physical characteristics of a user or in an effort to prevent muscle stress to the user.12-13-2012
20120084738USER INTERFACE WITH STACKED APPLICATION MANAGEMENT - Methods and apparatus for controlling a computing device using gesture inputs. The gesture inputs may be operative to move screens corresponding to applications executing on the handheld computing device from one display to another. Additionally, a multi portion gesture may be used to target different screens. For example, a first portion of the gesture may maintain or “pin” a screen in a display such that a second portion of the gesture is operative to move a different screen behind the pinned application.04-05-2012
20120084737GESTURE CONTROLS FOR MULTI-SCREEN HIERARCHICAL APPLICATIONS - Control of a computing device using gesture inputs. The computing device may be a handheld computing device with a plurality of displays. The displays may be capable of displaying a graphical user interface (GUI). The plurality of displays may be modified in response to receipt of a gesture input such that a hierarchical application having related GUI screens is modified in response to the gesture input. The modification may include changing the hierarchical application from being displayed in a single screen mode to being displayed in a multi screen mode or vice versa.04-05-2012
20120084736GESTURE CONTROLLED SCREEN REPOSITIONING FOR ONE OR MORE DISPLAYS - Control of a computing device using gesture inputs. The computing device may be a handheld computing device with a plurality of displays. The displays may be capable of displaying a graphical user interface (GUI). The plurality of displays may be modified in response to receipt of a gesture input such that the displays are changed from a first state to a second state. The change of the displays from the first state to the second state may include moving a GUI from a first display to a second display. Additionally, a second GUI may be moved from the second display to the first display. The gesture input may comprise multiple touches, such as a pinch gesture.04-05-2012
20120084735GESTURE CONTROLS FOR MULTI-SCREEN USER INTERFACE - Method and apparatus for controlling a computing device using gesture inputs. The computing device may be a handheld computing device with multiple displays. The displays may be capable of displaying a graphical user interface (GUI). The GUI may be a multi screen GUI or a single screen GUI such that receipt of gesture inputs may result in the movement of a GUI from one display to another display or may result in maximization of a multi screen GUI across multiple displays.04-05-2012
20120084734MULTIPLE-ACCESS-LEVEL LOCK SCREEN - A multiple-access-level lock screen system allows different levels of functionality to be accessed on a computing device. For example, when a device is in a locked state, a user can select (e.g., by making one or more gestures on a touchscreen) a full-access lock screen pane and provide input that causes the device to be fully unlocked, or a user can select a partial-access lock screen pane and provide input that causes only certain resources (e.g., particular applications, attached devices, documents, etc.) to be accessible. Lock screen panes also can be selected (e.g., automatically) in response to events. For example, when a device is in a locked state, a messaging access lock screen pane can be selected automatically in response to an incoming message, and a user can provide input at the messaging access lock screen pane that causes only a messaging application to be accessible.04-05-2012
20120260220PORTABLE ELECTRONIC DEVICE HAVING GESTURE RECOGNITION AND A METHOD FOR CONTROLLING THE SAME - The present disclosure provides a portable electronic device having gesture recognition and a method for controlling the same. In accordance with one example embodiment, the method comprises: sensing distortion of the portable electronic device from a neutral state; determining an action associated with a sensed distortion; and causing the determined action to be performed.10-11-2012
20110004853Method for multiple touch modes, method for applying multi single-touch instruction and electronic device performing these methods - A method for multiple touch modes, a method for applying multi single-touch instruction, and an electronic device performing these methods are disclosed. The method for multiple touch modes comprises the following steps: receiving at least one instruction; determining whether the at least one instruction comprises a start instruction; if yes, determining whether the at least one instruction is a multi single-touch instruction; and if yes, performing a multi single-touch operation corresponding to the at least one instruction.01-06-2011
20110131537METHOD AND APPARATUS FOR PROVIDING USER INTERFACE OF PORTABLE DEVICE - A method includes displaying a user interface for displaying a graphic and a hidden graphic in a first area; displaying a set of contents corresponding to the graphic in a second area distinguishable from the first area; detecting a user's gesture for selecting a part of the first area; enlarging the first area to include a part of the second area; displaying a plurality of graphics including the graphic and the hidden graphic in the extended first area in response to the user's gesture; detecting a user's additional gesture for moving a first graphic among the plurality of graphics; and moving the first graphic to a part of the extended first area in response to the user's additional gesture, and moving a second graphic of the plurality of graphics to an area from which the first graphic is moved out.06-02-2011
20120324403METHOD OF INFERRING NAVIGATIONAL INTENT IN GESTURAL INPUT SYSTEMS - In a processing system having a touch screen display, a method of inferring navigational intent by a user in a gestural input system of the processing system is disclosed. A graphical user interface may receive current gestural input data for an application of the processing system from the touch screen display. The graphical user interface may generate an output action based at least in part on an analysis of one or more of the current gestural input data, past gestural input data for the application, and current and past context information of usage of the processing system. The graphical user interface may cause performance of the output action.12-20-2012
20110214094HUMAN-MACHINE INTERFACE - A human-machine interface includes a panel formed of energy transmissive material having a contact surface on which one or more contacts may simultaneously be made. An energy source directs energy to the panel. The panel transmits energy received from the energy source to the contact surface. At least one parameter of the energy transmitted by the panel is altered at regions where contacts are made with the contact surface. A detector is coupled to the panel and detects the at least one parameter of the energy generally over the area of the contact surface and outputs values corresponding thereto. A processor is in communication with the detector. The processor processes the output values to determine the locations of the contact regions on the contact surface and at least one other attribute associated with each of the contacts.09-01-2011
20110214093STORAGE MEDIUM STORING OBJECT CONTROLLING PROGRAM, OBJECT CONTROLLING APPARATUS AND OBJECT CONTROLLING METHOD - A game apparatus includes a first LCD and a second LCD, and on the second LCD, a touch panel is provided. On the first LCD, an enemy object is displayed. On the second LCD, a drawn object generated according to a touch operation is displayed. When the drawn object is generated, the drawn object is moved according to a moving course based on a locus of the touch operation. When the drawn object hits the enemy object, the enemy object is damaged.09-01-2011
20100251188Method of Determining Input Pattern and Computer Readable Storage Medium - A method of determining an input pattern is adapted to be implemented on an electronic apparatus equipped with a touch panel and includes the steps of: detecting a plurality of boundary points between an input pattern inputted through the touch panel and a circumscribed polygon of the input pattern, detecting an area ratio of a polygon defined by the boundary points to the circumscribed polygon, and determining the shape of the input pattern at least according to the area ratio. The present invention also provides a computer readable storage medium having a program stored therein. When executed, the program enables an electronic apparatus equipped with a touch panel to determine the shape and/or direction of an input pattern inputted through the touch panel.09-30-2010
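The area-ratio computation this abstract describes can be illustrated with the shoelace formula for polygon area. The classification thresholds and shape labels below are purely illustrative assumptions, not the patent's values:

```python
def polygon_area(vertices):
    """Magnitude of a polygon's signed area via the shoelace formula."""
    n = len(vertices)
    s = 0.0
    for i in range(n):
        x0, y0 = vertices[i]
        x1, y1 = vertices[(i + 1) % n]
        s += x0 * y1 - x1 * y0
    return abs(s) / 2.0


def classify_by_area_ratio(boundary_points, circumscribed_polygon):
    """Return an (assumed) shape label plus the ratio of the boundary-point
    polygon's area to the circumscribed polygon's area. A ratio near 1
    suggests the input fills its circumscribed polygon (rectangle-like);
    lower ratios suggest thinner shapes.
    """
    ratio = polygon_area(boundary_points) / polygon_area(circumscribed_polygon)
    if ratio > 0.9:
        return "rectangle-like", ratio
    if ratio > 0.45:
        return "triangle-like", ratio
    return "line-like", ratio
```

For example, a triangle occupies exactly half of its bounding rectangle, giving a ratio of 0.5.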
20120089952APPARATUS AND METHOD FOR ADAPTIVE GESTURE RECOGNITION IN PORTABLE TERMINAL - An apparatus and method for gesture recognition in a portable terminal. An operation of the portable terminal includes determining a user situation by using at least one situation information, determining a user's gesture by using at least one sensor measurement value, and performing a function corresponding to the user situation and the gesture.04-12-2012
20100229130Focal-Control User Interface - A user interface and techniques for manipulating a graphical representation via indirect manipulation of focal controls are described. Generally, the user interface includes a graphical representation (e.g., an image, video, application, browser, map, etc.), one or more visible or transparent focal controls, and gesture detection functionality to detect inputs from a user. The user may provide this input via a peripheral device (e.g., a mouse, keyboard, etc.), a touch-screen display, or in another suitable manner. In each instance, the user provides an input relative to the focal control and, in response to detecting the input, the gesture detection functionality manipulates the underlying graphical representation.09-09-2010
20100229129CREATING ORGANIZATIONAL CONTAINERS ON A GRAPHICAL USER INTERFACE - Embodiments related to the formation of an organizational container on a touch-sensitive graphical user interface are disclosed. One disclosed embodiment provides a method of forming an organizational container comprising receiving a touch gesture at the graphical user interface, the touch gesture defining a set of zero or more content items to be grouped together and further defining a region of the touch-sensitive graphical user interface. The method further comprises forming an organizational container responsive to receiving the touch gesture at the touch-sensitive graphical user interface, presenting a boundary defining the organizational container, moving the set of content items into the organizational container, and presenting the set of content items arranged within the boundary according to an organized view.09-09-2010
20120096411Navigation in a Display - An apparatus comprising a controller configured to switch from a continuous navigation mode to a discontinuous navigation mode in response to a predefined discontinuous navigation input, and to switch from a discontinuous navigation mode to a continuous navigation mode in response to a predefined continuous navigation input.04-19-2012
20110283241Touch Gesture Actions From A Device's Lock Screen - Embodiments enable a mobile device to execute an action analogous to a user-defined action in response to receipt of a gesture analogous to a user-defined gesture. In a first embodiment, a computer-implemented method executes an action on a mobile device. A lock screen view is displayed on the mobile device to prevent unauthorized and inadvertent access to the mobile device's data. While the mobile device is locked, a touch gesture having a pre-defined shape is detected on a touch screen of the mobile device independently of the initial position of the touch gesture on the touch screen. In response to detection of the touch gesture, a particular action is executed on the mobile device while the mobile device stays locked. The particular action is determined according to the pre-defined shape. In this way, detection of the touch gesture causes the particular action to execute while keeping the mobile device locked.11-17-2011
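Position-independent shape matching of the kind this abstract describes can be sketched by normalizing both the gesture and a stored template before comparing them. The normalization strategy, function names, and distance threshold below are assumptions for illustration only:

```python
def _normalize(points):
    """Translate so the first sample is the origin, then scale to a unit
    bounding box, making the match independent of where on the screen the
    gesture started and how large it was drawn.
    """
    x0, y0 = points[0]
    pts = [(x - x0, y - y0) for x, y in points]
    xs = [p[0] for p in pts]
    ys = [p[1] for p in pts]
    span = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    return [(x / span, y / span) for x, y in pts]


def matches_template(gesture, template, max_mean_dist=0.15):
    """Sketch of position-independent shape matching: both point sequences
    are normalized, then compared sample-by-sample (they are assumed to be
    resampled to the same length beforehand).
    """
    if len(gesture) != len(template):
        return False
    g, t = _normalize(gesture), _normalize(template)
    mean_dist = sum(
        ((gx - tx) ** 2 + (gy - ty) ** 2) ** 0.5
        for (gx, gy), (tx, ty) in zip(g, t)
    ) / len(g)
    return mean_dist <= max_mean_dist
```

The same square traced anywhere on the touch screen, at any size, matches its template; a straight swipe does not.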
20120331424ELECTRONIC DEVICE AND METHOD OF DISPLAYING INFORMATION IN RESPONSE TO INPUT - A method includes displaying, in a window or field, first information associated with a first source running on a portable electronic device and detecting an input to display second information associated with a second source. After the detecting, the second information associated with the second source is displayed along with the first information in the window or field.12-27-2012
20120102439SYSTEM AND METHOD OF MODIFYING THE DISPLAY CONTENT BASED ON SENSOR INPUT - A display system comprising: a display including a display screen configured to operate in at least a transparent display mode; an interaction sensing component for receiving sensed data regarding physical user interactions; and an interaction display control component, wherein responsive to the sensed data meeting predefined interaction criteria, content on the display screen is modified.04-26-2012
20120102438DISPLAY SYSTEM AND METHOD OF DISPLAYING BASED ON DEVICE INTERACTIONS - The present invention describes a display system capable of interacting with an interfacing device positioned behind a display screen. The display system includes a display, including a display screen that in one embodiment is transparent. The display system further includes: a viewpoint assessment component for determining a viewpoint of a user positioned in front of the display screen and an object tracking component for tracking the user manipulation of an object positioned behind the display screen. The display system includes an interaction tracking component. The interaction tracking component receives data regarding predefined interactions with the interfacing device. Responsive to the predefined interactions with the interfacing device, content on the display screen is modified.04-26-2012
20120102437Notification Group Touch Gesture Dismissal Techniques - In exemplary embodiments, multiple notifications can be displayed by a touch-screen of a computing device and dismissed as a group. For example, touch input sensed by the touch-screen can be used to select multiple notifications. The multiple notifications can then be dismissed using a dismissal gesture. In addition to the foregoing, other aspects are described in the detailed description, claims, and figures.04-26-2012
20120102436APPARATUS AND METHOD FOR USER INPUT FOR CONTROLLING DISPLAYED INFORMATION - In accordance with an example embodiment of the present invention, a method for proximity based input is provided, comprising: detecting presence of an object in close proximity to an input surface, detecting a displayed virtual layer currently associated with the object on the basis of distance of the object to the input surface, detecting a hovering input by the object, and causing a display operation to move at least a portion of the associated virtual layer in accordance with the detected hovering input.04-26-2012
20100199227IMAGE COLLAGE AUTHORING - A user interface that includes a catalog area, a collage mock-up area, and a mode select interface control operable to select an operational state of the user interface is displayed. Thumbnails of respective images are shown in the catalog area. A layout of a subset of the images is presented in the collage mock-up area. In response to the receipt of a user input gesture and a determination that the user interface is in a first operational state, a first action type is performed based on the type of the received user input gesture and the object type of the target object. In response to the receipt of the user input gesture and a determination that the user interface is in a second operational state, a second action type is performed based on the type of the received user input gesture and the object type of the target object.08-05-2010
20100199226Method and Apparatus for Determining Input Information from a Continuous Stroke Input - An apparatus, comprising a processor configured to receive a continuous stroke input related to a virtual keypad, determine a first input information based at least in part on said continuous stroke input, display a shape associated with said first input information, receive input associated with said shape, and determine a second input information based at least in part on said shape and said input associated with said shape is disclosed.08-05-2010
20100199232Wearable Gestural Interface - This invention may be implemented as a wearable apparatus comprised of a camera, a projector, a mirror, a microphone and a digital computer. The camera captures visual data. This data is analyzed by the digital computer to recognize objects and hand gestures, using color tracking and edge detection techniques. The projector is used, along with a mirror to adjust the direction of the projected light, to project images on objects in the user's environment. For example, the images may be projected on surfaces such as a wall, table, or piece of paper. The projected images may contain information relevant to the object being augmented. Indeed, the information may include current data obtained from the Internet. Also, the projected images may comprise graphical interfaces, with which a user may interact by making hand gestures.08-05-2010
20100199231PREDICTIVE DETERMINATION - Systems, methods and computer readable media are disclosed for a gesture recognizer system architecture. A recognizer engine is provided, which receives user motion data and provides that data to a plurality of filters. A filter corresponds to a gesture, that may then be tuned by an application receiving information from the gesture recognizer so that the specific parameters of the gesture—such as an arm acceleration for a throwing gesture—may be set on a per-application level, or multiple times within a single application. Each filter may output to an application using it a confidence level that the corresponding gesture occurred, as well as further details about the user motion data.08-05-2010
20100199230GESTURE RECOGNIZER SYSTEM ARCHITECTURE - Systems, methods and computer readable media are disclosed for a gesture recognizer system architecture. A recognizer engine is provided, which receives user motion data and provides that data to a plurality of filters. A filter corresponds to a gesture, that may then be tuned by an application receiving information from the gesture recognizer so that the specific parameters of the gesture—such as an arm acceleration for a throwing gesture—may be set on a per-application level, or multiple times within a single application. Each filter may output to an application using it a confidence level that the corresponding gesture occurred, as well as further details about the user motion data.08-05-2010
20100199229MAPPING A NATURAL INPUT DEVICE TO A LEGACY SYSTEM - Systems and methods for mapping natural input devices to legacy system inputs are disclosed. One example system may include a computing device having an algorithmic preprocessing module configured to receive input data containing a natural user input and to identify the natural user input in the input data. The computing device may further include a gesture module coupled to the algorithmic preprocessing module, the gesture module being configured to associate the natural user input to a gesture in a gesture library. The computing device may also include a mapping module to map the gesture to a legacy controller input, and to send the legacy controller input to a legacy system in response to the natural user input.08-05-2010
20100199228Gesture Keyboarding - Systems, methods and computer readable media are disclosed for gesture keyboarding. A user makes a gesture by either making a pose or moving in a pre-defined way that is captured by a depth camera. The depth information provided by the depth camera is parsed to determine at least that part of the user that is making the gesture. When parsed, the character or action signified by this gesture is identified.08-05-2010
20100146457Interactive Display - A pen-based calculator, wherein the user is presented with a screen and a gesture based input device by means of which a sum can be “handwritten” and displayed on the screen. The visibility on the screen of the system's status is provided through two types of feedback: annotation and morphing. As the user is writing the expression, the system can process in the background and, as a symbol is recognised, the user is made aware of this recognition by visual feedback: a typeset character stretched to the stroke's hull replaces the written strokes. Morphing formats the entered expression into a correctly typeset equation by moving the symbols as little as possible from the user's writing, and the morph provides the continuity between the user's input and the typeset equation that allows them to continue to edit and use it. Finally, the answer may be displayed at the correct location relative to the equation.06-10-2010
20100131904TILTABLE USER INTERFACE - A programmable effects system for graphical user interfaces is disclosed. One embodiment comprises adjusting a graphical user interface in response to a tilt of a device. In this way, a graphical user interface may have viewable content not shown in a first view, where the viewable content may be displayed in a tilted view in response to the device tilt.05-27-2010
20120151421ENHANCED DETECTION OF CIRCULAR ENGAGEMENT GESTURE - The enhanced detection of a circular engagement gesture, in which a shape is defined within motion data, and the motion data is sampled at points that are aligned with the defined shape. It is determined whether a moving object is performing a gesture correlating to the defined shape based on a pattern exhibited by the sampled motion data. An application is controlled if determining that the moving object is performing the gesture.06-14-2012
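The circular-engagement idea above (define a shape within motion data, sample points against it, and decide whether the moving object follows the shape) can be sketched with a simple radial test. The tolerance value and matching rule are assumptions, not the patent's actual criteria.

```python
import math

# Hedged sketch: a trajectory "performs" the circle gesture if every
# sampled point lies within a relative tolerance of the defined radius.
# The tolerance and pass/fail rule are invented for illustration.

def is_circular(points, cx, cy, radius, tol=0.2):
    for (x, y) in points:
        r = math.hypot(x - cx, y - cy)
        if abs(r - radius) > tol * radius:
            return False
    return True

# Sixteen samples around the unit circle, standing in for motion data.
circle = [(math.cos(t), math.sin(t))
          for t in (2 * math.pi * i / 16 for i in range(16))]
```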
20120151420Devices, Systems, and Methods for Conveying Gesture Commands - Devices, systems, and methods are disclosed which relate to conveying gestures associated with commands by displaying images that a user associates with a gesture. Upon performance of the gesture, the commands are carried out by a device, system, etc. For example, a mobile device displays a gesture icon of a hammer. The gesture icon is labeled with a command. When a user makes a downward motion with the forearm, the mobile device senses that gesture through a gesture sensor. The mobile device interprets the gesture and executes the command in the label of the gesture icon.06-14-2012
20110161892Display Interface and Method for Presenting Visual Feedback of a User Interaction - A display interface and method for presenting visual feedback of a user interaction with an image being presented via an electronic device are provided. The display interface includes a display adapted for visually presenting to the user at least a portion of an image. The display interface further includes a user input adapted for receiving a gesture including a user interaction with the electronic device. The display interface still further includes a controller. The controller is adapted for associating the gesture with a function, determining whether the associated function has reached a limit which would preclude execution of the associated function, and executing the function associated with the detected gesture. If the associated function has not reached the limit which would preclude execution of the associated function, then the controller is adapted for executing the function associated with the detected gesture. If the associated function has reached the limit which would preclude execution of the associated function, then the controller is adapted for producing an image distortion proximate the user interaction.06-30-2011
20110161890Using multi-modal input to control multiple objects on a display - Embodiments of the invention are generally directed to systems, methods, and machine-readable mediums for using multi-modal input to control multiple objects on a display. In one embodiment, a system may include several modal input devices. Each modal input device is capable of retrieving a stream of modal input data from a user. The system also includes modal interpretation logic that can interpret each of the retrieved modal input data streams into a corresponding set of actions. The system additionally includes modal pairing logic to assign each corresponding set of actions to control one of the displayed objects. Furthermore, the system has modal control logic which causes each displayed object to be controlled by its assigned set of actions.06-30-2011
20130024821METHOD AND APPARATUS FOR MOVING ITEMS USING TOUCHSCREEN - A method for moving or copying displayed items in response to at least one touch event on a touch screen is provided. The method comprises: detecting a user selection of one or more items; displaying an object corresponding to the one or more selected items at a predetermined location; moving the object to a point selected by the user; and moving the selected one or more items in association with the object. The object can be a convergence of a first object depicting a number representing a numerical count of the selected items, and a second object visually representing the items.01-24-2013
20130024820MOVING A GRAPHICAL SELECTOR - In general, this disclosure describes techniques for moving a graphical selector. In one example, a method includes activating, by a computing device, a graphical key that is displayed with a presence-sensitive interface of the computing device. Upon activation of the graphical key, the method also includes receiving gesture input corresponding to a directional gesture using the presence-sensitive interface of the computing device and moving a graphical selector displayed with the presence-sensitive interface from a first graphical location to a second graphical location by at least one selected increment based on a property of the gesture input.01-24-2013
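The selector-moving technique above (activate a graphical key, receive a directional gesture, move the selector by increments derived from a property of the gesture) can be sketched as below. The choice of gesture length as the governing property, and the increment size, are assumptions for illustration.

```python
# Illustrative sketch: move a selector from a first graphical location
# by an increment count derived from a property of the directional
# gesture (here, its length). All specifics are invented.

DIRECTIONS = {"left": (-1, 0), "right": (1, 0), "up": (0, -1), "down": (0, 1)}

def move_selector(position, direction, gesture_length, increment=10):
    # One selected increment per `increment` units of gesture length,
    # with a minimum of one step so every gesture moves the selector.
    steps = max(1, int(gesture_length // increment))
    dx, dy = DIRECTIONS[direction]
    x, y = position
    return (x + dx * steps, y + dy * steps)

new_pos = move_selector((5, 5), "right", 35)
```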
20130174100Device, Method, and Graphical User Interface for Configuring Restricted Interaction with a User Interface - An electronic device, while in an interaction configuration mode: displays a first user interface that includes a plurality of user interface objects; and, while displaying the first user interface, detects one or more gesture inputs on a touch-sensitive surface. For a respective gesture input, the device determines whether one or more user interface objects of the plurality of user interface objects correspond to the respective gesture input. The device visually distinguishes a first set of user interface objects in the plurality of user interface objects that correspond to the detected one or more gesture inputs from a second set of user interface objects in the plurality of user interface objects that do not correspond to the detected one or more gesture inputs. The device detects an input; and, in response to detecting the input, exits the interaction configuration mode and enters a restricted interaction mode.07-04-2013
20130174101ELECTRONIC APPARATUS AND METHOD OF CONTROLLING THE SAME - An electronic apparatus and a method of controlling the electronic apparatus are provided. The method includes: receiving a two hand start command which is to perform a motion task using two hands; if the two hand start command is input, changing a mode of the electronic apparatus to a two hand task mode which is to perform the motion task using the two hands; and if the mode of the electronic apparatus is changed to the two hand task mode, displaying a two hand input guide graphical user interface (GUI) which is to perform the motion task using the two hands. Therefore, a user can more intuitively and conveniently perform a function of the electronic apparatus, such as zoom-in/zoom-out, by using two hands.07-04-2013
20080229256Image processing apparatus, image processing method and computer readable medium - The image processing apparatus is provided with: a display that displays an object; a receiving device that receives a specified position specified by a user on the display; a holding device that holds a setting of a process for the object; a judgment device that judges whether or not the specified position received by the receiving device is included in the overlapping display areas of a plurality of objects; and an execution device that, when the judgment device judges that the specified position is included in the overlapping display areas of the plurality of objects, executes the process for selecting those objects whose display areas include the specified position, according to the setting held in the holding device.09-18-2008
20110246952ELECTRONIC DEVICE CAPABLE OF DEFINING TOUCH GESTURES AND METHOD THEREOF - An electronic device capable of defining touch gestures is provided. The electronic device includes a touch panel, a touch detection circuit, a processing unit, and a storage unit. The touch detection circuit produces touch signals in response to user touch operations on the touch panel. The processing unit includes a gesture recognition module and a defining module. The gesture recognition module receives the touch signals from the touch detection circuit, and recognizes touch gestures according to the touch signals. The defining module controls the electronic device to enter a defining mode in response to user operation, prompts the user to define a touch gesture for a function of the electronic device, establishes a map between the touch gesture input by the user and the function selected by the user, and stores the map in the storage unit.10-06-2011
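The defining module described above amounts to recording a gesture-to-function map in storage and consulting it at recognition time. A minimal sketch, assuming a gesture is reduced to a hashable stroke signature; the class and method names are invented.

```python
# Hypothetical sketch of a custom-gesture defining module: in a defining
# mode, tie a recorded touch-gesture signature to a chosen function and
# store the mapping for later recognition. Names are assumptions.

class GestureDefiner:
    def __init__(self):
        self.gesture_map = {}   # stands in for the device's storage unit

    def define(self, gesture_signature, function_name):
        # Called in defining mode after prompting the user.
        self.gesture_map[gesture_signature] = function_name

    def recognize(self, gesture_signature):
        # Called by the gesture recognition module on later touches.
        return self.gesture_map.get(gesture_signature)

definer = GestureDefiner()
definer.define(("down", "right"), "open_camera")
action = definer.recognize(("down", "right"))
```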
20110246951PORTABLE DEVICE AND UNLOCKING METHOD THEREOF - A portable device and an unlocking method store information groups, each information group including one primary key and at least one subordinate key. The portable device obtains the primary key and at least one subordinate key in one information group, and at least one subordinate key in another information group, and displays them as an unlocking image. The portable device further detects user selections and determines whether the user selections are the primary key and at least one subordinate key in the same information group. The portable device switches from a lock state to an unlock state when the user selections are the primary key and at least one subordinate key in the same information group.10-06-2011
20080222575Device Comprising a Detector for Detecting an Uninterrupted Looping Movement09-11-2008
20130179845METHOD AND APPARATUS FOR DISPLAYING KEYPAD IN TERMINAL HAVING TOUCH SCREEN - A method and an apparatus display a key pad and ease the trouble and difficulty a user has in selecting the key pad displayed in a terminal having a touch screen. The method detects a touch gesture with respect to the touch screen; determines whether the detected touch gesture is a zoom-out; displays thumbnails representing key pads, respectively, when the detected touch gesture is the zoom-out; and displays a key pad of a selected thumbnail when one of the displayed thumbnails is selected by a user.07-11-2013
20130179844Input Pointer Delay - Various embodiments enable repetitive gestures, such as multiple serial gestures, to be implemented efficiently so as to enhance the user experience. In at least some embodiments, a first gesture associated with an object is detected. The first gesture is associated with a first action. Responsive to detecting the first gesture, pre-processing associated with the first action is performed in the background. Responsive to detecting a second gesture associated with the object within a pre-defined time period, an action associated with the second gesture is performed. Responsive to the second gesture not being performed within the pre-defined time period, processing associated with the first action is completed.07-11-2013
20110271235Method for displaying a setting menu and corresponding device - The invention relates to a method for displaying a settings menu. In order to optimize the graphical representation of settings carried out by a spectator, the method comprises steps for: 11-03-2011
20130097566SYSTEM AND METHOD FOR DISPLAYING ITEMS ON ELECTRONIC DEVICES - A method, computer readable storage medium, and electronic device are provided which display items on such an electronic device by displaying a first set of items, receiving a first gesture on a touch-sensitive panel of the electronic device, and displaying a second set of items, the second set of items being a subset of the first set of items.04-18-2013
20130097565LEARNING VALIDATION USING GESTURE RECOGNITION - Embodiments are disclosed that relate to assessing a user's ability to recognize a target item by reacting to the target item and performing a target gesture. For example, one disclosed embodiment provides a method of assessing a user's ability to recognize a target item from a collection of learning items that includes the target item. The method includes providing to a display device the learning items in a sequence and receiving input from a sensor to recognize a user gesture made by the user. If the user gesture is received within a target timeframe corresponding to the target item, then the method includes determining whether the user gesture matches a target gesture. If the user gesture matches the target gesture, then the method includes providing to the display device a reward image for the user.04-18-2013
20110314425Air gesture recognition type electronic device operating method - An air gesture recognition type electronic device operating method for operating an electronic device having multiple sensors in each of multiple peripheral sides thereof by: approaching an object to the sensors to produce sensing signals and determining whether or not the object has been continuously sensed, and then determining whether or not the moving direction and moving speed of the object match a respective predetermined value, and then coupling and computing all received sensing signals to produce an operating parameter for running an air gesture application procedure. Thus, a user can operate the electronic device without direct contact or the use of any camera or input media, saving the hardware cost and enhancing the operational flexibility.12-22-2011
20130104089GESTURE-BASED METHODS FOR INTERACTING WITH INSTANT MESSAGING AND EVENT-BASED COMMUNICATION APPLICATIONS - Gesture-based methods of managing communications of a user participating in communication sessions permit the user to easily manage the communications sessions by defining gestures, defining a meaning of the gesture, and outputting the meaning of the gesture to a communication session when the gesture is detected. The gestures may be contextually dependent, such that a single gesture may generate different output, and may be unconventional to eliminate confusion during gesturing during the communication sessions, and thereby the communications sessions may be more effectively managed.04-25-2013
20130104090DEVICE AND METHOD FOR SELECTION OF OPTIONS BY MOTION GESTURES - A method for selection of an option on a device is provided where the device is enabled for option selection through motion gestures by a user. The method comprises providing at least one option for a first input request and announcing the first input request and at least one option of the first input request. A first motion gesture is detected, and the device determines whether the first motion gesture corresponds to a positive selection or a negative selection, wherein a control module of the device determines whether the first motion gesture meets a threshold for a positive gesture selection or a negative gesture selection. The device advances to a second option and announces the second option upon the determination of a negative selection as the first motion gesture. The selected option for the first input request is stored in a memory of the device after a positive selection.04-25-2013
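The option-selection scheme above (announce options in turn; a motion gesture that meets the positive threshold confirms the current option, while a negative gesture advances to the next) can be sketched with scalar gesture scores. The thresholds and the reduction of a motion gesture to a single score are assumptions.

```python
# Minimal sketch, assuming each motion gesture is reduced to a scalar
# score: the control module compares the score against thresholds for a
# positive or negative selection. Threshold values are invented.

def classify_gesture(score, pos_threshold=0.5, neg_threshold=-0.5):
    if score >= pos_threshold:
        return "positive"
    if score <= neg_threshold:
        return "negative"
    return "none"

def select_option(options, gesture_scores):
    # Walk the announced options, advancing on each negative gesture and
    # returning the option confirmed by the first positive gesture.
    i = 0
    for score in gesture_scores:
        result = classify_gesture(score)
        if result == "positive":
            return options[i]
        if result == "negative":
            i = (i + 1) % len(options)
    return None

chosen = select_option(["yes", "no", "maybe"], [-0.8, 0.1, 0.9])
```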
20130125069System and Method for Interactive Labeling of a Collection of Images - Various embodiments of a system and method for labeling images are described. An image labeling system may label multiple images, where each of the images may be labeled to identify image content or image elements, such as backgrounds or faces. The system may display some of the labeled image elements in different portions of a display area. Unlabeled image elements may be displayed in the same display area. The display size and position of each unlabeled image element may be dependent on similarities between the unlabeled image element and the displayed, labeled image elements. The system may also receive gesture input in order to determine a corresponding labeling task to perform.05-16-2013
20130125068Methods and Apparatus for Natural Media Painting Using a Realistic Brush and Tablet Stylus Gestures - Systems and methods for providing a natural media painting application may receive user inputs through tablet stylus gestures. A user interface may detect stylus gestures that mimic real-world actions of artists based on information collected during user manipulation of the stylus, and may map the gestures to various digital painting and image editing tasks that may be invoked and/or controlled using the gesture-based inputs. The collected information may include spatial and/or directional information, acceleration data, an initial and/or ending position of the stylus, an initial and/or ending orientation of the stylus, and/or pressure data. The stylus gestures may include translations, rotations, twisting motions, mashing gestures, or jerking motions. The application may perform appropriate painting and image editing actions in response to detecting and recognizing the stylus gestures, and the actions taken may be dependent on the work mode and/or context of the graphics application in which stylus gesture was performed.05-16-2013
20130145327Interfaces for Displaying an Intersection Space - User-submitted content (e.g., stories) may be associated with descriptive metadata, such as a timeframe, location, tags, and so on. The user-submitted content may be browed and/or searched using the descriptive metadata. Intersection criteria comprising a prevailing timeframe, a location, and/or other metadata criteria may be used to identify an intersection space comprising one or more stories. The stories may be ordered according to relative importance, which may be determined (at least in part) by comparing story metadata to the intersection criteria. Stories may be browsed in an intersection interface comprising a timeframe control. The intersection interface (and the timeframe control) may be configured to receive inputs in various forms including gesture input, movement input, orientation input, and so on.06-06-2013
20110219340SYSTEM AND METHOD FOR POINT, SELECT AND TRANSFER HAND GESTURE BASED USER INTERFACE - A system and method for a point, select and transfer hand gesture based user interface is disclosed. In one embodiment, a depth image of a hand gesture is captured using an in-front camera substantially on a frame by frame basis within a predefined interaction volume. Also, a nearest point of the hand gesture to a display screen of a display device is found using a substantially nearest depth value in the captured depth image for each frame. Further, an image-to-screen mapping of the captured depth image and the found nearest point to the display screen is performed upon validating the found nearest point as associated with the hand for each frame. Moreover, one of select options displayed on the display screen is pointed and selected when the substantially nearest depth value is within one or more predetermined threshold ranges, and based on the outcome of the image-to-screen mapping.09-08-2011
20100287513MULTI-DEVICE GESTURE INTERACTIVITY - A system is provided for enabling cross-device gesture-based interactivity. The system includes a first computing device with a first display operative to display an image item, and a second computing device with a second display. The second display is operative to display a corresponding representation of the image item in response to a gesture which is applied to one of the computing devices and spatially interpreted based on a relative position of the first computing device and the second computing device.11-11-2010
20110239166METHOD AND SYSTEM FOR CONTROLLING FUNCTIONS IN A MOBILE DEVICE BY MULTI-INPUTS - A method and system for providing control functions in a mobile device according to modes of inputs are provided. The method includes receiving a proximity signal via a sensing module, detecting a touch signal via a touch screen while the proximity signal is being retained, and executing a function set according to the input mode of the touch signal.09-29-2011
20130152024ELECTRONIC DEVICE AND PAGE ZOOMING METHOD THEREOF - A page zooming method for an electronic device having a touch screen and a storage unit is provided. The method includes the following steps: generating operation signals in response to a touch operation applied on a page displayed on the touch screen; determining that the touch operation is a zooming gesture if the touch operation comprises a press operation and a slide operation at the same time; determining the slide direction of the slide operation and determining the type of the zooming gesture according to the determined slide direction, the type of the zooming gesture comprising a zooming-in gesture and a zooming-out gesture; creating a zoomed page of the page displayed on the touch screen according to the type of the zooming gesture; and displaying the zoomed page on the touch screen. An electronic device using the page zooming method is also provided.06-13-2013
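The steps above reduce to two decisions: classify a press-plus-slide touch as a zooming gesture whose type follows the slide direction, then produce the zoomed page. A hedged sketch follows; the direction convention (slide up zooms in) and the zoom factor are invented, not taken from the patent.

```python
# Hedged sketch of the described page-zooming method: a touch operation
# combining a press and a simultaneous slide is a zooming gesture, and
# the slide direction selects zoom-in vs zoom-out. Conventions assumed.

def classify_zoom(has_press, slide_vector):
    if not has_press or slide_vector == (0, 0):
        return None                                # not a zooming gesture
    dx, dy = slide_vector
    return "zoom_in" if dy < 0 else "zoom_out"     # slide up zooms in

def zoomed_size(size, gesture, factor=1.25):
    # Create the "zoomed page" by scaling the page dimensions.
    w, h = size
    if gesture == "zoom_in":
        return (w * factor, h * factor)
    if gesture == "zoom_out":
        return (w / factor, h / factor)
    return size

gesture = classify_zoom(True, (0, -12))
page = zoomed_size((800, 600), gesture)
```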
20100299642Electronic Device with Sensing Assembly and Method for Detecting Basic Gestures - An electronic device and method are described for detecting a predefined gesture that is a specified pattern of movement of an external object relative to the electronic device. The method includes providing as part of the electronic device a sensing assembly including at least one photoreceiver and a plurality of phototransmitters, wherein each phototransmitter is positioned to emit infrared light away from the electronic device about a corresponding central transmission axis, wherein each central transmission axis is oriented in a different direction with respect to the others. The emission of infrared light by each of the phototransmitters is controlled during each of a plurality of sequential time periods, and wherein the external object moves in the specified pattern of movement during the plurality of sequential time periods. For each of the plurality of phototransmitters and for each of the plurality of sequential time periods, a corresponding measured signal indicative of a respective amount of infrared light which originated from that phototransmitter during that time period and was reflected by the external object prior to being received by the photoreceiver is generated, and the measured signals are evaluated to detect the predefined gesture.11-25-2010
20120284674TOUCH CONTROL METHOD AND APPARATUS - Embodiments of the present disclosure disclose a touch control method and an apparatus. The method includes: entering, when it is detected that a user triggers a function control, a function state corresponding to the function control; detecting a touch control operation performed by the user on an operation object on a touch control panel; under the function state corresponding to the function control, performing corresponding processing on the operation object according to the touch control operation of the user.11-08-2012
20120284673METHOD AND APPARATUS FOR PROVIDING QUICK ACCESS TO DEVICE FUNCTIONALITY - A method for providing quick access to device functionality responsive to a touch gesture may include receiving an indication of a swipe gesture being performed from a first predefined portion of a display to a second predefined portion of a touch screen display, classifying the swipe gesture as a trigger gesture based on insertion of a motion delay of at least a threshold period of time in connection with the swipe gesture, and causing, in response to classifying the trigger gesture, a display of a predefined set of functional elements that cause execution of a corresponding function when a respective one of the predefined set of functional elements is selected. A corresponding apparatus and computer program product are also provided.11-08-2012
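The classification rule above (an edge-to-edge swipe counts as a trigger gesture only if it includes a motion delay of at least a threshold duration) can be sketched over timestamped touch samples. The threshold value and the definition of a "delay" as a run of stationary samples are assumptions.

```python
# Sketch of trigger-gesture classification: a swipe qualifies only if it
# contains a motion delay (a stationary run) of at least a threshold
# duration. Times are in seconds; the threshold value is invented.

DELAY_THRESHOLD = 0.3

def is_trigger(swipe_samples, threshold=DELAY_THRESHOLD):
    # swipe_samples: list of (timestamp, position). A "delay" is a run
    # of consecutive samples at the same position.
    delay = 0.0
    for (t0, p0), (t1, p1) in zip(swipe_samples, swipe_samples[1:]):
        if p0 == p1:
            delay += t1 - t0
            if delay >= threshold:
                return True
        else:
            delay = 0.0
    return False

trigger = is_trigger([(0.0, 0), (0.1, 5), (0.2, 5), (0.6, 5), (0.7, 9)])
```

A swipe with no pause is treated as an ordinary gesture, which matches the abstract's use of the delay to disambiguate the trigger gesture from everyday swipes.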
20130159938GESTURE INFERRED VOCABULARY BINDINGS - The subject disclosure relates to annotating data based on gestures. Gestures include user interaction with a client device or client software. Gestures are tracked and associated with data. In an aspect, client context associated with a gesture is also tracked. The gestures are then employed to determine a global term to associate with the data. In an aspect, a look-up table comprising a pre-defined relationship between gestures and a global term can be employed. In another aspect, an inference component employs context information in conjunction with the tracked gestures to determine a global term to assign to data. After a global term is determined for data based on a gesture, an annotation file for the data can be created associating the data with the global term.06-20-2013
20130159939AUTHENTICATED GESTURE RECOGNITION - Methods, apparatuses, systems, and computer-readable media for performing authenticated gesture recognition are presented. According to one or more aspects, a gesture performed by a user may be detected. An identity of the user may be determined based on sensor input captured substantially contemporaneously with the detected gesture. Then, it may be determined, based on the identity of the user, that the detected gesture corresponds to at least one command of a plurality of commands. Subsequently, the at least one command may be executed. In some arrangements, the gesture may correspond to a first command when performed by a first user, and the same gesture may correspond to a second command different from the first command when performed by a second user different from the first user.06-20-2013
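The key point of the abstract above is that command resolution is keyed on the pair (user identity, gesture), so the same gesture executes different commands for different users. A minimal sketch, with an invented command table standing in for the device's configuration:

```python
# Minimal sketch of authenticated gesture recognition: the command
# executed for a gesture depends on the identity determined from sensor
# input, so one gesture maps to different commands per user. The table
# contents are assumptions for illustration.

COMMAND_TABLE = {
    ("alice", "swipe_up"): "open_mail",
    ("bob", "swipe_up"): "open_music",
}

def resolve_command(user_id, gesture):
    # Look up the command for this (identity, gesture) pair; None means
    # the gesture carries no command for that user.
    return COMMAND_TABLE.get((user_id, gesture))

alice_cmd = resolve_command("alice", "swipe_up")
bob_cmd = resolve_command("bob", "swipe_up")
```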
20130159942INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND PROGRAM - An information processing apparatus includes a proximity panel, a communication module and a controller. The proximity panel receives first gesture information. The first gesture information is received in response to a trajectory of movement on the proximity panel. The communication module receives second gesture information from a computing device. The controller determines whether the first gesture information and the second gesture information correspond to predetermined gesture information. In the event that the controller determines that the first gesture information and the second gesture information correspond to the predetermined gesture information, the controller causes predetermined data to be communicated with the computing device.06-20-2013
20130159940Gesture-Controlled Interactive Information Board - A method of controlling an information board comprises the steps of sensing a gesture using a gesture capturing controller and determining a type of command from the gesture expression provided by the gesture capturing controller, where the type of command is, for example, a navigation request. Depending on the determined type of gesture, user interface elements of a spatial configuration are displayed.06-20-2013
20130185680Unlocking a Device by Performing Gestures on an Unlock Image - A device with a touch-sensitive display may be unlocked via gestures performed on the touch-sensitive display. The device is unlocked if contact with the display corresponds to a predefined gesture for unlocking the device. The device displays one or more unlock images with respect to which the predefined gesture is to be performed in order to unlock the device. The performance of the predefined gesture with respect to the unlock image may include moving the unlock image to a predefined location and/or moving the unlock image along a predefined path. The device may also display visual cues of the predefined gesture on the touch screen to remind a user of the gesture.07-18-2013
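The unlock check described above (the device unlocks when contact drags the unlock image to a predefined location) can be sketched as a distance test against a target position. The target coordinates and tolerance are invented; a path-following variant would additionally check intermediate positions.

```python
# Hedged sketch of slide-to-unlock: the device unlocks if the drag ends
# within a tolerance of the unlock image's predefined target location.
# Target and tolerance values are assumptions.

def unlocked(drag_end, target=(300, 50), tolerance=20):
    dx = drag_end[0] - target[0]
    dy = drag_end[1] - target[1]
    # Compare squared distances to avoid a square root.
    return dx * dx + dy * dy <= tolerance * tolerance

ok = unlocked((310, 55))
```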
20110307840ERASE, CIRCLE, PRIORITIZE AND APPLICATION TRAY GESTURES - Techniques involving gestures and other functionality are described. In one or more implementations, the techniques describe gestures that are usable to provide inputs to a computing device. A variety of different gestures are contemplated, including an activate gesture, a fill gesture, a level gesture, a jump gesture, a checkmark gesture, a strikethrough gesture, an erase gesture, a circle gesture, a prioritize gesture, and an application tray gesture.12-15-2011
20110314430APPLICATION PROGRAMMING INTERFACES FOR GESTURE OPERATIONS - At least certain embodiments of the present disclosure include an environment with user interface software interacting with a software application to provide gesture operations for a display of a device. A method for operating through an application programming interface (API) in this environment includes transferring a scaling transform call. The gesture operations include performing a scaling transform such as a zoom in or zoom out in response to a user input having two or more input points. The gesture operations also include performing a rotation transform to rotate an image or view in response to a user input having two or more input points.12-22-2011
20110314427PERSONALIZATION USING CUSTOM GESTURES - A method and apparatus allow users of touchscreen-based devices to create custom gestures on the touchscreen that are associated with behaviors and recognized throughout the operation of the device. The method and apparatus include sensing a user interaction on a touchscreen and detecting whether the sensed user interaction is a custom gesture stored in a behavior repository, the custom gesture being a user-defined interaction on the touchscreen. A gesture processor determines a behavior that is associated with the custom gesture. A personality adapter selects an appropriate operation from a set of operations associated with the behavior based on policies for the behavior, and a main processor executes the appropriate operation.12-22-2011
20130191790INTELLIGENT GESTURE-BASED USER'S INSTANTANEOUS INTERACTION AND TASK REQUIREMENTS RECOGNITION SYSTEM AND METHOD - Methods and apparatus for determining an intended gesture-based input command from an incomplete gesture-based input command that is supplied to a gesture-based touch screen display that includes at least a touch sensitive region includes receiving an incomplete gesture-based input command on the touch sensitive region of the gesture-based touch screen device, the incomplete gesture-based input command including a gesture profile and a gesture direction. Gesture signals that include data representative of the gesture profile and the gesture direction are generated in response to the input command. The gesture signals are processed in a processor to predict the intended gesture-based input command. The intended gesture-based command is retrieved, with the processor, from an electronically stored standard gesture library.07-25-2013
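The prediction step above (complete an incomplete gesture-based command by matching its profile and direction against a stored standard gesture library) can be sketched as a prefix match over stroke sequences. The library entries and the uniqueness rule are assumptions, not the patent's actual matching method.

```python
# Hedged sketch: match an incomplete gesture's stroke-direction prefix
# against a standard gesture library to predict the intended command.
# Library contents and matching rule are invented.

GESTURE_LIBRARY = {
    ("down", "right", "up"): "checkmark",
    ("down", "down", "down"): "scroll_down",
}

def predict(partial_strokes):
    # Return the unique library gesture whose stroke sequence begins
    # with the observed partial strokes; None if ambiguous or unknown.
    matches = [cmd for strokes, cmd in GESTURE_LIBRARY.items()
               if strokes[:len(partial_strokes)] == tuple(partial_strokes)]
    return matches[0] if len(matches) == 1 else None

guess = predict(["down", "right"])
```

Returning `None` on an ambiguous prefix (here, a lone "down" stroke matches both entries) is one plausible policy; a real system might instead wait for more of the gesture.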
20130191791ELECTRONIC DEVICE AND METHOD OF CONTROLLING A DISPLAY - A method includes entering, by a portable electronic device, a low-power condition and while in the low-power condition, detecting an input. In response to detecting the input, a cover image is displayed. An application image is progressively revealed along with movement of a gesture while reducing display of the cover image.07-25-2013
20130191789CONTROLLING A TRANSACTION WITH COMMAND GESTURES - Embodiments of the invention include systems, methods, and computer-program products that provide for a unique system for controlling transactions based on command gestures. In one embodiment of the invention, a computer-implemented method determines that a user is conducting a transaction with a mobile device. The mobile device senses a gesture performed by the user with the mobile device and alters at least one aspect of the transaction based on the gesture. The gestures can control a wide variety of aspects of the transaction. For example, the user may flag an item for review during the transaction, silence the transaction, receive a subtotal for the transaction, select a payment method, or complete the transaction. In an embodiment, the user is able to customize the gestures to control the transaction according to the user's preferences.07-25-2013
20120023462SKIPPING THROUGH ELECTRONIC CONTENT ON AN ELECTRONIC DEVICE - Embodiments of the present invention disclose a method for skipping through electronic content displayed on an electronic device having a touchscreen display coupled to a processing engine. According to one embodiment, a multi-touch gesture is received from a user. Based on the user's multi-touch gesture, electronic content associated with digital media immediately advances to a subsequent section or immediately reverses back to a previous section of the digital media.01-26-2012
20120023460APPLICATION PROGRAMMING INTERFACES FOR GESTURE OPERATIONS - At least certain embodiments of the present disclosure include an environment with user interface software interacting with a software application to provide gesture operations for a display of a device. A method for operating through an application programming interface (API) in this environment includes transferring a scaling transform call. The gesture operations include performing a scaling transform such as a zoom in or zoom out in response to a user input having two or more input points. The gesture operations also include performing a rotation transform to rotate an image or view in response to a user input having two or more input points.01-26-2012
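A pinch-to-zoom scaling transform of the kind this abstract describes is typically derived from the change in separation between the two input points; a minimal sketch (the helper names are illustrative, not the disclosed API):

```python
import math

def separation(p, q):
    """Distance between two touch points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def scale_factor(start_points, end_points):
    """Zoom factor: ratio of finger separation after vs. before the gesture."""
    return separation(*end_points) / separation(*start_points)

# Fingers move from 2 px apart to 4 px apart -> zoom in by 2x
assert scale_factor(((0, 0), (0, 2)), ((0, 0), (0, 4))) == 2.0
```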
20120023458Unlocking a Device by Performing Gestures on an Unlock Image - A device with a touch-sensitive display may be unlocked via gestures performed on the touch-sensitive display. The device is unlocked if contact with the display corresponds to a predefined gesture for unlocking the device. The device displays one or more unlock images with respect to which the predefined gesture is to be performed in order to unlock the device. The performance of the predefined gesture with respect to the unlock image may include moving the unlock image to a predefined location and/or moving the unlock image along a predefined path. The device may also display visual cues of the predefined gesture on the touch screen to remind a user of the gesture.01-26-2012
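Checking whether the unlock image was moved to a predefined location usually reduces to a distance test against a tolerance; a minimal sketch with hypothetical coordinates and tolerance:

```python
def is_unlocked(final_pos, target_pos, tolerance=10.0):
    """True when the unlock image ends within `tolerance` pixels of the target."""
    dx = final_pos[0] - target_pos[0]
    dy = final_pos[1] - target_pos[1]
    return (dx * dx + dy * dy) ** 0.5 <= tolerance

assert is_unlocked((298, 102), (300, 100))      # close enough: unlock
assert not is_unlocked((100, 100), (300, 100))  # gesture fell short: stay locked
```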
20120030637QUALIFIED COMMAND - A method for executing a qualified command includes detecting a hand gesture input that identifies a command, detecting one or more non-hand gesture inputs that qualify the command, and configuring a processor to execute the qualified command on a machine in response to the hand gesture input and the one or more non-hand gesture inputs.02-02-2012
20120030636INFORMATION PROCESSING APPARATUS, DISPLAY CONTROL METHOD, AND DISPLAY CONTROL PROGRAM - An information processing apparatus includes: an operation unit; and a control unit that performs a process corresponding to dragging and displays, on a display unit, a cursor elongating from the start point of the dragging to its end point, where at least one of the size and the shape of the cursor differs between the end portion on the start-point side and the end portion on the end-point side, when the dragging is executed through the operation unit.02-02-2012
20120030635INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD AND INFORMATION PROCESSING PROGRAM - An information processing apparatus includes: an operation section; and a control section adapted to execute a process in response to dragging through the operation section; the control section changing, when dragging is carried out continuously after a particular operation by the operation section, a process to be executed in response to the dragging based on the particular operation.02-02-2012
20120030634INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND INFORMATION PROCESSING PROGRAM - An information processing device includes an operating unit, and a control unit that, when dragging is performed to change a predetermined value via the operating unit, switches between changing the value by an amount equivalent to the amount of dragging and setting a change speed according to the dragging and continuing to change the value at that speed, depending on the position of the ending point relative to the starting point of the dragging.02-02-2012
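The two drag modes in this abstract (value follows the drag amount, vs. value changes continuously at a drag-determined speed) can be sketched as follows; the threshold and rate parameters are illustrative, not values from the application:

```python
def drag_update(value, drag_px, dt, threshold=100, rate_per_px=0.5):
    """Short drags change the value by the drag amount; drags whose ending
    point lies beyond `threshold` pixels of the start switch to continuous
    change at a speed proportional to the drag distance."""
    if abs(drag_px) <= threshold:
        return value + drag_px
    return value + rate_per_px * drag_px * dt

assert drag_update(0, 50, dt=1.0) == 50       # direct mode: value tracks the drag
assert drag_update(0, 200, dt=2.0) == 200.0   # rate mode: 0.5 * 200 px * 2 s
```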
20120030633DISPLAY SCENE CREATION SYSTEM - Provided is a display scene creation system that can cause a display scene to make a transition when a gesture is input to a touch panel, without a processing program that relates the touch panel to the gesture. A design of a display scene is set. One or more display components to be displayed in the set design of the display scene are set. A gesture with which the display scene makes a transition when the gesture is input to the set display components is set. A transition display scene table is provided that stores the set gesture and a post-transition display scene where the gesture and the post-transition display scene are associated with each other.02-02-2012
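The transition display scene table described above can be modeled as a mapping from (display component, gesture) pairs to post-transition scenes; a minimal sketch with hypothetical component, gesture, and scene names:

```python
# Hypothetical transition table associating (display component, gesture)
# pairs with post-transition display scenes.
TRANSITIONS = {
    ("thumbnail", "swipe_left"): "next_photo",
    ("thumbnail", "pinch_out"): "fullscreen",
}

def next_scene(component, gesture, current_scene):
    """Look up the post-transition scene; stay on the current scene if unmapped."""
    return TRANSITIONS.get((component, gesture), current_scene)

assert next_scene("thumbnail", "pinch_out", "gallery") == "fullscreen"
assert next_scene("thumbnail", "double_tap", "gallery") == "gallery"
```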
20120030632SYSTEM, METHOD AND APPARATUS FOR CONTROLLING PRESENTATION OF CONTENT - An application for a system that enables cooperating devices to transfer presentation of content from one device to another by sending either the content or an identification of the content from a source device to a destination device. In some embodiments, the actual content is transferred, while in other embodiments an identification of the content and a position within the content is transferred from the source device to the destination device.02-02-2012
20120066650Electronic Device and Method for Evaluating the Strength of a Gestural Password - An electronic device includes a movement sensing assembly for providing signals indicative of movement of an object with respect to the electronic device, wherein the movement includes a sequence of gestures making up a proposed gestural password. A processor in electronic communication with the movement sensing assembly is operable to receive and evaluate the signals to compute a password strength metric indicative of a strength of the proposed gestural password, and a user output component receives and displays an acceptability of the password strength metric.03-15-2012
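One simple way to compute a strength metric for a gestural password, in the spirit of this abstract, is an entropy-style estimate over the gesture sequence; the formula and alphabet size below are illustrative only, not the patented metric:

```python
import math

def gestural_password_strength(gestures, alphabet_size=8):
    """Entropy-style estimate in bits: sequence length times bits per gesture,
    scaled down when gestures repeat."""
    if not gestures:
        return 0.0
    unique_ratio = len(set(gestures)) / len(gestures)
    return len(gestures) * math.log2(alphabet_size) * unique_ratio

assert gestural_password_strength([]) == 0.0
assert gestural_password_strength(["up", "down", "left"]) == 9.0
# Repetition weakens the password: 3 gestures, only 1 distinct
assert gestural_password_strength(["up", "up", "up"]) == 3.0
```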
20130205262METHOD AND APPARATUS FOR ADJUSTING A PARAMETER - A method is disclosed comprising receiving an indication of a continuous stroke input, setting an adjustment magnitude based on a predetermined adjustment magnitude, determining that the continuous stroke input comprises a magnitude adjustment input, adjusting the adjustment magnitude based on the magnitude adjustment input, determining that the continuous stroke input comprises a first adjustment input, and adjusting a parameter based on the adjustment magnitude and the first adjustment input.08-08-2013
20120084739FOCUS CHANGE UPON USE OF GESTURE TO MOVE IMAGE - Embodiments are described for handling focus when a gesture is input in a multi-screen device. In embodiments, the gesture indicates a request to move an image displayed on the multi-screen device. In response, the image is moved and the focus is placed on the moved image.04-05-2012
20130091474METHOD AND ELECTRONIC DEVICE CAPABLE OF SEARCHING AND DISPLAYING SELECTED TEXT - An electronic device includes a storage unit, a touch display unit and a central processing unit. The central processing unit includes a control module, a searching module, and a split-screen module. The control module generates a first window on the touch display unit to display a text document when the text document is opened, and determines the text selected by a user in the displayed document according to the touch positions when the touch display unit is touched. The searching module searches for occurrences of the selected text in the text document, and the control module stores the search results in the storage unit. The split-screen module displays each occurrence of the selected text in a second window that it produces, with a size smaller than that of the first window. A related method is also provided.04-11-2013
20130091473CHANGING DISPLAY BETWEEN GRID AND FORM VIEWS - Data can be displayed in a display in a first orientation. The display can include a grid view of the data. A user input can be received, where the user input directs a change of orientation of the display from the first orientation to a second orientation. For example, the user input can include rotating a display device. In response to the user input, the orientation of the display can be changed from the first orientation to the second orientation, and the grid view can be changed to a form view of the data. Also, in response to another user input such as rotating the display device, the orientation can be changed from the second orientation to the first orientation, and the display can be changed from the form view to the grid view.04-11-2013
20130212541METHOD, A DEVICE AND A SYSTEM FOR RECEIVING USER INPUT - The invention relates to a method, a device and a system for receiving user input. User interface events are first formed from low-level events generated by a user interface input device such as a touch screen. The user interface events are modified by forming information on a modifier for the user interface events, such as time and coordinate information. The events and their modifiers are sent to a gesture recognition engine, where gesture information is formed from the user interface events and their modifiers. The gesture information is then used as user input to the apparatus. In other words, the gestures may not be formed directly from the low-level events of the input device. Instead, user interface events are formed from the low-level events, and gestures are then recognized from these user interface events.08-15-2013
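The two-stage pipeline this abstract describes (low-level events, then modifier-annotated user interface events, then gesture recognition) can be sketched minimally; the swipe classifier and its travel threshold are illustrative, not the claimed engine:

```python
def to_ui_events(raw_events):
    """Annotate low-level input events with modifiers (coordinates and time)."""
    return [{"type": t, "x": x, "y": y, "t": ts} for (t, x, y, ts) in raw_events]

def recognize(ui_events, min_travel=50):
    """Tiny gesture engine: classify a horizontal swipe from UI events."""
    if len(ui_events) < 2:
        return None
    dx = ui_events[-1]["x"] - ui_events[0]["x"]
    if dx > min_travel:
        return "swipe_right"
    if dx < -min_travel:
        return "swipe_left"
    return None

events = to_ui_events([("down", 0, 0, 0.0), ("up", 120, 5, 0.2)])
assert recognize(events) == "swipe_right"
```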

Patent applications in class Gesture-based