Sarrett
James Sarrett, Sunland, CA US
Patent application number | Description | Published |
---|---|---|
20130268900 | TOUCH SENSOR GESTURE RECOGNITION FOR OPERATION OF MOBILE DEVICES - An embodiment of a mobile device includes a touch sensor with multiple sensor elements for detecting gestures, and a processor that interprets the detected gestures. In one embodiment, the device divides the sensor elements into multiple zones and interprets a gesture based at least in part on which zone detects it. In another, the processor identifies one or more dominant actions for the active application or a function of that application, chooses a gesture identification algorithm from a plurality of gesture recognition algorithms based at least in part on the identified dominant actions, and determines the user's first intended action by interpreting a first gesture with the chosen algorithm. In a third, a mapping between touch sensor data and actual positions of user gestures, generated by an artificial neural network, is used at least in part by the processor to interpret the gestures. | 10-10-2013 |
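The zone-based embodiment above can be sketched in a few lines. Everything here is an illustrative assumption: the abstract does not specify how many sensor elements or zones exist, nor which actions a zone maps to, so the 12-element, three-zone layout and the per-zone action names below are hypothetical.

```python
# Hypothetical sketch of zone-based gesture interpretation: the sensor
# elements are partitioned into zones, and the same gesture maps to a
# different action depending on which zone detected it.
from dataclasses import dataclass

@dataclass
class Touch:
    element_id: int  # index of the sensor element that fired

# Assumed layout: 12 sensor elements split evenly into three zones.
ZONES = {
    "top":    range(0, 4),
    "middle": range(4, 8),
    "bottom": range(8, 12),
}

# Assumed per-zone meaning for a detected gesture.
ZONE_ACTIONS = {
    "top":    "scroll",
    "middle": "select",
    "bottom": "switch_app",
}

def zone_of(touch: Touch) -> str:
    """Find which zone contains the sensor element that detected the touch."""
    for name, elements in ZONES.items():
        if touch.element_id in elements:
            return name
    raise ValueError(f"unknown sensor element {touch.element_id}")

def interpret(touch: Touch) -> str:
    """Interpret a gesture based at least in part on which zone detects it."""
    return ZONE_ACTIONS[zone_of(touch)]
```

In practice the zone boundaries would come from the device's sensor geometry rather than a fixed table, but the lookup structure is the same.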
James W. Sarrett, Los Angeles, CA US
Patent application number | Description | Published |
---|---|---|
20150336502 | COMMUNICATION BETWEEN AUTONOMOUS VEHICLE AND EXTERNAL OBSERVERS - At least one embodiment of this disclosure includes a method for an autonomous vehicle (e.g., a fully autonomous or semi-autonomous vehicle) to communicate with external observers. The method includes: receiving a task at the autonomous vehicle; collecting data that characterizes a surrounding environment of the autonomous vehicle from a sensor coupled to the autonomous vehicle; determining an intended course of action for the autonomous vehicle to undertake based on the task and the collected data; and conveying a human-understandable output via an output device, the human-understandable output expressly or implicitly indicating the intended course of action to an external observer. | 11-26-2015 |
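The four method steps in that abstract (receive a task, collect sensor data, determine an intended course of action, convey a human-understandable output) can be sketched as a pipeline. The function names, the single-flag sensor data, and the rule for choosing an action are all simplifying assumptions for illustration, not the patent's actual logic.

```python
# Illustrative sketch of the claimed method steps; the decision rule and
# data shapes are assumptions, not taken from the application.
def plan_action(task: str, sensor_data: dict) -> str:
    """Determine an intended course of action from the task and collected data."""
    if sensor_data.get("pedestrian_ahead"):
        return "yielding to pedestrian"
    return f"proceeding with task: {task}"

def convey_intent(action: str) -> str:
    """Produce a human-understandable output for external observers,
    e.g. text on an external display panel."""
    return f"[DISPLAY] Vehicle is {action}"

# Receive a task, collect (mock) environment data, plan, then convey.
message = convey_intent(plan_action("cross intersection",
                                    {"pedestrian_ahead": True}))
```

The abstract leaves the output device open ("expressly or implicitly"), so the same `plan_action` result could just as well drive lights or sounds instead of text.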
Peter Sarrett, Redmond, WA US
Patent application number | Description | Published |
---|---|---|
20100281438 | ALTERING A VIEW PERSPECTIVE WITHIN A DISPLAY ENVIRONMENT - Disclosed herein are systems and methods for altering a view perspective within a display environment. For example, gesture data corresponding to a plurality of inputs may be stored, where the inputs are provided to a game or application implemented by a computing device. Images of a user of the game or application may be captured; for example, a suitable capture device may capture several images of the user over a period of time. The images may be analyzed and processed to detect a user's gesture, and aspects of the gesture may be compared to the stored gesture data to determine the user's intended gesture input. The comparison may be part of an analysis for determining inputs corresponding to the gesture data, where one or more of the inputs are provided to the game or application and cause a view perspective within the display environment to be altered. | 11-04-2010 |
20120155705 | FIRST PERSON SHOOTER CONTROL WITH VIRTUAL SKELETON - A virtual skeleton includes a plurality of joints and provides a machine-readable representation of a human target observed with a three-dimensional depth camera. A relative position of a hand joint of the virtual skeleton is translated as a gestured aiming vector control, and a virtual weapon is aimed in proportion to the gestured aiming vector control. | 06-21-2012 |
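The aiming-vector translation in application 20120155705 reduces to simple vector arithmetic: take the hand joint's position relative to a reference joint, then aim in proportion to that vector. The choice of the shoulder as the reference joint, the yaw/pitch decomposition, and the `sensitivity` constant are assumptions for this sketch; the abstract says only "relative position" and "in proportion".

```python
# Sketch: translate a hand-joint position into a gestured aiming vector,
# then aim a virtual weapon in proportion to it (yaw/pitch in radians).
import math

def aiming_vector(shoulder, hand):
    """Relative position of the hand joint, used as the aiming vector.
    Coordinates are assumed to be (x right, y up, z forward)."""
    return tuple(h - s for h, s in zip(hand, shoulder))

def aim_weapon(vector, sensitivity=1.0):
    """Aim in proportion to the gestured aiming vector."""
    dx, dy, dz = vector
    yaw = math.atan2(dx, dz) * sensitivity
    pitch = math.atan2(dy, math.hypot(dx, dz)) * sensitivity
    return yaw, pitch

# Hand one unit right of and one unit above the shoulder, level in depth:
yaw, pitch = aim_weapon(aiming_vector((0.0, 0.0, 0.0), (1.0, 1.0, 0.0)))
```

A real pipeline would read the joint positions from the depth camera's skeleton-tracking output each frame rather than from literals.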
Peter Glenn Sarrett, Redmond, WA US
Patent application number | Description | Published |
---|---|---|
20100093435 | VIRTUAL SPACE MAPPING OF A VARIABLE ACTIVITY REGION - An electronic game system and a method of its operation are provided for virtual space mapping of a variable activity region in physical space. A calibration input may be received from a positioning device of a game controller that indicates waypoints that define an activity region in physical space. A scale factor may be identified between the activity region and an interactive game region in virtual space based on the calibration input. Positioning information may be received from the positioning device that indicates a position of the positioning device within the activity region. The position of the positioning device within the activity region may be mapped to a corresponding virtual position within the interactive game region based on the scale factor. | 04-15-2010 |
20120306924 | CONTROLLING OBJECTS IN A VIRTUAL ENVIRONMENT - Methods, systems, and computer-storage media having computer-usable instructions embodied thereon are provided for controlling objects in a virtual environment. Real-world objects, which may be any non-human objects, may be received into a virtual environment. An object skeleton may be identified and mapped to the object, and a user skeleton of the real-world user may likewise be identified and mapped to the object skeleton. By mapping the user skeleton to the object skeleton, movements of the user control the movements of the object in the virtual environment. | 12-06-2012 |
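The virtual-space mapping in application 20100093435 above is, at its core, a change of scale between a calibrated physical activity region and the virtual game region. The sketch below treats one dimension only and invents the waypoint values and region width; the actual application calibrates a 2-D/3-D region from positioning-device waypoints.

```python
# Sketch of scale-factor mapping between an activity region in physical
# space and an interactive game region in virtual space (1-D for clarity).
def scale_factor(activity_extent: float, game_extent: float) -> float:
    """Identify a scale factor between the physical and virtual regions."""
    return game_extent / activity_extent

def to_virtual(position: float, activity_origin: float, factor: float) -> float:
    """Map a positioning-device reading into the interactive game region."""
    return (position - activity_origin) * factor

# Assumed calibration: waypoints at 1.0 m and 4.0 m define the activity
# region; the virtual game region is 600 units wide.
f = scale_factor(4.0 - 1.0, 600.0)  # 200 units per metre
v = to_virtual(2.5, 1.0, f)         # device at 2.5 m maps to 300.0 units
```

Because the scale factor is recomputed from calibration input, the same game region can be driven from a small living room or a large one, which is the point of making the activity region variable.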