Interface represented by 3D space

Subclass of:

715 - Data processing: presentation processing of document, operator interface processing, and screen saver display processing


715764000 - On-screen workspace or object

Patent class list (only non-empty classes are listed)

Deeper subclasses:

Class / Patent application number | Description | Number of patent applications / Date published

715850000 | Navigation within 3D space | 59
715849000 | Individual object | 47
715852000 | Picking 3D objects | 9
20120174037 - Media Content User Interface Systems and Methods - Exemplary media content user interface systems and methods are disclosed herein. An exemplary method includes a media content access subsystem displaying a plurality of display elements representative of a plurality of media content instances and that flow through a graphical representation of a water cycle in accordance with one or more flow heuristics, detecting a user interaction, and dynamically adjusting the flow of the one or more display elements in accordance with the user interaction. Corresponding systems and methods are also disclosed. (published 07-05-2012)
20110202885 - Apparatus and method for comparing, sorting and presenting objects - An apparatus for comparing, sorting and presenting objects comprising a storage module ( (published 08-18-2011)
20120246599 - INTUITIVE DATA VISUALIZATION METHOD - A computer software program, method and system has a data visualization scheme in the form of plural identifiable virtual characters in a familiar virtual environment that is relevant for the characters and in which the characters act in the context of the environment and in a manner that is indicative of the data or data set portrayed by each character. From the actions and interactions of the virtual characters in the context of the virtual environment, information about the nature and interactions of the data and data sets is quickly and intuitively appreciated by a viewer. (published 09-27-2012)
20100333037 - DIORAMIC USER INTERFACE HAVING A USER CUSTOMIZED EXPERIENCE - The present disclosure teaches a solution for a user customizable abstraction layer for tailoring all operating system, application, and web based interfaces. The interface differs from conventional user interfaces by presenting a dynamic interface which can enable user access across all domains and applications with which the user can interact. The interface can be dynamically built as a user interacts with clients (e.g., devices/applications). Clients can utilize common usage patterns, installed applications, installed themes, personal information, and the like, to create a highly customized adaptive user-designed and modifiable interface. (published 12-30-2010)
20100107127 - Apparatus and method for manipulating virtual object - Disclosed is a virtual object manipulating apparatus and method. The virtual object manipulating apparatus connects a virtual object in a 3D virtual world with a virtual object manipulating apparatus, senses a grab signal from a user, and determines a grab type of the virtual object based on the sensed grab signal and the connection between the virtual object and the virtual object manipulating apparatus. (published 04-29-2010)
20110197167 - ELECTRONIC DEVICE AND METHOD FOR PROVIDING GRAPHICAL USER INTERFACE (GUI) - An electronic device and a method for providing a Graphical User Interface (GUI) are disclosed. The electronic device includes a storage configured to store a first set of pixel data and a second set of pixel data, a controller configured to detect a request for providing a Graphical User Interface (GUI) for recording reservation and access the first set of pixel data and the second set of pixel data in response to detecting the request, and a formatter configured to convert the format of the first set of pixel data and the second set of pixel data to an output 3D format. (published 08-11-2011)
20120192114 - THREE-DIMENSIONAL, MULTI-DEPTH PRESENTATION OF ICONS ASSOCIATED WITH A USER INTERFACE - A three-dimensional display presents a plurality of icons that are associated with a user interface. These icons include at least a first icon presented at a first depth of presentation and at least a second icon presented at a second, different depth of presentation. By one approach this first icon is available for interaction by an input component of the user interface while the second icon is unavailable for interaction by the input component of the user interface. The aforementioned first depth of presentation may substantially coincide with a surface, for example, a touch-sensitive display, of the corresponding electronic device. So configured, the first icon (which is presently available for selection) appears at a depth that coincides with that surface. This approach can serve to facilitate three-dimensional presentation of an icon based on whether it is available for interaction via an input component of a user interface. (published 07-26-2012)
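The depth-gating idea in the abstract above — only the icon presented at the surface depth accepts input — can be sketched as a small data model. This is a reading of the abstract, not the patented implementation; all names are hypothetical:

```python
from dataclasses import dataclass

SURFACE_DEPTH = 0.0  # depth treated as coinciding with the touch surface


@dataclass
class Icon:
    name: str
    depth: float  # 0.0 = at the display surface; larger = further back

    @property
    def interactive(self) -> bool:
        # Only icons presented at the surface depth accept input, mirroring
        # the first-depth/second-depth distinction in the abstract.
        return self.depth == SURFACE_DEPTH


icons = [Icon("mail", 0.0), Icon("music", 0.4), Icon("photos", 0.8)]
selectable = [i.name for i in icons if i.interactive]  # only "mail"
```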
20090094558 - Viewport overlays to expose alternate data representations - A method, apparatus, and article of manufacture provide the ability to display (using a 3D graphics application) an overlayed window containing an alternate data representation in a three-dimensional system. A first 3D view of a real world scene (that includes a first set of data layers) is displayed on a display device. The user selects a set of entities that together define an alternate representation of the first 3D view. The alternate representation is a second set of data layers that is different than the first set of data layers. An overlayed window is displayed on top of the first 3D view and displays the alternate representation. (published 04-09-2009)
20110022988 - PROVIDING USER INTERFACE FOR THREE-DIMENSIONAL DISPLAY DEVICE - A 3D display device is provided. The 3D display device provides a 3D preview image to be displayed and a control menu for setting various parameters for the 3D preview image to a user, and thus enables the user to optimize the 3D parameters before viewing the 3D content. (published 01-27-2011)
20080295036 - Display Control Apparatus, Display Method, and Computer Program - A display control apparatus includes a search unit for searching for a second content related to a first content in accordance with at least part of metadata attached to each of the first content and the second content, a generating unit for generating a three-dimensional display model, the three-dimensional display model including a first layer and a second layer, the first layer having one of a first image and a first character representing the first content arranged therewithin, and the second layer having one of a second image and a second character representing the second content arranged therewithin, and a display control unit for controlling displaying one of the first image and the first character and one of the second image and the second character using the three-dimensional display model. (published 11-27-2008)
20090094557 - SUN-SHADOW SIMULATION IN A GEOSPATIAL SYSTEM - A method, apparatus, and article of manufacture provide the ability to display a sun and shadow simulation in a 3D system. A 3D view of a real world scene is displayed, using a 3D graphics application, on a display device. A plug-in is installed into the application. A calendar period (e.g., a month, day, and year) is defined by the user. A timeline arc is displayed with the calendar period defining a radius of the arc, and starting and stopping endpoints of the timeline arc defining an interval of time during the calendar period. A timeline slider is displayed on the arc that indicates a time of day within the calendar period. A visualization is displayed, in the 3D view, of shadows cast by a sun on objects in the 3D view. A position of the sun is based on the calendar period and the time of day. (published 04-09-2009)
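The shadow visualization above depends on the sun's position at the chosen time; on flat ground the length of a cast shadow follows from simple trigonometry once the sun's elevation is known. A minimal sketch — the function name and the flat-ground assumption are mine, not the patent's:

```python
import math


def shadow_length(object_height: float, sun_elevation_deg: float) -> float:
    """Length of the shadow cast on flat ground for a given sun elevation."""
    elev = math.radians(sun_elevation_deg)
    if elev <= 0:
        return math.inf  # sun at or below the horizon: no finite shadow
    # Shadow tip is where the sun ray over the object's top meets the ground.
    return object_height / math.tan(elev)
```

At a 45° elevation the shadow is as long as the object is tall; as the sun sinks toward the horizon the shadow stretches without bound.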
20120290987 - System and Method for Virtual Object Placement - A computer system and method according to the present invention can receive multi-modal inputs such as natural language, gesture, text, sketch and other inputs in order to manipulate graphical objects in a virtual world. The components of an agent as provided in accordance with the present invention can include one or more sensors, actuators, and cognition elements, such as interpreters, executive function elements, working memory, long term memory and reasoners for object placement approach. In one embodiment, the present invention can transform a user input into an object placement output. Further, the present invention provides, in part, an object placement algorithm, along with the command structure, vocabulary, and the dialog that an agent is designed to support in accordance with various embodiments of the present invention. (published 11-15-2012)
20110302535 - Method for selection of an object in a virtual environment - The invention relates to a method for selection of a first object in a first virtual environment, the first object being represented in the first environment with a size of value less than a threshold value. In order to make the selection of the first object more user-friendly, the method comprises steps for: (published 12-08-2011)
20110296353 - Method and system implementing user-centric gesture control - A user-centric method and system to identify user-made gestures to control a remote device images the user using a three-dimensional image system, and defines at least one user-centric three-dimensional detection zone dynamically sized appropriately for the user, who is free to move about. Gestures made within the detection zone are compared to a library of stored gestures, and the thus identified gesture is mapped to an appropriate control command signal coupleable to the remote device. The method and system also provide for a first user to hand off control of the remote device to a second user. (published 12-01-2011)
20110296352 - ACTIVE CALIBRATION OF A NATURAL USER INTERFACE - A system and method are disclosed for periodically calibrating a user interface in a NUI system by performing periodic active calibration events. The system includes a capture device for capturing position data relating to objects in a field of view of the capture device, a display and a computing environment for receiving image data from the capture device and for running applications. The system further includes a user interface controlled by the computing environment and operating in part by mapping a position of a pointing object to a position of an object displayed on the display. The computing environment periodically recalibrates the mapping of the user interface while the computing environment is running an application. (published 12-01-2011)
20100058247 - METHODS AND SYSTEMS OF A USER INTERFACE - One embodiment of the application provides a method including segmenting a 3D polygon mesh into a plurality of widgets, defining a state variable for each widget, defining a behavior for each widget, and assembling a three-dimensional user interface from the widgets, the state variables, and the behaviors. (published 03-04-2010)
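The widget/state/behavior assembly this abstract describes can be sketched as a tiny registry: each widget couples a state variable with a behavior that transforms it. A hypothetical sketch that ignores the mesh-segmentation step; all names are mine:

```python
# Hypothetical registry: each widget pairs a state variable with a behavior.
widgets = {}


def define_widget(name, state, behavior):
    widgets[name] = {"state": state, "behavior": behavior}


def poke(name):
    # Apply the widget's behavior to its state variable and store the result,
    # as a stand-in for user interaction with the 3D widget.
    w = widgets[name]
    w["state"] = w["behavior"](w["state"])
    return w["state"]


define_widget("button", state="up",
              behavior=lambda s: "down" if s == "up" else "up")
```

Poking the button toggles its state between "up" and "down".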
20100169838 - ANALYSIS OF IMAGES LOCATED WITHIN THREE-DIMENSIONAL ENVIRONMENTS - Images are analyzed within a 3D environment that is generated based on spatial relationships of the images and that allows users to experience the images in the 3D environment. Image analysis may include ranking images based on user viewing information, such as the number of users who have viewed an image and how long an image was viewed. Image analysis may further include analyzing the spatial density of images within a 3D environment to determine points of user interest. (published 07-01-2010)
20100169837 - Providing Web Content in the Context of a Virtual Environment - Information URLs may be associated with three dimensional objects in a three dimensional virtual environment. When a URL is selected, an overlay web rendering engine renders a web page associated with the URL over the object in the three dimensional virtual environment. The web page may include rich content, interactive content, or any other type of web content supported by the user's local browser and browser plugins. The user may interact with the content in the overlay web rendering engine to obtain successive layers of content or to affect the object in the virtual environment. The web page is rendered with a transparent background so that the three dimensional content of the virtual environment continues to be visible through the web page and provides context for the overlayed content. Information URLs may be used to provide information about objects, Avatars, or the virtual environment itself. (published 07-01-2010)
20110126160 - METHOD OF PROVIDING 3D IMAGE AND 3D DISPLAY APPARATUS USING THE SAME - A method of providing a three-dimensional (3D) image and a 3D display apparatus applying the same are provided. If a predetermined instruction is input in 2D mode, display mode is changed to 3D mode. A predetermined format is applied to an incoming image, and the resultant image is displayed in 3D mode. If the predetermined instruction is input again in 3D mode, another format is applied to the incoming image and the resultant image is displayed. As a result, a viewer can conveniently select a 3D image format of the incoming image. (published 05-26-2011)
20090282369 - System and Method for Multi-Dimensional Organization, Management, and Manipulation of Remote Data - The Quantum Matrix system is a multi-dimensional, multi-threaded exchange environment for data organization, management, and manipulation. Data is organized into a multi-dimensional structure of nodes. Nodes may represent data, absence of data, or another set of nodes. The multi-dimensional structure or portions of it can be automatically created from a file system. One or more associations are also defined for the multi-dimensional structure. An association indicates a relationship between a node and another node, data, or a set of nodes. The multi-dimensional structure is then displayed three-dimensionally and navigated. Relational logic, Boolean algebra, or a scripting language can be applied to the nodes, data, and associations to produce a resultant set of nodes. Furthermore, portions of the multi-dimensional structure can be isolated with the use of planes to ease navigation. Furthermore, the Quantum Matrix system may have a client-server architecture, with the client running on a mobile device. (published 11-12-2009)
20090094556 - USER DEFINED SCENARIOS IN A THREE DIMENSIONAL GEO-SPATIAL SYSTEM - A method, apparatus, and article of manufacture provide the ability to store user defined scenarios in a three-dimensional system. A 3D view of a real world scene is displayed, using a three-dimensional (3D) graphics application. Plug-ins are installed into the 3D graphics application. A user selects a subset of the plug-ins, defines settings for the subset of plug-ins, and defines a visualization trait for each plug-in in the subset. The user associates an identification of the selected subset, the settings, and the visualization trait with a scenario bookmark that is saved. The bookmark can be selected by a user to display a visualization of a scenario based on the selected subset, settings, and visualization trait. (published 04-09-2009)
20120297345 - Three-Dimensional Animation for Providing Access to Applications - A three-dimensional animation for providing access to applications is described. In some implementations, a three-dimensional multi-level dock is displayed. The multi-level dock can be animated to appear to slide into view on a graphical user interface in response to user input. The levels of the multi-level dock can be configured to display selectable graphical objects representing applications available on a computing device. A user can select a graphical object to invoke a corresponding application. The three-dimensional multi-level dock can be animated to slide out of view on the graphical user interface in response to the selection of an application object or in response to other user input. (published 11-22-2012)
20090265667 - Techniques for Providing Three-Dimensional Virtual-World Presentations - A technique for providing a three-dimensional (3D) virtual-world (VW) presentation includes selecting a 3D real-world (RW) presentation. One or more messages, including 3D VW presentation steps that are associated with the 3D RW presentation, are then received at a VW presentation object that includes a VW presentation root script. The one or more messages are passed from the VW presentation root script to VW relay scripts (RSs) included in respective VW presentation objects associated with the 3D VW presentation. The one or more messages are then broadcast from the VW RSs to VW presentation execution scripts (PESs) that are associated with the 3D VW presentation. Finally, the 3D VW presentation is provided based on executed ones of the VW PESs. (published 10-22-2009)
20080288893 - Method and Device for Visualization of Information - The invention relates to a method of visualizing information about objects for the benefit of a user. The method comprises providing a substantially 2-D space with a plurality of separate locations; providing a set of objects with a plurality of objects; linking each of the objects in the set to an associated location in the 2-D space; and representing a 3-D virtual environment which is defined by the 2-D space; and at least accentuating and representing objects on the basis of altitudes in the 3-D virtual environment relative to the 2-D space. (published 11-20-2008)
20080295035 - Projection of visual elements and graphical elements in a 3D UI - A method includes determining user interface data based at least on a projection of visual element information from a projector at a first location in a three-dimensional space onto a screen object defined in the space. The screen object forms at least a portion of a user interface. The method includes determining an area of the screen object viewable by a camera positioned in a second location in the three dimensional space, and communicating the user interface data corresponding to the area to a display interface suitable for coupling to one or more displays. Apparatus and computer-readable media are also disclosed. (published 11-27-2008)
20080270945 - Interior spaces in a geo-spatial environment - A method, apparatus, and system of interior spaces in a geo-spatial environment are disclosed. In one embodiment, the method includes providing a plurality of user profiles, each user profile in the plurality of user profiles including an associated specific geographic location, selecting a user profile in the plurality of user profiles, generating a first virtual interior space view of a first structure associated with a first specific geographic location of the selected user profile in the plurality of user profiles, and generating, with the first virtual interior space view, at least one wiki profile associated with the first virtual interior space view. The method may also include generating, with the at least one wiki profile, content appended to the at least one wiki profile. Furthermore, the method may include capturing a second virtual interior space view of a second structure associated with a second specific geographic location. (published 10-30-2008)
20080270944 - METHOD AND SYSTEM FOR CROSS-SCREEN COMPONENT COMMUNICATION IN DYNAMICALLY CREATED COMPOSITE APPLICATIONS - A method and system for cross-screen component communication in dynamically created composite applications. Meta-data in the mark-up for a source component (e.g. eXtensible Markup Language—XML information) in a dynamically created composite application includes indications of which screens target components are located on. These indications are contained in definitions of logical connections established between components referred to as “cross-page wire” definitions. Executable objects, referred to as “cross-page wire” executable objects, are generated based on the cross-page wire definitions in the source component mark-up. The cross-page wire executable objects are executed by a run-time entity, such as a “property broker” or the like, in response to a change in a property value for which the cross-page wire has been defined, in order to deliver a new value of that property to one or more target components located on screens different from the screen on which the source component is located. (published 10-30-2008)
20110231802 - ELECTRONIC DEVICE AND METHOD FOR PROVIDING USER INTERFACE THEREOF - An electronic device and a method for providing a User Interface (UI) are disclosed. According to the present invention, the method may include receiving a request for provision of the UI, collecting information for configuring the requested UI, classifying the collected information according to a first criterion so as to generate a plurality of pages, hierarchizing the generated pages and arranging the layers according to a second criterion so as to form a multilayer UI, and providing the formed multilayer UI as the requested UI. (published 09-22-2011)
20080307366 - REFLECTIONS IN A MULTIDIMENSIONAL USER INTERFACE ENVIRONMENT - A graphical user interface has a back surface disposed from a viewing surface to define a depth. A visualization receptacle is disposed between the back surface and the viewing surface and contains a visualization object. A reflection surface is defined such that a reflection of the visualization object is displayed on the reflection surface. (published 12-11-2008)
20100169836 - INTERFACE CUBE FOR MOBILE DEVICE - A computing device presents, on a screen, a three-dimensional rendering of an interface cube that includes a representation of a user's contacts displayed on at least one surface of the interface cube. The computing device receives a communication item from a peripheral application, where the communication item is associated with a particular contact of the user's contacts. The computing device creates a graphic based on the communication item and displays the graphic at a location on the representation of the user's contacts that corresponds to the location of the particular contact within a sequence of the user's contacts. (published 07-01-2010)
20090327969 - SEMANTIC ZOOM IN A VIRTUAL THREE-DIMENSIONAL GRAPHICAL USER INTERFACE - A GUI adapted for use with portable electronic devices such as media players is provided in which interactive objects are arranged in a virtual three-dimensional space (i.e., one represented on a two-dimensional display screen). The user manipulates controls on the player to maneuver through the 3-D space by zooming and steering to objects of interest which can represent various types of content, information or interactive experiences. The 3-D space mimics real space in that close objects appear larger to the user while distant objects appear smaller. The close objects will typically represent higher level content, information, or interactive experiences while the distant objects represent more detailed content, information, or experiences. This GUI navigation feature, referred to as a semantic zoom, makes it easy for the user to maintain a clear understanding of his location within the 3-D space at all times. (published 12-31-2009)
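The "close objects appear larger" behavior underlying this semantic zoom is ordinary perspective scaling: projected size falls off inversely with distance. A one-line sketch (the function name and the focal_length parameter are illustrative assumptions):

```python
def apparent_size(true_size: float, distance: float,
                  focal_length: float = 1.0) -> float:
    # Perspective projection: on-screen size falls off inversely with distance,
    # so near (higher-level) objects render larger than distant (detail) ones.
    return focal_length * true_size / distance
```

Zooming toward an object halves its distance and therefore doubles its apparent size, which is what lets the user read depth as a semantic level.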
20090113348 - METHOD AND APPARATUS FOR A USER INTERFACE WITH PRIORITY DATA - A system and corresponding method for providing a 3-dimensional (3-D) user interface displays images in a 3-D coordinate system. The method includes receiving user data input information. The method also compares the user data input information to frequently used terms and generates priority information based on the comparison. The generated priority information is displayed as holographic images in a 3-D coordinate system. Sensors are configured to sense user interaction within the 3-D coordinate system, so that a processor may receive user interaction information including the selected priority information from the sensors. The sensors are able to provide information to the processor that enables the processor to correlate user interaction with images in the 3-D coordinate system. The system may be used for interconnecting or communicating between two or more components connected to an interconnection medium (e.g., a bus) within a single computer or digital data processing system. (published 04-30-2009)
20110119631 - METHOD AND APPARATUS FOR OPERATING USER INTERFACE BASED ON USER'S VISUAL PERSPECTIVE IN ELECTRONIC DISPLAY DEVICE - A method and apparatus for operating a three-dimensional user interface in an electronic display device, according to a user's visual perspective from which a user looks at the device, are provided. In the method, the apparatus activates a three-dimensional mode in response to a user's request, and determines the user's visual perspective according to a predefined user's input received in the three-dimensional mode. Then the apparatus displays a user interface converted according to the user's visual perspective. (published 05-19-2011)
20130132909 - METHOD AND APPARATUS FOR DISPLAYING A POLYHEDRAL USER INTERFACE - Provided is a method and apparatus for providing a three-dimensional (3D) polyhedral user interface. A first polyhedron may be formed of a plurality of blocks which are mapped with a plurality of pieces of information. A user may manipulate rotation of some of the blocks to generate a second polyhedron. (published 05-23-2013)
20110035707 - STEREOSCOPIC DISPLAY DEVICE AND DISPLAY METHOD - A display control device and method to generate stereoscopic images in a graphical user interface (GUI) to be displayed on a display panel. A pop-out amount unit or operation calculates a pop-out amount indicating a perceived distance that at least a portion of the stereoscopic image pops out into space from the display panel, and a display controller or control operation controls display of the stereoscopic image with the calculated amount of pop-out on the display panel. (published 02-10-2011)
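A pop-out amount of the kind described above relates to on-panel disparity by similar triangles between the eye baseline and the perceived point in front of the screen. A hedged sketch of that geometry — the centimetre units, the 6.5 cm interocular default, and the 60 cm viewing distance are my assumptions, not values from the abstract:

```python
def screen_disparity(pop_out: float, eye_sep: float = 6.5,
                     view_dist: float = 60.0) -> float:
    """Crossed disparity on the panel (same units as the inputs) needed for a
    point to be perceived pop_out units in front of the panel.

    Similar triangles: disparity / eye_sep == pop_out / (view_dist - pop_out).
    """
    return eye_sep * pop_out / (view_dist - pop_out)
```

Zero pop-out needs zero disparity, and the required disparity grows steeply as the perceived point approaches the viewer.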
20110113382 - ACTIVITY TRIGGERED PHOTOGRAPHY IN METAVERSE APPLICATIONS - A system, method and program product for collecting image data from within a metaverse. A system is provided that includes: a graphical user interface (GUI) for allowing a user to install and administer a camera within the metaverse; a system for collecting image data from the camera based on an occurrence of a triggering event associated with the camera; and a system for storing or delivering the image data for the user. (published 05-12-2011)
20110126159 - GUI PROVIDING METHOD, AND DISPLAY APPARATUS AND 3D IMAGE PROVIDING SYSTEM USING THE SAME - A graphical user interface (GUI) providing method, a display apparatus and a three-dimensional (3D) image providing system using the same are provided. The GUI providing method includes: generating a first GUI for changing settings for a 3D image and a second GUI for changing an environment; and outputting the first GUI and the second GUI. Thus, the settings for the 3D image can be changed more easily and conveniently. (published 05-26-2011)
20100125812 - METHOD AND APPARATUS FOR MARKING A POSITION OF A REAL WORLD OBJECT IN A SEE-THROUGH DISPLAY - A method for marking a position of a real world object on a see-through display is provided. The method includes capturing an image of a real world object with an imaging device. A viewing angle and a distance to the object are determined. A real world position of the object is calculated based on the viewing angle to the object and the distance to the object. A location on the see-through display that corresponds to the real world position of the object is determined. A mark is then displayed on the see-through display at the location that corresponds to the real world object. (published 05-20-2010)
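Calculating a real-world position from a viewing angle and a distance, as this abstract describes, is a polar-to-Cartesian conversion in the ground plane. A minimal sketch assuming a compass-heading convention (0° = north/+y, 90° = east/+x); the names are illustrative:

```python
import math


def object_position(viewer_xy, heading_deg, distance):
    """Ground-plane position of the object, given the viewer's position, the
    compass heading of the viewing angle, and the measured distance."""
    h = math.radians(heading_deg)
    return (viewer_xy[0] + distance * math.sin(h),
            viewer_xy[1] + distance * math.cos(h))
```

An object sighted due east at 5 m from the origin lands at roughly (5, 0); the display can then project that world position back to a screen location for the mark.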
20120304128 - THREE-DIMENSIONAL MENU SYSTEM USING MANUAL OPERATION TOOLS - Disclosed is an augmented reality-based three-dimensional menu system using manual operation tools. According to the present invention, the three-dimensional menu system comprises: a display device; at least one pair of manual operation tools which are manually operated by the user, and are in a hexahedral shape; an image acquisition device which acquires images for the manual operation tools; and a menu augmentation unit which tracks the manual operation tools from the acquired images, and augments menu items in the vicinity of the manual operation tools of the acquired images, thereby outputting the augmented menu items to the display device. (published 11-29-2012)
20120304126 - THREE-DIMENSIONAL GESTURE CONTROLLED AVATAR CONFIGURATION INTERFACE - A method for controlling presentation to a user of a primary user experience of a software application is provided. The method includes displaying a third-person avatar in a 3D virtual scene that defines a user interface for controlling presentation of the primary user experience. The method further includes sensing controlling movements of the user within a physical space in which the user is located and causing display of controlled movements of the third-person avatar within the 3D virtual scene so that the controlled movements visually replicate the controlling movements. The method further includes detecting a predefined interaction of the third-person avatar with a user interface element displayed in the 3D virtual scene, and controlling presentation of the primary user experience in response to detecting the predefined interaction. (published 11-29-2012)
20080235628 - 3-D DISPLAY FOR TIME-BASED INFORMATION - A computer-implemented method of displaying information about first and second pluralities of time-based events, the method involving: displaying perspective representations of each of a plurality of timelines including a first timeline and a second timeline, wherein the perspective representation of the first timeline is made up of perspective images of representations of the events of the first plurality of events arrayed along the first timeline at locations in time corresponding to those events and the perspective representation of the second timeline is made up of perspective images of representations of the events of the second plurality of events arrayed along the second timeline at locations in time corresponding to those events; enabling a user to select a current time; and in response to the user selecting the current time, displaying perspective representations of a portion of each of the first and second timelines as determined by the user selected current time. (published 09-25-2008)
20110138336 - METHOD FOR DISPLAYING BROADCASTING DATA AND MOBILE TERMINAL THEREOF - A method for displaying broadcasting data and a mobile terminal thereof are discussed. According to an embodiment, the method includes: receiving broadcasting data and displaying the broadcasting data on a display of a mobile terminal; turning on a switching panel unit mounted on the display and displaying the broadcasting data in a 3-D image; generating a first proximity signal through a proximity sensor of the mobile terminal to enter a broadcasting data display change preparatory step; and changing the display of the broadcasting data responsive to a user detection signal generated by the proximity sensor or a touch sensor of the mobile terminal. (published 06-09-2011)
20090300551 - INTERACTIVE PHYSICAL ACTIVITY AND INFORMATION-IMPARTING SYSTEM AND METHOD - A method of imparting information includes interactively selectively displaying information to a person or user, based on the physical location of the person relative to a display screen upon which the information is displayed. A tracking system is operatively coupled to the display that selectively displays the information. The tracking system tracks the physical location of the person, and displays different information depending upon the physical location of the person. The display may include displays of virtual objects, such as cubes or other shapes. The view of the objects may be varied within the display as the user moves within physical space, varying the apparent position of the virtual objects as the user moves. The varying of the apparent position of the virtual objects may reveal information that was not visible to the user in other virtual positions (corresponding to other physical positions of the user). (published 12-03-2009)
20080270946Multidimensional Structured Data Visualization Method and Apparatus, Text Visualization Method and Apparatus, Method and Apparatus for Visualizing and Graphically Navigating the World Wide Web, Method and Apparatus for Visualizing Hierarchies - A method of displaying correlations among information objects includes receiving a query against a database; obtaining a query result set; and generating a visualization representing the components of the result set, the visualization including one of a plane and line to represent a data field, nodes representing data values, and links showing correlations among fields and values. Other visualization methods and apparatus are disclosed.10-30-2008
20120047466INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND PROGRAM - There is provided an information processing device including an acquisition section configured to acquire an operation vector based on a movement of a body part of a user, a correction section configured to correct a direction of the acquired operation vector, and a process execution section configured to execute a process in accordance with the corrected operation vector.02-23-2012
20120047465Information Processing Device, Information Processing Method, and Program - There is provided an information processing device including an acquisition section configured to acquire a curved movement of a body of a user as an operation, a display control section configured to display an object in a virtual three-dimensional space, and a process execution section configured to execute a process on the object based on the acquired operation. The object may be arranged on a first curved plane based on a virtual position of the user set in the virtual three-dimensional space, the first curved plane corresponding to the curved movement.02-23-2012
20120011474ANALYSIS OF COMPLEX DATA OBJECTS AND MULTIPLE PARAMETER SYSTEMS - A computer facilitates multiple-parameter data analysis by special visualization and navigation methods. Data to be analyzed is loaded from an external source; the computer displays the data in response to user input using a variety of methods, including data tables, slices of data spaces, hierarchically navigated data spaces, dynamic slice tables, filters, sorting, color-mapping, numerical operations, and other methods.01-12-2012
20120023453Device, Method, and Graphical User Interface for Navigating Through a Hierarchy - A multifunction device displays a view of a top level of a hierarchical user interface. The hierarchical user interface has a plurality of levels including the top level and one or more lower levels. In response to detecting a first input, the device displays a view of at least one of the lower levels and at least a predefined portion of the view of the top level. While displaying a view of a respective lower level and concurrently displaying at least the predefined portion of the view of the top level, the device detects a second input. When the second input corresponds to a request to enter a content modification mode for the respective lower level, the device enters the content modification mode for the respective lower level and ceases to display the predefined portion of the view of the top level.01-26-2012
20120216149METHOD AND MOBILE APPARATUS FOR DISPLAYING AN AUGMENTED REALITY - A mobile apparatus and method for displaying an Augmented Reality (AR) in the mobile apparatus. The mobile apparatus captures an image of a current environment of the mobile apparatus, displays the image, detects mapping information corresponding to the current environment from among mapping information stored in the mobile apparatus, maps a three-dimensional (3D) Graphical User Interface (GUI) of detected mapping information onto the displayed image, based on a relative location relationship between the detected mapping information, and adjusts a display status of the 3D GUI, while maintaining the relative location relationship between the detected mapping information.08-23-2012
20120180000METHOD AND SYSTEM FOR SIMULATING THREE-DIMENSIONAL OPERATING INTERFACE - A method and a system for simulating a three-dimensional (3D) operating interface are provided. The method includes defining a partition line to partition a display frame of a screen into a first area and a second area, and defining a size of a unit grid to establish a first grid plane and a second grid plane in the first area and the second area respectively, the first grid plane and the second grid plane forming a simulated 3D grid space. The method also includes taking the unit grid as a unit to define an object size and an initial grid coordinate of an object. The initial grid coordinate is on one of the first and the second grid planes. The method further includes mapping out a simulated 3D space in the simulated 3D grid space for displaying the object according to the initial grid coordinate and the object size.07-12-2012
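The partition-line and unit-grid construction in the entry above can be illustrated with a minimal Python sketch: a display frame is split at a horizontal partition line and each area is tiled with unit grids. The function name, parameters, and the horizontal-split assumption are illustrative, not taken from the patent.

```python
def build_grid_planes(width, height, partition_y, cell):
    """Partition a display frame at partition_y and tile each resulting
    area with unit grids of size `cell` (an illustrative sketch only)."""
    # first grid plane: the area above the partition line
    first = (width // cell, partition_y // cell)
    # second grid plane: the area below the partition line
    second = (width // cell, (height - partition_y) // cell)
    return first, second

first, second = build_grid_planes(640, 480, 180, 20)
# first plane: 32 x 9 cells; second plane: 32 x 15 cells
```

An object's initial grid coordinate would then be expressed in these cell units on one of the two planes.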
20120233573TECHNIQUES TO PRESENT HIERARCHICAL INFORMATION USING ORTHOGRAPHIC PROJECTIONS - Techniques to present hierarchical information as orthographic projections are described. An apparatus may comprise an orthographic projection application arranged to manage a three dimensional orthographic projection of hierarchical information. The orthographic projection application may comprise a hierarchical information component operative to receive hierarchical information representing multiple nodes at different hierarchical levels, and parse the hierarchical information into a tree data structure, an orthographic generator component operative to generate a graphical tile for each node, arrange graphical tiles for each hierarchical level into graphical layers, and arrange the graphical layers in a vertical stack, and an orthographic presentation component operative to present a three dimensional orthographic projection of the hierarchical information with the stack of graphical layers each having multiple graphical tiles. Other embodiments are described and claimed.09-13-2012
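The tile-and-layer arrangement described in the entry above resembles grouping tree nodes by depth, one graphical layer per hierarchical level. A minimal Python sketch follows; the nested-dict hierarchy format and function name are assumptions for illustration, not the patented implementation.

```python
from collections import defaultdict

def layers_by_depth(tree, depth=0, layers=None):
    """Group every node of a nested-dict tree into one layer per depth,
    mimicking a vertical stack of graphical layers (illustrative only)."""
    if layers is None:
        layers = defaultdict(list)
    for name, children in tree.items():
        layers[depth].append(name)          # one "graphical tile" per node
        layers_by_depth(children, depth + 1, layers)
    return layers

hierarchy = {"root": {"a": {"a1": {}, "a2": {}}, "b": {"b1": {}}}}
stack = layers_by_depth(hierarchy)
# stack[0] == ["root"], stack[1] == ["a", "b"], stack[2] == ["a1", "a2", "b1"]
```

Rendering each layer at a fixed vertical offset would yield the stacked orthographic projection the abstract describes.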
20110131536GENERATING AND RANKING INFORMATION UNITS INCLUDING DOCUMENTS ASSOCIATED WITH DOCUMENT ENVIRONMENTS - Embodiments described herein are directed to forming information units. Digital documents associated with collaborative navigation behavior information can be identified and an information unit can be generated using transition probabilities calculated from collaborative navigation information. The information unit including at least a subset of the digital documents identified in the collaborative navigation behavior information. A rank of information unit based on the collaborative navigation behavior information can be calculated.06-02-2011
20120240084Graphic User Interface for Interactions with Datasets and Databases and a Method to Manage Information Units - Methods for data visualisation, browsing, entry, modification, querying, processing, storage and transfer are described herein.09-20-2012
20090064051INTERACTIVE SYSTEM FOR VISUALIZATION AND RETRIEVAL OF VIDEO DATA - A cube-based three-dimensional interactive system is provided for visualization and retrieval of video data. The system includes at least one interactive cube having eight nodes, each node being linked to a specific data item. The data items on the nodes are organized in space or time. Textual information about a respective data item appears upon traversing the corresponding node, and the data items open up upon selection. The interactive cube can be expanded to include more than eight nodes, such as 12 or 18 nodes. The system can also have multiple cubes that are connected to form a one-level, multi-level or multi-dimensional hypercube.03-05-2009
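The eight-node cube in the entry above can be sketched as a data structure binding each corner of a unit cube to a data item; traversing a node returns its textual information. The class, method names, and item labels are hypothetical illustrations, not the patented system.

```python
from itertools import product

class InteractiveCube:
    """Eight corner nodes, each bound to a data item (illustrative sketch)."""
    def __init__(self, items):
        if len(items) != 8:
            raise ValueError("a basic cube has exactly eight nodes")
        # corners of the unit cube: (0,0,0) through (1,1,1)
        self.nodes = dict(zip(product((0, 1), repeat=3), items))

    def traverse(self, corner):
        """Return the textual info shown when the cursor reaches a node."""
        return f"node {corner}: {self.nodes[corner]}"

cube = InteractiveCube([f"clip-{i}" for i in range(8)])
cube.traverse((0, 0, 0))   # "node (0, 0, 0): clip-0"
```

Linking several such cubes by shared corners would give the multi-cube hypercube arrangement the abstract mentions.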
20120089949METHOD AND COMPUTING DEVICE IN A SYSTEM FOR MOTION DETECTION - A computing device in a system for motion detection comprises an image processing device to determine a motion of an object of interest, and a graphical user interface (GUI) module to drive a virtual role based on the motion determined by the image processing device. The image processing device comprises a foreground extracting module to extract a foreground image from each of a first image of the object of interest taken by a first camera and a second image of the object of interest taken by a second camera, a feature point detecting module to detect feature points in the foreground image, a depth calculating module to calculate the depth of each of the feature points based on disparity images associated with the each feature point, the depth calculating module and the feature point detecting module identifying a three-dimensional (3D) position of each of the feature points, and a motion matching module to identify vectors associated with the 3D positions of the feature points and determine a motion of the object of interest based on the vectors.04-12-2012
20120102435STEREOSCOPIC IMAGE REPRODUCTION DEVICE AND METHOD FOR PROVIDING 3D USER INTERFACE - A stereoscopic image reproduction device for providing a 3D user interface includes a UI generator which generates a user interface, a depth information processor which generates a 3D depth for the user interface, and a formatting unit which generates a 3D user interface for the user interface by using the 3D depth. The depth information processor may be integrated with the formatting unit. Various factors used to generate 3D depth perception include at least any one of blur, textual gradient, linear perspective, shading, color, brightness, and chroma, which results in a 3D-type user interface (UI) being shown on a stereoscopic image display.04-26-2012
20100138792NAVIGATING CONTENT - One embodiment of the invention involves a computer-implemented method in which information obtained from a uniform resource locator is converted into at least one texture. The texture is mapped onto a surface of a three-dimensional object located in the virtual three-dimensional space thereby forming a three-dimensional navigation mechanism.06-03-2010
20120151416CONTROLLING THREE-DIMENSIONAL VIEWS OF SELECTED PORTIONS OF CONTENT - Some embodiments of the inventive subject matter are directed to presenting a first portion of content and a second portion of content in a two-dimensional view via a graphical user interface and detecting an input associated with one or more of the first portion of the content and the second portion of the content. Some embodiments are further directed to selecting the first portion of the content in response to the detecting of the input, and changing the presenting of the first portion of the content from the two-dimensional view to a three-dimensional view in response to the selecting the first portion of the content. Some embodiments are further directed to continuing to present the second portion of the content in the two-dimensional view while changing the presenting of the first portion of the content to the three-dimensional view.06-14-2012
20130024819SYSTEMS AND METHODS FOR GESTURE-BASED CREATION OF INTERACTIVE HOTSPOTS IN A REAL WORLD ENVIRONMENT - Systems and methods provide for gesture-based creation of interactive hotspots in a real world environment. A gesture made by a user in a three-dimensional space in the real world environment is detected by a motion capture device such as a camera, and the gesture is then identified and interpreted to create a “hotspot,” which is a region in three-dimensional space through which a user interacts with a computer system. The gesture may indicate that the hotspot is anchored to the real world environment or anchored to an object in the real world environment. The functionality of the hotspot is defined in order to identify the type of gesture which will initiate the hotspot and associate the activation of the hotspot with an activity in the system, such as control of an application on a computer or an electronic device connected with the system.01-24-2013
20130174098METHOD AND RECORDED MEDIUM FOR PROVIDING 3D INFORMATION SERVICE - A method of providing a 3D information service at a user terminal includes: receiving a first request of a user for displaying information; and displaying information elements, which have different depths along the Z axis orthogonal to a screen (XY plane), by rotating the information elements about any one of the X axis and the Y axis, where the rotational axis of each of the information elements is set at different points on the YZ plane or the XZ plane. According to certain embodiments of the invention, the information elements on a screen may be shown as planar elements in a still screen for greater legibility, but when the information elements are in motion, such as for changing the screen or moving a content element, the motion is provided with differing speeds according to depth, thereby providing a sense of spatial perception unique to 3-dimensional images.07-04-2013
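The depth-dependent motion speeds described in the entry above suggest a parallax-style rule in which nearer elements move faster on screen than deeper ones. A minimal sketch, assuming a simple perspective scaling formula (the formula and parameter names are illustrative, not from the application):

```python
def apparent_speed(base_speed, depth, focal=1.0):
    """Scale screen-space speed by perspective so elements at greater
    Z depth appear to move more slowly (an illustrative parallax rule)."""
    return base_speed * focal / (focal + depth)

near = apparent_speed(100, 0.0)   # depth 0: full speed, 100.0
far = apparent_speed(100, 3.0)    # depth 3: slowed to 25.0
```

Applying such per-element speeds during a screen transition would produce the sense of spatial depth the abstract describes.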
20110246950 3D MOBILE USER INTERFACE WITH CONFIGURABLE WORKSPACE MANAGEMENT - Systems and methods of a 3D mobile user interface with configurable workspace management are disclosed. In one aspect, embodiments of the present disclosure include a method, which may be implemented on a system, of a three-dimensional, multi-layer user interface of a mobile device in a mobile network. User environment may include one or more layers or levels of applications, services, or accounts that are all easily accessible to and navigable by the user. For example, an indicator can be used to access a workspace in 3D representing a category or grouping of services or applications for the user. The user can customize or create a unique, non-mutually exclusive grouping, aggregation, or category of applications, services, accounts, or items. The grouping of indicators can be used to swiftly and efficiently navigate to a desired application, service, account or item, in a 3D-enabled user environment.10-06-2011
20110246949Methods and System for Modifying Parameters of Three Dimensional Objects Subject to Physics Simulation and Assembly - A set of atomic three dimensional objects that can be joined together in a workspace to form one or more complex three dimensional objects, each atomic object includes one or more object join features parameterized to enable joining with one or more parameterized join features of another atomic object in the set of objects, and a shape that may be modified according to one or more parametrically defined constraint attributes. A user may reshape and or resize one or more of the atomic three dimensional objects prior to joining the three dimensional objects together at the appropriate parameterized join features to form one or more of the complex three dimensional objects.10-06-2011
20100287511METHOD AND DEVICE FOR ILLUSTRATING A VIRTUAL OBJECT IN A REAL ENVIRONMENT - The invention relates to a method for representing a virtual object in a real environment, having the following steps: generating a two-dimensional representation of a real environment by means of a recording device, ascertaining a position of the recording device relative to at least one component of the real environment, segmenting at least one area of the real environment in the two-dimensional image on the basis of non-manually generated 3D information for identifying at least one segment of the real environment in distinction to a remaining part of the real environment while supplying corresponding segmentation data, and merging the two-dimensional image of the real environment with the virtual object or, by means of an optical, semitransparent element, directly with reality with consideration of the segmentation data. The invention permits any collisions of virtual objects with real objects that occur upon merging with a real environment to be represented in a way largely close to reality.11-11-2010
20100287510ASSISTIVE GROUP SETTING MANAGEMENT IN A VIRTUAL WORLD - Systems, methods and articles of manufacture are disclosed for presenting a visual cue to a user in a virtual world. A cursor cycle allows the user to specify an avatar of focus by cycling through avatars in the virtual world. Visual cues of an avatar of focus are presented to the user. The user may define a cursor mask to include specific avatars. Visual cues of the cursor mask or of all avatars may be summarized and presented to the user. The user may also specify a threshold for a visual cue. A visual cue that is detected to exceed the specified threshold is presented to the user.11-11-2010
20100299640TRACKING IN A VIRTUAL WORLD - A status update of a real world entity is received. A previous status of a virtual world entity is transformed into a current status of the virtual world entity based on the status update of the real world entity. The virtual world entity may be part of a virtual world and may correspond to the real world entity in a real world. Further, the virtual world entity and the virtual world may be generated by a computer.11-25-2010
20120284670ANALYSIS OF COMPLEX DATA OBJECTS AND MULTIPLE PARAMETER SYSTEMS - A computer facilitates multiple-parameter data analysis by special visualization and navigation methods. Data to be analyzed is loaded from an external source; the computer displays the data in response to user input using a variety of methods, including data tables, slices of data spaces, hierarchically navigated data spaces, dynamic slice tables, filters, sorting, color-mapping, numerical operations, and other methods.11-08-2012
20130198693THREE-DIMENSIONAL ANIMATION TECHNOLOGY FOR DESCRIBING AND MANIPULATING PLANT GROWTH - This disclosure concerns systems and methods for the prediction and physical three-dimensional representation of plant growth and development. In some embodiments, systems and/or methods of the disclosure may be used to represent the growth of a particular plant (e.g., a maize cultivar) under particular environmental conditions, and/or to represent the differences in growth characteristics between a particular plant and another plant.08-01-2013
20120304127Information Presentation in Virtual 3D - A method, system and program product for assisting a presentation owner in creating and presenting information to audience users in a virtual 3D cyclorama-like environment. A presentation object tool provides behavior in the cyclorama object to assist the presentation owner in resolving graphic objects into the cyclorama and in placing information onto the graphic objects. The presenter object tool also provides behavior in the graphic objects to allow the presentation owner to expand a graphic object into a larger viewing size, to increment and decrement the placement of graphic objects within the cyclorama's presentation space, and to place an expanded graphic object into a home viewing position for presentation to audience users.11-29-2012
20130091471VISUAL SEARCH AND THREE-DIMENSIONAL RESULTS - Methods, systems, graphical user interfaces, and computer-readable media for visually searching and exploring a set of objects are provided. A computer system executes a method that generates three-dimensional representations or two-dimensional representations for a set of objects in response to a user interaction with an interface that displays the three-dimensional representations or the two-dimensional representations. The interface includes filter controls, sorting controls, and classification controls, which are dynamically altered based on the content of a user query or the attributes of the objects in the three-dimensional representations or two-dimensional representations.04-11-2013
20130212536SYSTEM AND PROCESS FOR ROOF MEASUREMENT USING AERIAL IMAGERY - The present disclosure describes creating a first layer and a second layer in computer memory and substantially overlapping at least a segment of a line from said first layer with at least a segment of another line from said second layer, where said first non-dimensional attribute differs from said second non-dimensional attribute of the two lines. Also shown is a user length field enabling a client with said interactive file to override at least one of said length numeric values, whereupon said area operator may automatically recalculate area based on said length field override. Further provided is a visual marker, moveable on said computer monitor around said aerial imagery region, which may be moved to more precisely identify the location of the building roof structure.08-15-2013
