Patent application number | Description | Published |
--- | --- | --- |
20100073363 | SYSTEM AND METHOD FOR REAL-TIME ENVIRONMENT TRACKING AND COORDINATION - A configurable real-time environment tracking and command module (RTM) is provided to coordinate one or more devices or objects in a physical environment. A virtual environment is created to correlate with various objects and attributes within the physical environment. The RTM is able to receive data about attributes of physical objects and accordingly update the attributes of correlated virtual objects in the virtual environment. The RTM is also able to provide data extracted from the virtual environment to one or more devices, such as robotic cameras, in real-time. An interface to the RTM allows multiple devices to interact with the RTM, thereby coordinating the devices. | 03-25-2010 |
20110219339 | System and Method for Visualizing Virtual Objects on a Mobile Device - A system and a method are provided for visualizing virtual objects on a mobile device. A computing device is in communication with the mobile device. The computing device generates a 3D virtual world of one or more virtual objects corresponding to one or more physical objects in a real world. The computing device then associates information with the one or more virtual objects and generates one or more static images based on the 3D virtual world. The mobile device receives the one or more static images and the associated information from the computing device, and then displays the one or more static images. | 09-08-2011 |
20130128054 | System and Method for Controlling Fixtures Based on Tracking Data - Systems and methods are provided for using tracking data to control the functions of an automated fixture. Examples of automated fixtures include light fixtures and camera fixtures. A method includes obtaining a first position of a tracking unit. The tracking unit includes an inertial measurement unit and a visual indicator configured to be tracked by a camera. A first distance is computed between the automated fixture and the first position, and this distance is used to set a function of the automated fixture to a first setting. A second position of the tracking unit is obtained. A second distance between the automated fixture and the second position is computed, and the second distance is used to set the function of the automated fixture to a second setting. | 05-23-2013 |
20130135303 | System and Method for Visualizing a Virtual Environment Online - Systems and methods are provided to allow a user to visualize a 3D model of a venue and to customize the 3D model of the venue according to their own needs. A data abstraction of the 3D venue model is created and sent to the venue operator. This data abstraction can be used to reconstruct the 3D venue model in 3D virtual environment software. The customized 3D venue model is generated by: displaying on a web browser a 3D venue model; displaying one or more virtual objects available in an objects library; customizing the 3D venue model by receiving an input to place a selected virtual object in the 3D venue model; receiving an input to save the customized 3D venue model; and generating a text file comprising a name of the 3D venue model and data describing one or more characteristics of the selected virtual object. | 05-30-2013 |
20130307934 | System and Method for Providing 3D Sound - Systems and methods are provided for associating position information and sound. The method includes obtaining position information of an object at a given time; obtaining position information of a camera at the given time; determining a relative position of the object relative to the camera's position; and associating sound information with the relative position of the object. In another aspect, the position and orientation of a microphone are also tracked to calibrate the sound produced by an object or person, and the calibrated sound is associated with the object's position relative to the camera. | 11-21-2013 |
20140119606 | System and Method for Real-Time Environment Tracking and Coordination - A configurable real-time environment tracking and command module (RTM) is provided to coordinate one or more devices or objects in a physical environment. A virtual environment is created to correlate with various objects and attributes within the physical environment. The RTM is able to receive data about attributes of physical objects and accordingly update the attributes of correlated virtual objects in the virtual environment. The RTM is also able to provide data extracted from the virtual environment to one or more devices, such as robotic cameras, in real-time. An interface to the RTM allows multiple devices to interact with the RTM, thereby coordinating the devices. | 05-01-2014 |
20140176537 | System and Method for Visualizing Virtual Objects on a Mobile Device - A system and a method are provided for visualizing virtual objects on a mobile device. A computing device is in communication with the mobile device. The computing device generates a 3D virtual world of one or more virtual objects corresponding to one or more physical objects in a real world. The computing device then associates information with the one or more virtual objects and generates one or more static images based on the 3D virtual world. The mobile device receives the one or more static images and the associated information from the computing device, and then displays the one or more static images. | 06-26-2014 |
20140320667 | System and Method for Tracking - Systems and methods are provided for tracking at least position and angular orientation. The system comprises a computing device in communication with at least two cameras, wherein each camera is able to capture images of one or more light sources attached to an object. A receiver is in communication with the computing device, wherein the receiver is able to receive at least angular orientation data associated with the object. The computing device determines the object's position by comparing images of the light sources and generates an output comprising the position and angular orientation of the object. | 10-30-2014 |
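The distance-to-setting method described in application 20130128054 can be sketched in a few lines. This is only an illustrative interpretation: the Euclidean distance metric, the clamping range, and the linear mapping to a normalized setting are assumptions, not details taken from the patent text.

```python
import math

def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def fixture_setting(fixture_pos, tracked_pos, min_d=1.0, max_d=20.0):
    """Map the fixture-to-target distance onto a normalized 0..1
    function setting (e.g., focus or beam intensity).

    The clamping range and linear mapping are illustrative
    assumptions, not taken from the patent text."""
    d = distance(fixture_pos, tracked_pos)
    d = max(min_d, min(max_d, d))          # clamp to the usable range
    return (d - min_d) / (max_d - min_d)   # normalize to [0, 1]

# First position of the tracking unit -> first setting
s1 = fixture_setting((0, 0, 5), (3, 4, 5))   # distance 5.0
# Second position -> second setting
s2 = fixture_setting((0, 0, 5), (6, 8, 5))   # distance 10.0
```

Repeating the computation as the tracking unit moves yields the claimed first and second settings for the fixture's function.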
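The core step of application 20130307934, associating sound information with an object's position relative to a camera, reduces to a vector subtraction plus a tagging record. The function names, record fields, and sample coordinates below are hypothetical illustrations of that idea, not the patent's implementation.

```python
def relative_position(obj_pos, cam_pos):
    """Position of the object expressed relative to the camera's
    position (component-wise subtraction of two 3D points)."""
    return tuple(o - c for o, c in zip(obj_pos, cam_pos))

def associate_sound(sound_id, obj_pos, cam_pos, t):
    """Tag a sound sample with the object's camera-relative position
    at time t, so playback can later be spatialized.  The record
    layout here is an assumption for illustration."""
    return {
        "sound": sound_id,
        "time": t,
        "rel_pos": relative_position(obj_pos, cam_pos),
    }

rec = associate_sound("mic_01", obj_pos=(10, 2, 0), cam_pos=(4, 2, -3), t=0.5)
# rec["rel_pos"] == (6, 0, 3)
```

A spatial-audio renderer could then pan and attenuate the tagged sound using `rel_pos` without needing the absolute world coordinates of either the object or the camera.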