
Vision sensor (e.g., camera, photocell)

Subclass of:

700 - Data processing: generic control systems or specific applications

700090000 - SPECIFIC APPLICATION, APPARATUS OR PROCESS

700245000 - Robot control

700258000 - Having particular sensor


Deeper subclasses: none listed.

Entries
Document - Title - Date
20130030571 - ROBOTIZED SURGERY SYSTEM WITH IMPROVED CONTROL - A robotized surgery system ( - 01-31-2013
20130030570ROBOT DEVICE, METHOD OF CONTROLLING THE SAME, COMPUTER PROGRAM, AND ROBOT SYSTEM - Provided is a robot device including an image input unit for inputting an image of surroundings, a target object detection unit for detecting an object from the input image, an object position detection unit for detecting a position of the object, an environment information acquisition unit for acquiring surrounding environment information of the position of the object, an optimum posture acquisition unit for acquiring an optimum posture corresponding to the surrounding environment information for the object, an object posture detection unit for detecting a current posture of the object from the input image, an object posture comparison unit for comparing the current posture of the object to the optimum posture of the object, and an object posture correction unit for correcting the posture of the object when the object posture comparison unit determines that there is a predetermined difference or more between the current posture and the optimum posture.01-31-2013
20090198380METHODS FOR REAL-TIME AND NEAR REAL-TIME INTERACTIONS WITH ROBOTS THAT SERVICE A FACILITY - In accordance with aspects of the present invention, a service robot and methods for controlling such a robot are provided. In particular, the robot is configured to sense the presence of a person and to take a next action in response to sensing the presence of the person. As examples, the robot could leave the area, await commands from the person, or enter an idle or sleep state or mode until the person leaves.08-06-2009
20130085605ROBOT SYSTEM AND METHOD FOR PRODUCING A TO-BE-PROCESSED MATERIAL - A robot system includes a container, a disposed-state detector, and a robot arm. The container is configured to accommodate a plurality of to-be-held objects and includes a reticulated portion. The disposed-state detector is configured to detect disposed states of the plurality of respective to-be-held objects disposed in the container. The robot arm includes a holder configured to hold a to-be-held object among the plurality of to-be-held objects based on the disposed states of the plurality of respective to-be-held objects detected by the disposed-state detector.04-04-2013
20110196536LINE INSPECTION ROBOT AND SYSTEM - The present invention relates to an overhead transmission line inspection robot and system for inspecting transmission line components and right of way conditions. The line inspection robot includes at least one drive system for propelling the robot along a line, a platform adapted to pivot relative to the at least one drive system, and a control system adapted to control the robot.08-11-2011
20110196535LINE INSPECTION ROBOT AND SYSTEM - The present invention relates to an overhead transmission line inspection robot and system for inspecting transmission line components and right of way conditions. The overhead transmission line inspection robot includes a communications and control system adapted to control the robot and transmit information and a drive system for propelling the robot along a shield wire to enable inspection over a large area. The robot further includes a camera adapted to inspect right of way and component conditions; a light detection and ranging (LiDar) sensor adapted to measure conductor position, vegetation, and nearby structures; and a global positioning system adapted to identify the robot's position and speed.08-11-2011
20110196534APPARATUS AND METHOD FOR INSPECTION OF UNDERGROUND PIPES - A system for inspecting an underground conduit from within comprises a data acquisition subsystem configured to be placed within the conduit and to move along at least a portion of the conduit to obtain data regarding the conduit. The system comprises a data storage subsystem configured to be placed within the conduit and to move along the conduit. The data storage subsystem receives and stores at least a portion of the data from the data acquisition subsystem for retrieval after the data acquisition subsystem has moved along the conduit.08-11-2011
20100076600MOBILE ROBOT FOR TELECOMMUNICATION - A mobile robot provides telecommunication service between a remote user at a remote terminal and a local user in proximity to the mobile robot. The remote user can connect to the mobile robot via the Internet using a peer-to-peer VoIP protocol, and control the mobile robot to navigate about the mobile robot's environment. The mobile robot includes a microphone, a video camera and a speaker for providing telecommunication functionality between the remote user and the local user. Also, a hand-held RC unit permits the local user to navigate the mobile robot locally or to engage privacy mode for the mobile robot. When NAT or a firewall obstructs connection from the remote terminal to the mobile robot, an Internet server facilitates connection using methods such as STUN, TURN, or relaying.03-25-2010
20120265344ROBOT SYSTEM AND METHOD FOR OPERATING ROBOT SYSTEM - This robot system includes a first imaging portion detachably mounted to a robot arm and a control portion controlling the operation of the robot arm and a grasping portion, and the control portion is so formed as to detach the first imaging portion from the robot arm before moving an object to be grasped that is being grasped by the grasping portion to a prescribed processing position.10-18-2012
20130041508SYSTEMS AND METHODS FOR OPERATING ROBOTS USING VISUAL SERVOING - A system and method for providing intuitive, visual based remote control is disclosed. The system can comprise one or more cameras disposed on a remote vehicle. A visual servoing algorithm can be used to interpret the images from the one or more cameras to enable the user to provide visual based inputs. The visual servoing algorithm can then translate that commanded motion into the desired motion at the vehicle level. The system can provide correct output regardless of the relative position between the user and the vehicle and does not require any previous knowledge of the target location or vehicle kinematics.02-14-2013
20100100240TELEPRESENCE ROBOT WITH A CAMERA BOOM - A remote controlled robot with a head that supports a monitor and is coupled to a mobile platform. The mobile robot also includes an auxiliary camera coupled to the mobile platform by a boom. The mobile robot is controlled by a remote control station. By way of example, the robot can be remotely moved about an operating room. The auxiliary camera extends from the boom so that it provides a relatively close view of a patient or other item in the room. An assistant in the operating room may move the boom and the camera. The boom may be connected to a robot head that can be remotely moved by the remote control station.04-22-2010
20090157228USER INTERFACE DEVICE OF REMOTE CONTROL SYSTEM FOR ROBOT DEVICE AND METHOD USING THE SAME - A user interface device of a remote control system for a robot and a method using the same are provided. The user interface device includes: a radio frequency (RF) unit for receiving, from a remote control robot, camera data and at least one sensor data detecting a distance; a display unit having a main screen and at least one auxiliary screen; and a controller having an environment evaluation module for determining whether the received camera data are in a normal condition, and having a screen display mode change module for displaying, if the received camera data are in a normal condition, the camera data on the main screen and displaying, if the received camera data are in an abnormal condition, the sensor data on the main screen.06-18-2009
20120185094 - Mobile Human Interface Robot - A mobile robot that includes a drive system, a controller in communication with the drive system, and a volumetric point cloud imaging device supported above the drive system at a height of greater than about one foot above the ground and directed to be capable of obtaining a point cloud from a volume of space that includes a floor plane in a direction of movement of the mobile robot. The controller receives point cloud signals from the imaging device and issues drive commands to the drive system based at least in part on the received point cloud signals. - 07-19-2012
20120185093 - ROBOT MOUNTING DEVICE - A robot mounting device includes a pair of spaced-apart arms adapted to retain a robot body of a surveillance robot. The robot mounting device also includes a latching mechanism to secure the robot mounting device to a rifle. The positioning of the robot can be adjusted within the robot mounting device to site a camera in the axle of the robot with respect to the rifle. The rifle can then be oriented to obtain visual imagery of an environment. - 07-19-2012
20120185097 - CONTROL COMPUTER AND METHOD OF CONTROLLING ROBOTIC ARM - A computer determines a first origin of a first coordinate system of a PCB, and controls a robotic arm to position a probe above the first origin. Furthermore, the computer determines a second origin of a second coordinate system of the robotic arm, and determines displacement values from the first origin to a test point in controlling movements of the robotic arm in the second coordinate system. A graph representing the test point is recognized in an image of the PCB, pixel value differences between the graph center and the image center are determined and converted to displacement correction values for controlling the movements of the robotic arm and determining 3D coordinates of the test point. The robotic arm is moved along a Z-axis of the second coordinate system to precisely position the probe on the test point of the PCB. - 07-19-2012 (see the corresponding sketch after the entries list)
20120185096Operating a Mobile Robot - A method of operating a mobile robot to traverse a threshold includes detecting a threshold proximate the robot. The robot includes a holonomic drive system having first, second, and third drive elements configured to maneuver the robot omni-directionally. The method further includes moving the first drive element onto the threshold from a first side and moving the second drive element onto the threshold to place both the first and second drive elements on the threshold. The method includes moving the first drive element off a second side of the threshold, opposite to the first side of the threshold, and moving the third drive element onto the threshold, placing both the second and third drive elements on the threshold. The method includes moving both the second and third drive elements off the second side of the threshold.07-19-2012
20120185095Mobile Human Interface Robot - A mobile human interface robot that includes a base defining a vertical center axis and a forward drive direction and a holonomic drive system supported by the base. The drive system has first, second, and third driven drive wheels, each trilaterally spaced about the vertical center axis and having a drive direction perpendicular to a radial axis with respect to the vertical center axis. The robot further includes a controller in communication with the holonomic drive system, a torso supported above the base, and a touch sensor system in communication with the controller. The touch sensor system is responsive to human contact. The controller issues drive commands to the holonomic drive system based on a touch signal received from the touch sensor system.07-19-2012
20120165985MAINTAINING A WIND TURBINE WITH A MAINTENANCE ROBOT - The present invention relates to a wind turbine maintenance system and a method of maintenance therein. A wind turbine maintenance system is provided, for carrying out a maintenance task in a nacelle of a wind turbine, comprising a maintenance robot, further comprising a detection unit, for identifying a fault in a sub-system in the nacelle and generating fault information, a processor unit, adapted to receive fault information from the detection unit and control the maintenance robot to perform a maintenance task, a manipulation arm to perform the maintenance task on the identified sub-system. In another aspect, a method of carrying out a maintenance task in a wind turbine is provided.06-28-2012
20090319083Robot Confinement - A method of confining a robot in a work space includes providing a portable barrier signal transmitting device including a primary emitter emitting a confinement beam primarily along an axis defining a directed barrier. A mobile robot including a detector, a drive motor and a control unit controlling the drive motor is caused to avoid the directed barrier upon detection by the detector on the robot. The detector on the robot has an omnidirectional field of view parallel to the plane of movement of the robot. The detector receives confinement light beams substantially in a plane at the height of the field of view while blocking or rejecting confinement light beams substantially above or substantially below the plane at the height of the field of view.12-24-2009
20100274391DETERMINING THE POSITION OF AN OBJECT - A method for determining the position of at least one object present within a working range of a robot by an evaluation system, wherein an image of at least one part of the working range of the robot is generated by a camera mounted on a robot. The image is generated during a motion of the camera and image data are fed to the evaluation system in real time, together with further data, from which the position and/or orientation of the camera when generating the image can be derived. The data are used for determining the position of the at least one object.10-28-2010
20120191246 - MOBILE TELE-PRESENCE SYSTEM WITH A MICROPHONE SYSTEM - A remote controlled robot system that includes a robot and a remote control station. The robot includes a binaural microphone system that is coupled to a speaker system of the remote control station. The binaural microphone system may include a pair of microphones located at opposite sides of a robot head. The location of the microphones roughly coincides with the location of ears on a human body. Such microphone location creates a mobile robot that more effectively simulates the tele-presence of an operator of the system. The robot may include two different microphone systems and the ability to switch between systems. For example, the robot may also include a zoom camera system and a directional microphone. The directional microphone may be utilized to capture sound from a direction that corresponds to an object zoomed upon by the camera system. - 07-26-2012
20130073090ROBOT SYSTEM - A robot system includes: a projecting unit for projecting a slit light on a specified placement region and moving the slit light in a specified direction; an imaging unit for imaging the slit light moving on a work on the placement region; an estimated projection region determining unit for determining an estimated projection region such that the length of the estimated projection region in a direction substantially parallel to the moving direction grows larger toward the center of the image in the intersection direction; a projection position detecting unit for detecting a projection position of the slit light within the estimated projection region. The robot system further includes a robot for gripping the workpiece.03-21-2013
20130073089ROBOT SYSTEM AND IMAGING METHOD - A robot system includes: an imaging unit including an imaging device and a distance measuring part; and a robot to which the imaging unit is attached. The imaging device preliminarily images a workpiece. The robot preliminarily moves the imaging unit based on the result of the preliminary imaging. The distance measuring part measures the distance to the workpiece. The robot actually moves the imaging unit based on the result of the measurement. The imaging device actually images the workpiece.03-21-2013
20130073091ROBOT CONTROL APPARATUS AND ROBOT SYSTEM - A robot control apparatus, which controls motions of an industrial robot based on processing results of an image processing apparatus which images the robot or objects around the robot, includes: a first communication unit which communicates with a computer for development as an external computer; a second communication unit which is connected to the image processing apparatus via a network; and a command processing unit which opens a communication port of the second communication unit and causes the second communication unit to start communication with the image processing apparatus via a server on the network in response to an open command received by the first communication unit.03-21-2013
20130073088MOBILE ROBOT AND CONTROLLING METHOD OF THE SAME - In a mobile robot and a controlling method of the same, the mobile robot is able to recognize a precise position thereof by detecting a plurality of images through an image detection unit, extracting one or more feature points from the plurality of images, and comparing and matching information related to the feature points. The mobile robot is also able to easily detect a position of a charging station based on image information, and quickly move to the charging station upon the lack of residual battery capacity. The mobile robot is also able to detect a position of the charging station based on the image information and receive a guideline signal within a signal reception range, so as to easily dock with the charging station.03-21-2013
20130073087SYSTEM FOR CONTROLLING ROBOTIC CHARACTERS TO ENHANCE PHOTOGRAPHIC RESULTS - A method for controlling a robotic apparatus to produce desirable photographic results. The method includes, with a motor controller, first operating a robotics assembly to animate the robotic apparatus and, then, detecting an upcoming image capture. The method further includes, with the motor controller in response to the detecting of the upcoming image capture, second operating the robotics assembly to pose the robotic apparatus for the upcoming image capture. In some embodiments, the detecting includes a sensor mounted on the robotic apparatus sensing a pre-flash of light from a red-eye effect reduction mechanism of a camera. In other cases, the detecting includes a sensor mounted on the robotics apparatus sensing a range finder signal from a range finder of a camera. The posing may include opening eyes, moving a mouth into a smile, or otherwise striking a pose that is held temporarily to facilitate image capture with a camera.03-21-2013
20090271038SYSTEM AND METHOD FOR MOTION CONTROL OF HUMANOID ROBOT - A system and method for motion control of a humanoid robot are provided. The system includes a remote controller for recognizing three-dimensional image information including two-dimensional information and distance information of a user, determining first and second reference points on the basis of the three-dimensional image information, calculating variation in angle of a joint on the basis of three-dimensional coordinates of the first and second reference points, and transmitting a joint control signal through a wired/wireless network. The system also includes a robot for checking joint control data from the joint control signal received from the remote controller and varying an angle of the joint to move according to the user's motion.10-29-2009
20130066469 - MOBILE VIDEOCONFERENCING ROBOT SYSTEM WITH NETWORK ADAPTIVE DRIVING - A remote control station that controls a robot through a network. The remote control station transmits a robot control command that includes information to move the robot. The remote control station monitors at least one network parameter and scales the robot control command as a function of the network parameter. For example, the remote control station can monitor network latency and scale the robot control command to slow down the robot with an increase in the latency of the network. Such an approach can reduce the amount of overshoot or overcorrection by a user driving the robot. - 03-14-2013 (see the corresponding sketch after the entries list)
20110022231Apparatuses, Systems and Methods for Automated Crop Picking - Automated apparatuses and related methods for scanning, spraying, pruning, and harvesting crops from plant canopies. The apparatuses include a support structure comprising a frame, a central vertical shaft, and at least one module support member capable of rotating around a plant canopy. The support member supports a plurality of movable arms, each arm having at least one detector for probing the plant canopy. Embodiments further comprise applicators and/or manipulators for spraying, pruning, and harvesting crops from within the plant canopy. The methods of the present invention include causing the moveable arms attached to the support structure to be extended into the plant canopy, searching for crops, and transmitting and/or storing the search data. Embodiments further comprise detaching crops from the plant canopy and transporting them to a receptacle, applying a controlled amount of material within the plant canopy, or pruning inside of the plant canopy.01-27-2011
20120116588 - ROBOT SYSTEM AND CONTROL METHOD THEREOF - A robot system and a control method thereof in which, when a robot is located in a docking region, the robot calculates a distance by emitting infrared rays and detecting ultrasonic waves oscillated from a charging station, measures a distance from the charging station and performs docking with the charging station. The distance between the robot and the charging station is precisely measured, thereby performing smooth and correct docking of the robot with the charging station. Further, the robot emits infrared rays only while performing docking with the charging station and thus reduces power consumption required for infrared ray emission, and wakes up a circuit in the charging station based on the infrared rays emitted from the robot and thus reduces power consumption of the charging station. - 05-10-2012 (see the corresponding sketch after the entries list)
20130166070 - OBTAINING FORCE INFORMATION IN A MINIMALLY INVASIVE SURGICAL PROCEDURE - Methods of and a system for providing force information for a robotic surgical system. The method includes storing first kinematic position information and first actual position information for a first position of an end effector; moving the end effector via the robotic surgical system from the first position to a second position; storing second kinematic position information and second actual position information for the second position; and providing force information regarding force applied to the end effector at the second position utilizing the first actual position information, the second actual position information, the first kinematic position information, and the second kinematic position information. Visual force feedback is also provided via superimposing an estimated position of an end effector without force over an image of the actual position of the end effector. Similarly, tissue elasticity visual displays may be shown. - 06-27-2013 (see the corresponding sketch after the entries list)
20120239196Natural Human to Robot Remote Control - The subject disclosure is directed towards controlling a robot based upon sensing a user's natural and intuitive movements and expressions. User movements and/or facial expressions are captured by an image and depth camera, resulting in skeletal data and/or image data that is used to control a robot's operation, e.g., in a real time, remote (e.g., over the Internet) telepresence session. Robot components that may be controlled include robot “expressions” (e.g., audiovisual data output by the robot), robot head movements, robot mobility drive operations (e.g., to propel and/or turn the robot), and robot manipulator operations, e.g., an arm-like mechanism and/or hand-like mechanism.09-20-2012
20100063629SYSTEM AND METHOD FOR RECIRCULATING PARTS - This invention relates to a system and method for feeding and recirculating parts for vision-based pickup. The system and method have a feeder that automatically recirculates parts that are not picked by a robot. The system has a feeder bowl, ramp and interchangeable picking plate, all of which may be vibrated to both feed parts and cause recirculation.03-11-2010
20110172822Companion Robot for Personal Interaction - A mobile robot guest for interacting with a human resident performs a room-traversing search procedure prior to interacting with the resident, and may verbally query whether the resident being sought is present. Upon finding the resident, the mobile robot may facilitate a teleconferencing session with a remote third party, or interact with the resident in a number of ways. For example, the robot may carry on a dialogue with the resident, reinforce compliance with medication or other schedules, etc. In addition, the robot incorporates safety features for preventing collisions with the resident; and the robot may audibly announce and/or visibly indicate its presence in order to avoid becoming a dangerous obstacle. Furthermore, the mobile robot behaves in accordance with an integral privacy policy, such that any sensor recording or transmission must be approved by the resident.07-14-2011
20120221144Disruptor Guidance System and Methods Based on Scatter Imaging - A system and method for guiding a disruptor robot in disruption of an explosive device. The system includes a source of penetrating radiation, having a coordinated position on the robot with respect to a disrupter coupled to robot, and at least one detector for detecting radiation produced by the source and scattered by the explosive device. An analyzer produces an image of the explosive device and facilitates identification of a disruption target of the explosive device. A controller positions the disruptor with respect to the explosive device so that the disruptor is aimed at the disruption target.08-30-2012
20100324735 - METHOD AND DEVICE FOR FINE POSITIONING OF A TOOL HAVING A HANDLING APPARATUS - The present invention relates to a method and a device for the machining of an object using a tool, in which the tool ( - 12-23-2010
20090118864 - METHOD AND SYSTEM FOR FINDING A TOOL CENTER POINT FOR A ROBOT USING AN EXTERNAL CAMERA - Disclosed is a method and system for finding a relationship between a tool-frame of a tool attached at a wrist of a robot and robot kinematics of the robot using an external camera. The position and orientation of the wrist of the robot define a wrist-frame for the robot that is known. The relationship of the tool-frame and/or the Tool Center Point (TCP) of the tool is initially unknown. For an embodiment, the camera captures an image of the tool. An appropriate point on the image is designated as the TCP of the tool. The robot is moved such that the wrist is placed into a plurality of poses. Each pose of the plurality of poses is constrained such that the TCP point on the image falls within a specified geometric constraint (e.g. a point or a line). A TCP of the tool relative to the wrist frame of the robot is calculated as a function of the specified geometric constraint and as a function of the position and orientation of the wrist for each pose of the plurality of poses. An embodiment may define the tool-frame relative to the wrist frame as the calculated TCP relative to the wrist frame. Other embodiments may further refine the calibration of the tool-frame to account for tool orientation and possibly for a tool operation direction. An embodiment may calibrate the camera using a simplified extrinsic technique that obtains the extrinsic parameters of the calibration, but not other calibration parameters. - 05-07-2009 (see the corresponding sketch after the entries list)
20090265036 - ROBOT OPERATOR CONTROL UNIT CONFIGURATION SYSTEM AND METHOD - A unified framework is provided for building common functionality into diverse operator control units. A set of tools is provided for creating controller configurations for varied robot types. Preferred controllers do one or more of the following: allow uploading of configuration files from a target robot, adhere to common user interface styles and standards, share common functionality, allow extendibility for unique functionality, provide flexibility for rapid prototype design, and allow dynamic communication protocol switching. Configuration files may be uploaded from robots to configure their operator control units. The files may include scene graph control definitions; instrument graphics; control protocols; or mappings of control functions to scene graphics or control inputs. - 10-22-2009
20110301758 - METHOD OF CONTROLLING ROBOT ARM - 12-08-2011
20110301759GRAPHICAL INTERFACE FOR A REMOTE PRESENCE SYSTEM - A robot system that includes a robot and a remote station. The remote station may be a personal computer coupled to the robot through a broadband network. A user at the remote station may receive both video and audio from a camera and microphone of the robot, respectively. The remote station may include a display user interface that has a variety of viewable fields and selectable buttons.12-08-2011
20090118865ROBOT - Described herein is a robot having a camera mount that is movable along a curved surface of an upper part of the head, on a front side thereof, and two units of cameras that are mounted in the camera mount. The cameras are disposed such that the cameras are laterally separated from each other at an interval substantially equivalent to a lateral width of the head, respective extremities of the cameras are substantially flush with the front end of the head, and the cameras are positioned in close proximity of the upper end of the robot.05-07-2009
20110288682Telepresence Robot System that can be Accessed by a Cellular Phone - A robot system with a robot that has a camera, monitor, a microphone and a speaker. A communication link can be established with the robot through a cellular phone. The link may include an audio only communication. Alternatively, the link may include audio and video communication between the cellular phone and the robot. The phone can transmit its resolution to the robot and cause the robot to transmit captured images at the phone resolution. The user can cause the robot to move through input on the cellular phone. For example, the phone may include an accelerometer that senses movement, and movement commands are then sent to the robot to cause a corresponding robot movement. The phone may have a touch screen that can be manipulated by the user to cause robot movement and/or camera zoom.11-24-2011
20110190936 - Portable Power Tool - The present invention relates to a portable power tool ( - 08-04-2011
20120109378ROBOT REFRIGERATOR AND SYSTEM HAVING THE SAME - Disclosed are a robot refrigerator and a robot refrigerator system. The robot refrigerator can be remotely controlled. The robot refrigerator generates image information from a surrounding image and transmits the generated image information to a wireless communication device. Then, the wireless communication device remotely controls the robot refrigerator, or monitors or remotely controls the robot refrigerator in real time, so that the robot refrigerator can easily avoid an obstacle to thus minimize a movement time of the robot refrigerator. Thus, user convenience and system reliability can be improved.05-03-2012
20120109377AUTOFOCUS AND/OR AUTOSCALING IN TELESURGERY - Robotic, telerobotic, and/or telesurgical devices, systems, and methods take advantage of robotic structures and data to calculate changes in the focus of an image capture device in response to movement of the image capture device, a robotic end effector, or the like. As the size of an image of an object shown in the display device varies with changes in a separation distance between that object and the image capture device used to capture the image, a scale factor between a movement command input may be changed in response to moving an input device or a corresponding master/slave robotic movement command of the system. This may enhance the perceived correlation between the input commands and the robotic movements as they appear in the image presented to the system operator.05-03-2012
20100100241AUTONOMOUS FOOD AND BEVERAGE DISTRIBUTION MACHINE - The invention proposes an autonomous mobile robotic device in the form of an integrated machine for producing beverages or liquid comestibles.04-22-2010
20090177323COMPANION ROBOT FOR PERSONAL INTERACTION - A mobile robot guest for interacting with a human resident performs a room-traversing search procedure prior to interacting with the resident, and may verbally query whether the resident being sought is present. Upon finding the resident, the mobile robot may facilitate a teleconferencing session with a remote third party, or interact with the resident in a number of ways. For example, the robot may carry on a dialogue with the resident, reinforce compliance with medication or other schedules, etc. In addition, the robot incorporates safety features for preventing collisions with the resident; and the robot may audibly announce and/or visibly indicate its presence in order to avoid becoming a dangerous obstacle. Furthermore, the mobile robot behaves in accordance with an integral privacy policy, such that any sensor recording or transmission must be approved by the resident.07-09-2009
20120296473ROBOT ARM AND DETECTING DEVICE HAVING SAME - A robot arm includes a support arm, an adjusting rod and a detecting unit. The adjusting rod rotatably extends through the support arm. The detecting unit is attached to the adjusting rod. The detecting unit includes an image capture device and a probe device. The image capture device captures images of a workpiece. The probe device includes a driving device and a probe. The driving device may drive the probe to move between a first position where the probe does not visually prevent images of the workpiece being captured by the image capture device, and a second position where the probe does block the images of the workpiece being captured by the image capture device.11-22-2012
20090287353SYSTEM AND METHOD FOR CONTROLLING A BIPEDAL ROBOT VIA A COMMUNICATION DEVICE - A system for controlling a bipedal robot via a communication device. The system acquires a mapping data and a current location of the bipedal robot via a Global Positioning System (GPS), determines a route on the mapping data, and directs movement of the bipedal robot until it reaches a preset destination. A method for controlling the robot and a storage device containing computer instructions for execution of the method are also provided.11-19-2009
20110172821 - AUTOMATED TIRE INFLATION SYSTEM - A system and method for automatically inflating tires mounted on a vehicle without requiring the occupants of the vehicle to leave the interior of the vehicle. In one aspect, the present invention is directed to a system for automatically inflating a tire of a vehicle. The system determines a location of a valve stem of the tire and includes a robotic arm for inflating the tire. The robotic arm attaches to the located valve stem based on the determined valve stem location. The robotic arm supplies air to the tire from an air supply. The valve stem is located and the robotic arm attaches to the valve stem based on the determined valve stem location, inflates the tire and detaches from the tire upon determining that a predetermined tire air pressure is attained. - 07-14-2011
20100292840 - FLEXIBLE TWO-WHEELED SELF-BALANCING ROBOT SYSTEM AND ITS MOTION CONTROL METHOD - A flexible two-wheeled self-balancing robot system and its motion control method include a main controller - 11-18-2010
20110208358APPARATUS FOR SPLASH ZONE OPERATIONS - System for maintenance and inspection of structures located in hard to reach places, using a remote controlled arm that consists of arrangement for fixing said remote controlled arm to the structure, said remote controlled arm consists of at least two joints, said remote controlled arm has the ability to change working equipment, said remote controlled arm has a camera, said remote controlled arm is controlled from a control centre.08-25-2011
20130218344CONTROL METHOD FOR CLEANING ROBOTS - An embodiment of the invention provides a control method of a cleaning robot. The method includes steps of moving the cleaning robot according to a first direction; keeping moving the cleaning robot according to the first direction when a light detector of the cleaning robot detects a light beam; moving the cleaning robot for a predetermined distance and then stopping the cleaning robot when the light detector does not detect the light beam; and moving the cleaning robot in a second direction.08-22-2013
20080249663Robot control information generator and robot - A robot control information generator generates control information for operating a robot equipped with a camera and a hand to grasp an object based on a two-dimensional code on the object. The two-dimensional code includes position identifying patterns and an information pattern, the position within the two-dimensional code of each of the position-identifying patterns is specified beforehand, and the information pattern is generated by encoding of information. The robot control information generator comprises an image input unit, a pattern detection unit, a position/posture calculation unit, a decoding device, and a control information-generating unit which generates the control information based on the decoded information decoded by the decoding device and the position/posture information calculated by the position/posture calculation unit.10-09-2008
20120296474 - ROBOT SYSTEM - A robot system according to an embodiment includes a robot, a switching determination unit, and a rearrangement instruction unit. The switching determination unit performs determination of switching between the operation of transferring the workpiece and the operation of rearranging the workpiece based on the state of transferring the workpiece by the robot. The rearrangement instruction unit instructs the robot to rearrange the workpiece. - 11-22-2012
20080228320Robot - To provide a robot whose degree of freedom of design is not limited, and which has simple structure and further reduces load of an actuator of a neck part, the present invention provides a robot at least including a head part, a body part, and a neck link which connects the head part and the body part, wherein a surrounding object distance measurement means is provided adjacently to the neck link and in an upper portion of the body part between the head part and the body part, and a distance scanning field of the surrounding object distance measurement means is provided in parallel with a horizontal plane.09-18-2008
20090265035Robotic Device Tester - A system, method, and device may include software and hardware which simplify and quicken configuration of the system for testing a device, enhance testing procedures which may be performed, and provide data via which to easily discern a cause and nature of an error which may result during testing. A camera may capture still images of a display screen of a tested device and another camera may capture video images of the tested device and a partner device. A wizard may be used to generate a configuration file based on one previously generated for a similar device. A mount for a tested device may be structured so that: it is suitable for mounting thereon a plurality of differently structured devices; and adjustments in a vertical direction and a horizontal direction in a plane and adjustments of an angle of the device relative to the plane may be easily made.10-22-2009
20120143373AUTOMATED STEERING WHEEL LEVELING SYSTEM AND METHOD - The present invention provides an automated steering wheel leveling system and method. Particularly, the automated steering wheel leveling system includes a machine vision, a plurality of motor cylinders, a motor, and a robot, each operated by a process PC. The machine vision photographs a steering wheel to obtain position information of the steering wheel and determines a stroke of a motor cylinder and a grip position of a gripper using the position information. The plurality of motor cylinders move a plurality of grippers to steering wheel to secure the steering wheel. The motor rotates the steering wheel in order to adjust a zero-point of the steering wheel. The robot then moves the machine vision, the motor cylinder, and the motor to the steering wheel to align a shaft of the servo motor with a shaft of the steering wheel.06-07-2012
20120143375MILKING ROBOT AND METHOD FOR TEAT CUP ATTACHMENT - A milking robot for teat cup attachment includes a robot arm having a gripper for holding at least one teat cup at a time; an image recording device mounted on the robot arm and provided to record at least one image of the teats of a milking animal; and a control device provided to control the robot arm to position the teat cup at a teat of the milking animal based on the at least one image of her teats. The image recording device is, before being provided to record the at least one image of the teats of the milking animal, provided to record at least one image of her hind legs; and the control device is, before being provided to control the robot arm to attach the teat cup to the teat of the milking animal, provided to control the robot arm to move the teat cup between her hind legs, from her rear and towards her udder, based on the at least one image of her hind legs. The milking robot further including a pivoting device for pivoting the image recording device with respect to the gripper of the robot arm between the recording of the at least one image of the hind legs of the milking animal and the recording of the at least one image of her teats.06-07-2012
20080234866MASTER-SLAVE MANIPULATOR SYSTEM - In a master-slave manipulator system, manipulation device can be manipulated intuitively even when clutch manipulation is performed. A master-slave manipulator system includes: mode switching device for switching between a master-slave mode, in which the slave manipulator is controlled, and an observation device visual field tracking clutch mode, in which transmission of an operation command to the slave manipulator from the manipulation device is cut off to move the manipulation device to an optional position and orientation; a switching unit control section that reads a signal of the mode switching device to forward a mode signal to the manipulation device control section; and a visual field transform section that forwards a third control command to the manipulator control section and forwards a fourth control command to the visual field change control section on the basis of an operation command read by the manipulation device control section at the time of the observation device visual field tracking clutch mode so as to make an agreement between a direction of motion of an image of the slave manipulator displayed on the display device and a direction of manipulation of the manipulation device.09-25-2008
20090138123Robotic CBRNE Automated Deployment, Detection, and Reporting System - A system having a plurality of networked detectors for detecting chemical, biological, or radiological, nuclear, or explosive agents is disclosed. The system includes a plurality of remote units, wherein each remote unit includes a robotic base, a lift, a sensor module, transceiver, navigation system, and power source. The remote units communicate with a base station, which receives data from the remotes and determines, based on the data, if an alarm condition exists.05-28-2009
20100004784 - Apparatus and method for effectively transmitting image through stereo vision processing in intelligent service robot system - A data transmission apparatus of an intelligent robot system and a method thereof are provided. The data transmission apparatus includes a vision processor, a communicating unit, and a controller. The vision processor collects images captured through a camera, and performs an image process on the collected image to minimize a quantity of information about unnecessary regions in the collected image. The communicating unit communicates with the robot server, transmits the processed image data from the vision processor to the robot server, and receives corresponding result data from the robot server. The controller controls the image process and the transmission of the processed image data in the vision processor, and a corresponding operation of the robot terminal performed according to result data received from the robot server. - 01-07-2010
20130218342CONTROL METHOD FOR CLEANING ROBOTS - An embodiment of the invention provides a control method of a cleaning robot. The method includes the steps of: forming a cleaning area according to at least three points which are selected from a light generating device, a charging station or an obstacle; moving the cleaning robot along an outer of the cleaning area from a first position; recording a first cleaning route when the cleaning robot returns back to the first position; moving the cleaning robot to a second position and planning a second cleaning route according to the first cleaning route; and moving the cleaning robot along the second cleaning route.08-22-2013
20090143913IMAGE-BASED SELF-DIAGNOSIS APPARATUS AND METHOD FOR ROBOT - An image-based self-diagnosis apparatus and method for a robot determines the abnormality of a driving unit of a mobile robot by using a camera and reporting to the user in real time. The image-based self-diagnosis method may include: capturing a reference image at the current location and storing the captured reference image; capturing a comparison image after making at least one of linear moves and rotational moves by a preset amount, and storing the captured comparison image; and determining the abnormality of the mobile robot by comparing the stored reference image and the comparison image with each other.06-04-2009
20090024251 - Method and apparatus for estimating pose of mobile robot using particle filter - A method and apparatus for estimating the pose of a mobile robot using a particle filter is provided. The apparatus includes an odometer which detects a variation in the pose of a mobile robot, a feature-processing module which extracts at least one feature from an upward image captured by the mobile robot, and a particle filter module which determines current poses and weights of a plurality of particles by applying the mobile robot pose variation detected by the odometer and the feature extracted by the feature-processing module to previous poses and weights of the particles. - 01-22-2009 (see the corresponding sketch after the entries list)
20090082905METHOD AND APPARATUS FOR TRANSFORMING COORDINATE SYSTEMS IN A TELEMANIPULATION SYSTEM - In a telemanipulation system for manipulating objects located in a workspace at a remote worksite by an operator from an operator's station, such as in a remote surgical system, the remote worksite having a manipulator with an end effector for manipulating an object at the workspace, such as a body cavity, a controller including a hand control at the control operator's station for remote control of the manipulator, an image capture device, such as a camera, and image output device for reproducing a viewable real-time image, the improvement wherein a position sensor associated with the image capture device senses position relative to the end effector and a processor transforms the viewable real-time image into a perspective image with correlated manipulation of the end effector by the hand controller such that the operator can manipulate the end effector and the manipulator as if viewing the workspace in true presence. Image transformation according to the invention includes translation, rotation and perspective correction.03-26-2009
20090198381METHODS FOR REPURPOSING TEMPORAL-SPATIAL INFORMATION COLLECTED BY SERVICE ROBOTS - Robots and methods implemented therein implement an active repurposing of temporal-spatial information. A robot can be configured to analyze the information to improve the effectiveness and efficiency of the primary service function that generated the information originally. A robot can be configured to use the information to create a three dimensional (3D) model of the facility, which can be used for a number of functions such as creating virtual tours of the environment, or porting the environment into video games. A robot can be configured to use the information to recognize and classify objects in the facility so that the ensuing catalog can be used to locate selected objects later, or to provide a global catalog of all items, such as is needed for insurance documentation of facility effects.08-06-2009
20090055024Robotic arm and control system - A robotic arm and control system includes a robotic arm which moves in response to one or more command signals. One or more “active” fiducials are located on the arm, each of which emits its own light. A 3D camera having an associated field-of-view is positioned such that at least one fiducial and a target object to be manipulated are in the FOV. To determine their spatial positions, the arm fiducials are activated and the target object is preferably illuminated with a scanning laser; the camera produces output signals which vary with the spatial locations of the fiducials and target object. A controller receives the output signals and uses the spatial position information as feedback to continuously guide the arm towards the target object. Multiple active fiducials may be employed, each having respective characteristics with which they can be differentiated.02-26-2009
20090143912SYSTEM AND METHOD FOR GRAPHICALLY ALLOCATING ROBOT'S WORKING SPACE - System and method for graphically allocating robot's working space are provided. The system includes an image extractor, a task-allocating server and a robot. A graphic user interface (GUI) of the task-allocating server includes a robot's working scene area, a space attribute allocating area and a robot's task area. Thus, a user assigns one certain space area in the robot's working scene area with a “wall” attribute, or another space area with a “charging station” attribute. Meanwhile, by using the GUI, the user directly assigns the robot to execute a specific task at a certain area. Hence, the user or remote controller facilitates the robot to provide safer and more effective service through his/her environment recognition.06-04-2009
20090204260 - Door Opener Arrangement for Use with an Industrial Robot - A door opener arrangement for a robot coating device, used for detecting a position of a part ( - 08-13-2009
20090105881Medical Tele-Robotic System - A robotic system that includes a remote controlled robot. The robot may include a camera, a monitor and a holonomic platform all attached to a robot housing. The robot may be controlled by a remote control station that also has a camera and a monitor. The remote control station may be linked to a base station that is wirelessly coupled to the robot. The cameras and monitors allow a care giver at the remote location to monitor and care for a patient through the robot. The holonomic platform allows the robot to move about a home or facility to locate and/or follow a patient.04-23-2009
20090240371Remote presence system mounted to operating room hardware - A robot system that includes a remote station and a robot face. The robot face includes a camera that is coupled to a monitor of the remote station and a monitor that is coupled to a camera of the remote station. The robot face and remote station also have speakers and microphones that are coupled together. The robot face may be coupled to a boom. The boom can extend from the ceiling of a medical facility. Alternatively, the robot face may be attached to a medical table with an attachment mechanism. The robot face and remote station allows medical personnel to provide medical consultation through the system.09-24-2009
20100152897 - METHOD & APPARATUS FOR CONTROLLING THE ATTITUDE OF A CAMERA ASSOCIATED WITH A ROBOTIC DEVICE - A robot movement control device is connected to a communications network in a remote location relative to a robotic device that is also connected to the communications network. The robot movement control device is an electronic device with a video display for displaying a real-time video image sent to it by a camera associated with the robot. A robot movement control mechanism is included in the robot control device and robot movement control commands are generated by the movement control mechanism which commands include speed and directional information. The control commands are sent by the robot control device over the network to the robot which uses the commands to adjust its speed and direction of movement of the robot. A relationship between the motion of the robot and the attitude of the camera associated with the robot is established and used in conjunction with the detected motion of the robot to automatically adjust the attitude of the video camera associated with the robot. - 06-17-2010
20100274390 - METHOD AND SYSTEM FOR THE HIGH-PRECISION POSITIONING OF AT LEAST ONE OBJECT IN A FINAL LOCATION IN SPACE - The invention relates to a method and a system for the high-precision positioning of at least one object in a final location in space. An object ( - 10-28-2010
20100185327MOVABLE ROBOT - A technique to wholly recognize the surrounding environment may be provided by excluding unknown environment which arises due to parts of a body of a robot hindering the sight of the robot during operations.07-22-2010
20120197439INTERFACING WITH A MOBILE TELEPRESENCE ROBOT - A telepresence robot may include a drive system, a control system, an imaging system, and a mapping module. The mapping module may access a plan view map of an area and tags associated with the area. In various embodiments, each tag may include tag coordinates and tag information, which may include a tag annotation. A tag identification system may identify tags within a predetermined range of the current position and the control system may execute an action based on an identified tag whose tag information comprises a telepresence robot action modifier. The telepresence robot may rotate an upper portion independent from a lower portion. A remote terminal may allow an operator to control the telepresence robot using any combination of control methods, including by selecting a destination in a live video feed, by selecting a destination on a plan view map, or by using a joystick or other peripheral device.08-02-2012
20100185328Robot and control method thereof - Disclosed herein are a robot that supplies a projector service according to a user's context and a controlling method thereof. The robot includes a user detection unit detecting a user; a user recognition unit recognizing the user; an object recognition unit recognizing an object near the user; a position perception unit perceiving relative positions of the object and the user; a context awareness unit perceiving the user's context based on information on the user, the object and the relative positions between the user and the object; and a projector supplying a projector service corresponding to the user's context.07-22-2010
20100262290Data matching apparatus, data matching method and mobile robot - A three-dimensional data matching system is disclosed. Data matching is performed by merging distance information and image information. Therefore, matching accuracy is improved even if a sensor with relatively low sensitivity is used. Matching data generated as a result of matching range data and CAD data is projected onto an image captured by a camera, an effective edge is extracted from the image, and an error of the matching data is corrected based on the effective edge, thereby improving matching accuracy.10-14-2010
20080215185Unmanned ground robotic vehicle having an alternatively extendible and retractable sensing appendage - An unmanned robotic vehicle is capable of sensing an environment at a location remote from the immediate area of the vehicle frame. The unmanned robotic vehicle includes a retractable appendage with a sensing element. The sensing element can include a camera, chemical sensor, optical sensor, force sensor, or the like.09-04-2008
20100138042ROBOT SYSTEM - Teaching images are acquired at a plurality of separate teaching points on a running route extending from a running start position to a goal position, respectively, under a first light environmental condition and a light environmental condition different from the first light environmental condition, and the teaching images are stored. A present teaching image serving as a target for a robot body in a running direction at present is selected from the stored teaching images. A driving mechanism is controlled so as to increase the matching degree between the present teaching image and an actual image taken by a camera.06-03-2010
20130218341CONTROL METHOD FOR CLEANING ROBOTS - An embodiment of the invention provides a control method of a cleaning robot with a non-omnidirectional light detector. The method includes the steps of: detecting a light beam via the non-omnidirectional light detector; stopping the cleaning robot and spinning the non-omnidirectional light detector when the non-omnidirectional light detector detects the light beam; stopping the spinning of the non-omnidirectional light detector and estimating a first spin angle when the non-omnidirectional light detector does not detect the light beam; and adjusting a moving direction of the cleaning robot according to the first spin angle.08-22-2013
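Where the abstract above describes steering the cleaner from the angle spun before the beam is lost, a minimal sketch of that heading-adjustment idea is given below. The half-field-of-view correction, the function name, and the numeric values are illustrative assumptions, not details taken from the patent.

```python
# Hedged sketch: spin a directional detector until the beam is lost, then
# steer using the recorded spin angle. FOV correction is an assumption.
def adjust_heading(spin_angle_deg: float, detector_fov_deg: float = 30.0) -> float:
    """Return a steering correction (degrees) from the first spin angle.

    Assumes the beam leaves the detector's field of view once the detector
    has rotated roughly (beam bearing + FOV/2) away from the beam axis.
    """
    estimated_beam_bearing = spin_angle_deg - detector_fov_deg / 2.0
    return estimated_beam_bearing  # turn the robot by roughly this amount


if __name__ == "__main__":
    # Beam lost after a 40-degree spin; with a 30-degree FOV the beam is
    # estimated to lie about 25 degrees off the current heading.
    print(adjust_heading(40.0))
```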
20130218343CONTROL METHOD FOR CLEANING ROBOTS - An embodiment of the invention provides a control method for a cleaning robot with a quasi-omnidirectional (non-omnidirectional) light detector and a directional light detector. The method includes: rotating the non-omnidirectional light detector when it detects a light beam; stopping the spinning of the non-omnidirectional light detector and estimating a rotation angle when it no longer detects the light beam; determining a rotation direction according to the rotation angle; rotating the cleaning robot according to the rotation direction; and stopping the rotation of the cleaning robot when the directional light detector detects the light beam.08-22-2013
20090281662Simulator for visual inspection apparatus - A simulator for a visual inspection apparatus is provided. The apparatus is equipped with a robot having an arm and a camera attached to the tip end of the arm, the camera inspecting points of a workpiece. Using 3D profile data of the workpiece, information on the camera lenses, and operational data of the robot, imaging is simulated for a plurality of inspection points on the workpiece. To allow the camera to image these inspection points, a position and an attitude of the tip end of the robot arm are obtained. Based on the obtained position and attitude, it is determined whether or not the imaging is possible. When the imaging is possible, installation-allowed positions of the robot are decided and output as candidate positions for actually installing the robot.11-12-2009
20120143374ROBOT ACTION BASED ON HUMAN DEMONSTRATION - Embodiments of the invention provide an approach for reproducing a human action with a robot. The approach includes receiving data representing motions and contact forces of the human as the human performs the action. The approach further includes approximating, based on the motions and contact forces data, the center of mass (CoM) trajectory of the human in performing the action. Finally, the approach includes generating a planned robot action for emulating the designated action by solving an inverse kinematics problem having the approximated human CoM trajectory as a hard constraint and the motion capture data as a soft constraint.06-07-2012
20100305756Image taking system and electronic-circuit-component mounting machine - An image taking system including: (a) a lighting device capable of changing a light emission time to various time length values; (b) an image taking device configured to take an image of a subject portion while light is being emitted by the lighting device; (c) a subject-portion moving device configured to move the subject portion relative to the image taking device, and capable of changing a movement velocity of the subject portion relative to the image taking device, to various velocity values; and (d) a control device configured, during movement of the subject portion by the subject-portion moving device, to cause the lighting device to emit the light for one of the time length values as the light emission time and to cause the image taking device to take the image, and is configured to control the movement velocity, such that an amount of the movement of the subject portion for the above-described one of the time length values is not larger than a predetermined movement amount.12-02-2010
20090240372EXTERNAL SYSTEM FOR ROBOTIC ACCURACY ENHANCEMENT - The metrology system (the system) actively determines the 6 Degree of Freedom (6-DOF) pose of a motion device such as, but not limited to, an industrial robot employing an end of arm tool (EOAT). The system uses laser pointing devices without any inherent ranging capability, in conjunction with EOAT-mounted targets, to actively determine the pose of the EOAT at distinct work positions of at least one motion device.09-24-2009
20110245975MILKING APPARATUS AND PROCESS - The present invention relates to milking apparatus. The milking apparatus comprises sensor apparatus (10-06-2011
20110046785METHOD AND DEVICE FOR THE REMOVAL OF A LEAF FROM A CROP - Method and device for the removal of a part of a crop, such as a leaf (02-24-2011
20110245974ROBOT DEVICE, METHOD OF CONTROLLING ROBOT DEVICE, AND PROGRAM - There is provided a robot device including an instruction acquisition unit that acquires an order for encouraging a robot device to establish joint attention on a target from a user, a position/posture estimation unit that estimates a position and posture of an optical indication device, which is operated by the user to indicate the target by irradiation of a beam, in response to acquisition of the order, and a target specifying unit that specifies a direction of the target indicated by irradiation of the beam based on an estimation result of the position and posture and specifies the target on an environment map representing a surrounding environment based on a specifying result of the direction.10-06-2011
20110245973PROTOCOL FOR A REMOTELY CONTROLLED VIDEOCONFERENCING ROBOT - A robotic system that includes a robot and a remote station. The remote station can generate control commands that are transmitted to the robot through a broadband network. The control commands can be interpreted by the robot to induce action such as robot movement or focusing a robot camera. The robot can generate reporting commands that are transmitted to the remote station through the broadband network. The reporting commands can provide positional feedback or system reports on the robot.10-06-2011
20090312871SYSTEM AND METHOD FOR CALCULATING LOCATION USING A COMBINATION OF ODOMETRY AND LANDMARKS - Disclosed is a system and method for calculating a location in real time using a combination of odometry and artificial landmarks. The system for calculating a location comprises: a landmark detection unit detecting an image coordinates value of an artificial landmark, corresponding to a location in a two-dimensional image coordinate system with respect to a mobile robot, from an image obtained by photographing a specific space where the artificial landmarks are provided; a landmark identification unit comparing a predicted image value of the artificial landmark (obtained by converting a location coordinates value of the artificial landmark, with respect to a location coordinates value corresponding to a location in the actual three-dimensional spatial coordinate system of the mobile robot, into an image coordinates value in the two-dimensional image coordinate system) with the image coordinates value detected by the landmark detection unit, in order to detect the location coordinates value of the artificial landmark; a first location calculation unit calculating a current location coordinates value of the mobile robot using a predetermined location calculation algorithm based on the image coordinates value detected by the landmark detection unit and the location coordinates value detected by the landmark identification unit; a second location calculation unit calculating a current location coordinates value of the mobile robot using a predetermined location calculation algorithm based on odometry information of the mobile robot; and a main control unit updating the current location coordinates value of the mobile robot using the value calculated by the first location calculation unit when that value exists, or using the value obtained from the second location calculation unit when it does not.12-17-2009
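The landmark-identification and fallback logic described above can be illustrated with a small sketch: a known landmark is projected into the image with a pinhole camera model, matched against the detected image coordinates, and the landmark-based fix is preferred over odometry when it exists. The pinhole model, the tolerance, and all names are assumptions for illustration, not the patented algorithm.

```python
import numpy as np

def project_landmark(p_world, R, t, fx, fy, cx, cy):
    """Project a 3D landmark position (world frame) into pixel coordinates.

    R (3x3) and t (3,) describe the world-to-camera transform; fx, fy, cx, cy
    are pinhole intrinsics. All values are illustrative.
    """
    p_cam = R @ np.asarray(p_world, dtype=float) + t
    u = fx * p_cam[0] / p_cam[2] + cx
    v = fy * p_cam[1] / p_cam[2] + cy
    return np.array([u, v])

def identify_landmark(detected_uv, predicted_uv, tol_px=15.0):
    """Accept a detection if it lies close to the predicted image point."""
    return float(np.linalg.norm(np.asarray(detected_uv) - predicted_uv)) < tol_px

def fuse_location(landmark_fix, odometry_fix):
    """Selection rule from the abstract: prefer the landmark-based estimate
    when it exists, otherwise fall back to the odometry-based estimate."""
    return landmark_fix if landmark_fix is not None else odometry_fix
```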
20110082586HANDLING APPARATUS, CONTROL DEVICE, CONTROL METHOD, AND PROGRAM - A handling apparatus having a belt conveyor (04-07-2011
20100185329VISION AIDED CASE/BULK PALLETIZER SYSTEM - The vision aided case/bulk palletizer system of this invention is a process and apparatus for: providing a camera positioned over the dunnage supply line; initiating a frame grab of the dunnage supply line with the camera; using the frame grab to determine the position of the dunnage; using the frame grab to position the programmable robot over the dunnage; feeding the dunnage from the dunnage supply line to the load building area; and controlling the steps with the single programmable robot, microprocessor and software. This system provides for transfer of the dunnage when the position of the dunnage is skewed by using the frame grab to position the programmable robot over the skewed dunnage. In another embodiment, the camera is used to determine any void in the tier of product during the build of a tier of product, and also provides for error-proofing the transfer of dunnage.07-22-2010
20100131103SERVER CONNECTIVITY CONTROL FOR TELE-PRESENCE ROBOT - A robot system with a robot that has a camera and a remote control station that can connect to the robot. The connection can include a plurality of privileges. The system further includes a server that controls which privileges are provided to the remote control station. The privileges may include the ability to control the robot, join a multi-cast session, and receive audio/video from the robot. The privileges can be established and edited through a manager control station. The server may contain a database that defines groups of remote control stations that can be connected to groups of robots. The database can be edited to vary the stations and robots within a group. The system may also allow connectivity with a remote control station only within a user-programmable time window.05-27-2010
20100131102SERVER CONNECTIVITY CONTROL FOR TELE-PRESENCE ROBOT - A robot system with a robot that has a camera and a remote control station that can connect to the robot. The connection can include a plurality of privileges. The system further includes a server that controls which privileges are provided to the remote control station. The privileges may include the ability to control the robot, join a multi-cast session, and receive audio/video from the robot. The privileges can be established and edited through a manager control station. The server may contain a database that defines groups of remote control stations that can be connected to groups of robots. The database can be edited to vary the stations and robots within a group. The system may also allow connectivity with a remote control station only within a user-programmable time window.05-27-2010
20100070078Apparatus and method for building map - An apparatus and method for building a map are provided. A path is generated on the basis of the degrees of uncertainty of features extracted from an image obtained while a mobile robot explores unknown surroundings, and the mobile robot travels along the generated path. Generating the path from the degrees of uncertainty of the features may increase the accuracy of the mobile robot's feature map and of its self-localization.03-18-2010
20110082585METHOD AND APPARATUS FOR SIMULTANEOUS LOCALIZATION AND MAPPING OF MOBILE ROBOT ENVIRONMENT - Techniques that optimize performance of simultaneous localization and mapping (SLAM) processes for mobile devices, typically a mobile robot. In one embodiment, erroneous particles are introduced to the particle filtering process of localization. Monitoring the weights of the erroneous particles relative to the particles maintained for SLAM provides a verification that the robot is localized and detection that it is no longer localized. In another embodiment, cell-based grid mapping of a mobile robot's environment also monitors cells for changes in their probability of occupancy. Cells with a changing occupancy probability are marked as dynamic and updating of such cells to the map is suspended or modified until their individual occupancy probabilities have stabilized. In another embodiment, mapping is suspended when it is determined that the device is acquiring data regarding its physical environment in such a way that use of the data for mapping will incorporate distortions into the map, as for example when the robotic device is tilted.04-07-2011
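The delocalization check described above (comparing the weights of deliberately erroneous particles with those of the regular SLAM particles) might look roughly like the following sketch; the ratio test, the threshold, and the names are illustrative assumptions rather than the patented method itself.

```python
import numpy as np

def is_localized(slam_weights, decoy_weights, ratio_threshold=5.0) -> bool:
    """Consider the robot localized while the regular SLAM particles carry
    substantially more weight than randomly placed decoy particles.

    slam_weights: weights of the particles maintained for SLAM.
    decoy_weights: weights of the injected erroneous particles.
    """
    slam_mean = float(np.mean(slam_weights))
    decoy_mean = float(np.mean(decoy_weights)) + 1e-12  # avoid divide-by-zero
    return slam_mean / decoy_mean > ratio_threshold
```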
20110098859ROBOT SYSTEM AND WORKPIECE PICKING METHOD - A robot system includes a robot. A robot control device is configured to control an operation of the robot, and includes a workpiece shape memory configured to store a shape of workpieces. A shape sensor is configured to detect shape information about the workpieces. A target workpiece detector is configured to detect a graspable workpiece based on the shape information detected by the shape sensor. A grasping information memory is configured to store a grasping position indicating which portion of the graspable workpiece is to be grasped by the robot. A grasping operation controller is configured to control the robot to grasp the graspable workpiece detected by the target workpiece detector and to pick the grasped workpiece. A disturbing operation controller is configured to control, if no graspable workpiece is detected by the target workpiece detector, the robot to perform a workpiece disturbing operation.04-28-2011
20110071679EMBEDDED DIAGNOSTIC, PROGNOSTIC, AND HEALTH MANAGEMENT SYSTEM AND METHOD FOR A HUMANOID ROBOT - A robotic system includes a humanoid robot with multiple compliant joints, each moveable using one or more actuators and having sensors for measuring control and feedback data. A distributed controller controls the joints and other integrated system components over multiple high-speed communication networks. Diagnostic, prognostic, and health management (DPHM) modules are embedded within the robot at the various control levels. Each DPHM module measures, controls, and records DPHM data for the respective control level/connected device in a location that is accessible over the networks or via an external device. A method of controlling the robot includes embedding a plurality of the DPHM modules within multiple control levels of the distributed controller, using the DPHM modules to measure DPHM data within each of the control levels, and recording the DPHM data in a location that is accessible over at least one of the high-speed communication networks.03-24-2011
20120303160COMPANION ROBOT FOR PERSONAL INTERACTION - A mobile robot guest for interacting with a human resident performs a room-traversing search procedure prior to interacting with the resident, and may verbally query whether the resident being sought is present. Upon finding the resident, the mobile robot may facilitate a teleconferencing session with a remote third party, or interact with the resident in a number of ways. For example, the robot may carry on a dialogue with the resident, reinforce compliance with medication or other schedules, etc. In addition, the robot incorporates safety features for preventing collisions with the resident; and the robot may audibly announce and/or visibly indicate its presence in order to avoid becoming a dangerous obstacle. Furthermore, the mobile robot behaves in accordance with an integral privacy policy, such that any sensor recording or transmission must be approved by the resident.11-29-2012
20110046784ASYMMETRIC STEREO VISION SYSTEM - The different illustrative embodiments provide an apparatus that includes an autonomous vehicle, a modular navigation system, and an asymmetric vision module. The modular navigation system is coupled to the autonomous vehicle. The asymmetric vision module is configured to interact with the modular navigation system.02-24-2011
20110040409ROBOTIC VEHICLE WITH DRIVE MEANS AND METHOD FOR ACTIVATING DRIVE MEANS - The invention relates to a robotic vehicle (02-17-2011
20120277914Autonomous and Semi-Autonomous Modes for Robotic Capture of Images and Videos - The subject disclosure is directed towards a set of autonomous and semi-autonomous modes for a robot by which the robot captures content (e.g., still images and video) from a location such as a house. The robot may produce a summarized presentation of the content (a “botcast”) that is appropriate for a specific scenario, such as an event, according to a specified style. Modes include an event mode where the robot may interact with and simulate event participants to provide desired content for capture. A patrol mode operates the robot to move among locations (e.g., different rooms) to capture a panorama (e.g., 360 degrees) of images that can be remotely viewed.11-01-2012
20120277913Vision System for Robotic Attacher - In certain embodiments, a system includes a controller operable to access a first image generated by a first camera. The controller determines a reference point from at least one main feature of a dairy livestock included in the first image. The controller is further operable to access a second image generated by a second camera. The second image includes at least a portion of an udder of the dairy livestock. The controller determines a location of a teat of the dairy livestock based on the second image.11-01-2012
20100114374Apparatus and method for extracting feature information of object and apparatus and method for creating feature map - Technology for creating a feature map for localizing a mobile robot and extracting feature information of surroundings is provided. According to one aspect, feature information including a reflection function is extracted from information acquired using a 3D distance sensor and used as a basis for creating a feature map. Thus, a feature map that is less sensitive to change in the surrounding environment can be created, and a success rate of feature matching can be increased.05-06-2010
20100070079MOBILE VIDEOCONFERENCING ROBOT SYSTEM WITH NETWORK ADAPTIVE DRIVING - A remote control station that controls a robot through a network. The remote control station transmits a robot control command that includes information to move the robot. The remote control station monitors at least one network parameter and scales the robot control command as a function of the network parameter. For example, the remote control station can monitor network latency and scale the robot control command to slow down the robot with an increase in the latency of the network. Such an approach can reduce the amount of overshoot or overcorrection by a user driving the robot.03-18-2010
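The latency-based command scaling described above could, for example, take the shape of the sketch below, which slows the robot linearly between a nominal and a maximum round-trip latency. The piecewise-linear law, the constants, and the function name are assumptions for illustration only.

```python
def scale_drive_command(speed_mps: float, latency_ms: float,
                        nominal_ms: float = 100.0, max_ms: float = 1000.0) -> float:
    """Scale a commanded speed down as measured network latency grows.

    At or below nominal_ms the command is unchanged; at or above max_ms the
    robot is commanded to stop; in between the speed falls off linearly.
    """
    if latency_ms <= nominal_ms:
        return speed_mps
    if latency_ms >= max_ms:
        return 0.0
    factor = 1.0 - (latency_ms - nominal_ms) / (max_ms - nominal_ms)
    return speed_mps * factor


# Example: a 0.5 m/s command at 550 ms latency is scaled to 0.25 m/s.
print(scale_drive_command(0.5, 550.0))
```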
20090018699WORK POSITIONING DEVICE - Regarding predetermined positioning criteria (M01-15-2009
20090055023Telepresence robot with a printer - A remote controlled robot system that includes a robot and a remote controlled station. The robot includes a camera and a printer coupled to a mobile platform. The remote control station may display one or more graphical user interfaces with data fields. The graphical user interfaces allow a user to enter information into the data fields. The information is then transmitted to the robot and printed by the robot printer. The information may include a medical prescription and the name of the patient. Providing a robot printer allows the user to directly provide a medical prescription while remotely observing and interacting with the patient.02-26-2009
20080215184Method for searching target object and following motion thereof through stereo vision processing and home intelligent service robot using the same - A home intelligent service robot for recognizing a user and following the motion of a user and a method thereof are provided. The home intelligent service robot includes a driver, a vision processor, and a robot controller. The driver moves the intelligent service robot according to an input moving instruction. The vision processor captures images through at least two cameras in response to a capturing instruction for following a target object, minimizes the information amount of the captured images, and discriminates objects in the image into the target object and obstacles. When instruction information is collected from outside, the robot controller provides the vision processor with the capturing instruction for following the target object in the direction from which the instruction information was collected, and controls the intelligent service robot to move so as to follow the target object while avoiding obstacles, based on the discrimination information from the vision processor.09-04-2008
20130158709ROBOT CONTROL DURING AN E-STOP EVENT - A system for a work cell having a carrier that moves a product along an assembly line includes an assembly robot, sensor, and controller. An arm of the robot moves on the platform adjacent to the carrier. The sensor measures a changing position of the carrier and encodes the changing position as a position signal. The controller receives the position signal and calculates a lag value of the robot with respect to the carrier using the position signal. The controller detects a requested e-stop of the carrier when the arm and product are in mutual contact, and selectively transmits a speed signal to the robot to cause a calibrated deceleration of the platform before executing the e-stop event. This occurs only when the calculated tracking position lag value is above a calibrated threshold. A method is also disclosed for using the above system in the work cell.06-20-2013
20110054691METHOD AND APPARATUS FOR BIRDS CONTROL USING MOBILE ROBOT - Provided is a method including receiving information on a surrounding situation detected by the mobile robot; detecting birds from the received surrounding situation information; allocating a bird control mission to the mobile robot by extracting a bird control pattern corresponding to the surrounding situation; and verifying the result of the allocated bird control mission performed by the mobile robot. By controlling the birds so as to prevent, in advance, the loss of life and the economic loss that may be caused when birds collide with airplanes at an airport, it is possible to improve the productivity and efficiency of bird repelling work at the airport, to provide a safer airplane operating model while saving the operating personnel costs otherwise needed to prevent bird collisions, and to support a new type of aviation maintenance business model.03-03-2011
20100292841ROBOT WITH 3D GRASPING CAPABILITY - A robotic harvester has a mobile platform. A programmable multi-axis robot arm is connected to the platform. The robot arm is mounted to a computer controller. A stereovision camera connected to the computer is mounted on the mobile platform. The camera views the area under the mobile platform and identifies objects in geometric coordinates. The robot arm is directed to the location of the object and a gripper on the robot arm grasps the object. The stem is separated from the object and the object is deposited on a sorting conveyor. The harvester is incrementally moved. A method of harvesting is disclosed.11-18-2010
20110137463SYSTEMS AND METHODS ASSOCIATED WITH HANDLING AN OBJECT WITH A GRIPPER - A system associated with handling an object with a gripper includes a sensor that is configured to measure spatially distributed data that represents the position of the object that is handled by the gripper. The system further includes a computing unit that is configured to determine the behavior of the object.06-09-2011
20100324737SHAPE DETECTION SYSTEM - A shape detection system includes a distance image sensor that detects an image of a plurality of detection objects and distances to the detection objects, the detection objects being randomly arranged in a container, a sensor controller that detects a position and an orientation of each of the detection objects in the container on the basis of the result of the detection performed by the distance image sensor and a preset algorithm, and a user controller that selects the algorithm to be used by the sensor controller and sets the algorithm for the sensor controller.12-23-2010
20100324736Robot cleaner, docking station, robot cleaner system including robot cleaner and docking station, and method of controlling robot cleaner - A robot cleaner system is described including a docking station to form a docking area within a predetermined angle range of a front side thereof, to form docking guide areas which do not overlap each other on the left and right sides of the docking area, and to transmit a docking guide signal such that the docking guide areas are distinguished as a first docking guide area and a second docking guide area according to an arrival distance of the docking guide signal. The robot cleaner system also includes a robot cleaner to move to the docking area along a boundary between the first docking guide area and the second docking guide area when the docking guide signal is sensed and to move along the docking area so as to perform docking when reaching the docking area.12-23-2010
20090210092Method for self-localization of robot based on object recognition and environment information around recognized object - A method for self-localization of a robot, the robot including a camera unit, a database storing a map around a robot traveling path, and a position arithmetic unit estimating the position of the robot, includes: acquiring an image around the robot, in the camera unit. Further, the method includes recognizing, in the position arithmetic unit, an individual object in the image acquired by the camera unit, to generate position values on a camera coordinate system of local feature points of the individual objects and local feature points of a surrounding environment including the individual objects; and estimating, in the position arithmetic unit, the position of the robot on the basis of the map and the position values on the camera coordinate system of local feature points of the individual objects and local feature points of a surrounding environment including the individual objects.08-20-2009
20110153082SENSOR SYSTEM FOR DETECTING THE SURFACE STRUCTURES OF SEVERAL PACKAGED ARTICLES - An exemplary embodiment of the invention relates to a sensor system for detecting the surface structures of several packaged articles. An exemplary system comprises at least one laser distance detector that functions according to a triangulation principle and that determines the distance between the laser distance detector and a surface structure of a packaged article. The laser distance detector has at least one analog output via which a distance-proportional analog signal can be emitted. The analog output of at least one laser distance detector is in communication with an evaluation unit via an amplifier circuit. The amplifier circuit encompasses at least one operational amplifier that has two inputs, and the analog signal of the laser distance detector is present at a first input of the at least one operational amplifier. A variable reference voltage is present at the other input of the at least one operational amplifier. The reference voltage may be obtained from the analog signal of the analog output, and this analog signal may be present at the other input of the operational amplifier via a low-pass filter. The output of the at least one operational amplifier may be connected to the evaluation unit, as a result of which the amplifier circuit is configured in such a way that abrupt changes in the analog signal bring about a change in the output signal of the at least one operational amplifier. More gradual changes in the analog signal do not bring about a substantial change in the output signal of the at least one operational amplifier. The evaluation unit may evaluate the output signal of the at least one operational amplifier.06-23-2011
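A software analog of the amplifier circuit described above (the reference is a low-pass-filtered copy of the distance signal, so only abrupt jumps register) is sketched below; the filter constant, the threshold, and the names are illustrative assumptions, not values from the patent.

```python
def detect_edges(samples, alpha=0.05, threshold=2.0):
    """Return indices where the signal jumps abruptly away from its slowly
    varying reference (a single-pole low-pass filter of the signal itself).

    Gradual drift tracks the reference and is ignored; step changes exceed
    the threshold before the reference catches up.
    """
    if not samples:
        return []
    reference = samples[0]
    edges = []
    for i, x in enumerate(samples):
        if abs(x - reference) > threshold:
            edges.append(i)
        reference += alpha * (x - reference)  # low-pass update of the reference
    return edges


# Example: a step from 10 to 20 at index 5 is flagged; slow drift is not.
print(detect_edges([10, 10, 10.1, 10.2, 10.2, 20, 20, 20.1]))
```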
20100191376NETWORK ARCHITECTURE FOR REMOTE ROBOT WITH INTERCHANGEABLE TOOLS - Systems, methods and devices for the remote control of a robot which incorporates interchangeable tool heads. Although applicable to many different industries, the core structure of the system includes a robot with a tool head interface for mechanically, electrically and operatively interconnecting a plurality of interchangeable tool heads to perform various work functions. The robot and tool head may include several levels of digital feedback (local, remote and wide area) depending on the application. The systems include a single umbilical cord to send power, air, and communications signals between the robot and a remote computer. Additionally, all communication (including video) is preferably sent in a digital format. Finally, a GUI running on the remote computer automatically queries and identifies all of the various devices on the network and automatically configures its user options to parallel the installed devices. Systems according to the preferred embodiments find particular application in the pipeline arts. For example, interchangeable tool heads may be designed to facilitate inspection, debris clearing, cleaning, relining, lateral cutting after relining, mapping, and various other common pipeline-related tasks.07-29-2010
20100191375DOCUMENTATION THROUGH A REMOTE PRESENCE ROBOT - A robotic system that is used in a tele-presence session. For example, the system can be used by medical personnel to examine, diagnose and prescribe medical treatment in the session. The system includes a robot that has a camera and is controlled by a remote station. The system further includes a storage device that stores session content data regarding the session. The data may include a video/audio taping of the session by the robot. The session content data may also include time stamps that allow a user to determine the times that events occurred during the session. The session content data may be stored on a server that is accessible by multiple users. Billing information may be automatically generated using the session content data.07-29-2010
20110218674REMOTE PRESENCE SYSTEM INCLUDING A CART THAT SUPPORTS A ROBOT FACE AND AN OVERHEAD CAMERA - A tele-presence system that includes a cart. The cart includes a robot face that has a robot monitor, a robot camera, a robot speaker, a robot microphone, and an overhead camera. The system also includes a remote station that is coupled to the robot face and the overhead camera. The remote station includes a station monitor, a station camera, a station speaker and a station microphone. The remote station can display video images captured by the robot camera and/or overhead camera. By way of example, the cart can be used in an operating room, wherein the overhead camera can be placed in a sterile field and the robot face can be used in a non-sterile field. The user at the remote station can conduct a teleconference through the robot face and also obtain a view of a medical procedure through the overhead camera.09-08-2011
20100332033CONTROL OF MEDICAL ROBOTIC SYSTEM MANIPULATOR ABOUT KINEMATIC SINGULARITIES - A medical robotic system includes an entry guide with articulatable instruments extending out of its distal end, an entry guide manipulator providing controllable four degrees-of-freedom movement of the entry guide relative to a remote center, and a controller configured to manage operation of the entry guide manipulator in response to operator manipulation of one or more input devices. As the entry guide manipulator approaches a yaw/roll singularity, the controller modifies its operation to allow continued movement of the entry guide manipulator without commanding excessive joint velocities while maintaining proper orientation of the entry guide.12-30-2010
20080300723Teaching position correcting device - A teaching position correcting device which can easily correct, with high precision, teaching positions after shifting at least one of a robot and an object worked by the robot. Calibration is carried out using a vision sensor (i.e., CCD camera) that is mounted on a work tool. The vision sensor measures three-dimensional positions of at least three reference marks not aligned in a straight line on the object. The vision sensor is optionally detached from the work tool, and at least one of the robot and the object is shifted. After the shifting, calibration (this can be omitted when the vision sensor is not detached) and measuring of three-dimensional positions of the reference marks are carried out again. A change in a relative positional relationship between the robot and the object is obtained using the result of measuring three-dimensional positions of the reference marks before and after the shifting respectively. To compensate for this change, the teaching position data that is valid before the shifting is corrected. The robot can have a measuring robot mechanical unit having a vision sensor, and a separate working robot mechanical unit that works the object. In this case, positions of the working robot mechanical unit before and after the shifting, respectively, are also measured.12-04-2008
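The correction step described above (measuring at least three non-collinear reference marks before and after the shift and compensating the stored teaching positions) can be illustrated with a standard rigid-transform fit. The SVD-based (Kabsch) estimate below is a common way to compute such a transform and is offered as a sketch under that assumption, not as the patented procedure.

```python
import numpy as np

def fit_rigid_transform(marks_before, marks_after):
    """Fit R, t such that marks_after ~= R @ marks_before + t.

    marks_before, marks_after: N x 3 arrays of the same reference marks
    measured before and after the shift (N >= 3, not collinear).
    """
    P = np.asarray(marks_before, dtype=float)
    Q = np.asarray(marks_after, dtype=float)
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                      # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    t = cQ - R @ cP
    return R, t

def correct_teaching_points(points, R, t):
    """Apply the fitted transform to stored teaching positions (N x 3)."""
    return (np.asarray(points, dtype=float) @ R.T) + t
```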
20120209433SOCIAL ROBOT - Social robot formed by: an artificial vision system composed of webcam cameras; a voice recognition system formed by three microphones arranged in a triangular configuration; an expression system composed of an LED matrix, formed by a plurality of LEDs and a status LED, and eyelids formed by half-moons connected to gearwheels which engage with respective servomotors via transmission wheels; a speech synthesis system composed of loudspeakers; a system for detecting obstacles which is formed by ultrasound sensors; and a movement system formed by two driving wheels.08-16-2012
20120209431ROBOTIC BASED HEALTH CARE SYSTEM - A robotic system that can be used to treat a patient. The robotic system includes a mobile robot that has a camera. The mobile robot is controlled by a remote station that has a monitor. A physician can use the remote station to move the mobile robot into view of a patient. An image of the patient is transmitted from the robot camera to the remote station monitor. Medical personnel at the robot site can enter patient information into the system through a user interface. The patient information can be stored in a server. The physician can access the information from the remote station. The remote station may provide graphical user interfaces that display the patient information and provide both a medical tool and a patient management plan.08-16-2012
20100017035ASSEMBLY OF A MILKING ROBOT WITH A MILKING ROBOT FEEDING PLACE, AND A DEVICE FOR GRIPPING AND DISPLACING MATERIAL - The invention provides a device for gripping and displacing material, provided with a gripper, a controller for the gripper, and a sensor, connected to the controller, for forming an image of an observation area. The sensor comprises a source of modulated electromagnetic radiation, a receiver device for radiation reflected by an object comprising a matrix of receivers, an optical device for imaging the reflected radiation on the receiver device, and a sensor image processor that determines, for each receiver, a phase difference between the emitted and reflected electromagnetic radiation in order to calculate a distance from the receiver to the object. A device equipped with such a sensor is capable of functioning in a very reliable, safe and multifunctional manner, because it is capable of processing spatial images during operation. The invention also provides an assembly of the device and a feeding place, in particular of a milking robot and a milking robot feeding place.01-21-2010
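The phase-difference ranging principle mentioned above can be summarized by the usual time-of-flight relation d = c * delta_phi / (4 * pi * f_mod) for modulation frequency f_mod, valid within one unambiguous range. The sketch below simply evaluates that relation; the constants in the example are chosen for illustration.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def phase_to_distance(delta_phi_rad: float, f_mod_hz: float) -> float:
    """Distance corresponding to a measured phase shift of the modulated
    light, valid within one unambiguous range (d < c / (2 * f_mod))."""
    return C * delta_phi_rad / (4.0 * math.pi * f_mod_hz)


# Example: a 20 MHz modulation and a phase shift of pi/2 give about 1.87 m.
print(phase_to_distance(math.pi / 2.0, 20e6))
```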
20120209432HYBRID CONTROL DEVICE - A brain-based device (BBD) for moving in a real-world environment has sensors that provide data about the environment, actuators to move the BBD, and a hybrid controller which includes a neural controller having a simulated nervous system being a model of selected areas of the human brain and a non-neural controller based on a computational algorithmic network. The neural controller and non-neural controller interact with one another to control movement of the BBD.08-16-2012
20120209430POSITION DETECTION DEVICE FOR ROBOT, ROBOTIC SYSTEM, AND POSITION DETECTION METHOD FOR ROBOT - A position detection device for a horizontal articulated robot includes a camera for imaging the robot or a work as an imaging object, a control section for calculating a location of the imaging object from an image, an acquisition section (I/O) for obtaining the drive amounts of first and second electric motors of the robot, and a storage section for storing the calculated location of the imaging object and the drive amounts so as to correspond to each other. A common trigger signal for detecting the location of the imaging object is input to the camera and the I/O. The camera starts to image the imaging object in response to the input of the trigger signal. The I/O starts to obtain the drive amounts in response to the input of the trigger signal.08-16-2012
20120065780ROBOT - A robot includes a gripping section having a pair of finger sections and a main body section to which the pair of finger sections are attached, one end of each finger section being rotatably connected around a first rotating shaft disposed at a position separate from the main body section, the gripping section being adapted to open and close the pair of finger sections by swinging their other ends, on a plane parallel to a mounting surface on which an object is mounted and centered on the first rotating shaft, to thereby grip the object; a moving device adapted to relatively move the object and the gripping section; and a control device adapted to control the moving device to move the gripping section relatively toward the object and grip the object with the gripping section at at least three contact points.03-15-2012
20120016522PROCESS AND MACHINE FOR IDENTIFICATION AND WORKING OF DEFECTS ON USED TYRES - An automatic process for the identification and working of defects (01-19-2012
20120059517OBJECT GRIPPING SYSTEM, OBJECT GRIPPING METHOD, STORAGE MEDIUM AND ROBOT SYSTEM - A system comprises: a measurement unit adapted to measure a position/orientation of at least one target object based on an image obtained by capturing the at least one target object; a selection unit adapted to select at least one grippable target object based on the position/orientation; a determination unit adapted to determine, as an object to be gripped, a grippable target object in a state with a highest priority from the at least one grippable target object based on priorities set in advance for states including gripping positions/directions; a gripping unit adapted to grip the object to be gripped in the state with the highest priority; and a changing unit adapted to change the state of the gripped object to a state in which the gripped object is assembled to another object.03-08-2012
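The priority-based selection rule described above might be sketched as follows: among the grippable candidates, pick the one whose measured state carries the highest pre-assigned priority. The data layout (object/state pairs and a priority table keyed by state) is an assumption for illustration, not the patented implementation.

```python
def choose_object_to_grip(grippable, priority_of_state):
    """Pick the grippable candidate whose state has the highest priority.

    grippable: list of (object_id, state) tuples judged grippable.
    priority_of_state: dict mapping a state (e.g. a gripping position or
    approach direction label) to an integer priority; smaller means higher.
    Returns None when nothing is grippable.
    """
    if not grippable:
        return None
    return min(grippable,
               key=lambda item: priority_of_state.get(item[1], float("inf")))


# Example: "top_grip" outranks "side_grip", so object B is chosen.
priorities = {"top_grip": 0, "side_grip": 1}
print(choose_object_to_grip([("A", "side_grip"), ("B", "top_grip")], priorities))
```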
20120158180OBJECT GRIPPING APPARATUS, METHOD OF CONTROLLING THE SAME AND STORAGE MEDIUM - An object gripping apparatus includes an image capturing unit for capturing an image of a region including a plurality of works, an obtaining unit for obtaining distance information of the region, a measurement unit for measuring three-dimensional positions/orientations of a plurality of gripping-candidate works out of the plurality of works based on the image and distance information, thereby generating three-dimensional position/orientation information, a selection unit for selecting a gripping-target work based on the three-dimensional position/orientation information, a gripping unit for gripping the gripping-target work, and an updating unit for updating the three-dimensional position/orientation information by measuring three-dimensional positions/orientations of the gripping-candidate works at a time interval during gripping of the gripping-target work. When the gripping ends, the next gripping-target work is selected based on the updated three-dimensional position/orientation information of the gripping-candidate works.06-21-2012
20120158179ROBOT CONTROL APPARATUS - According to an embodiment, a target trajectory that takes into account the hardware constraints of a robot is generated, based on results obtained by calculating, temporally interpolating, and estimating image feature amounts from a captured image.06-21-2012
20120065779ROBOT - A robot includes a gripping section adapted to grip an object by opening and closing a pair of finger sections, a moving device adapted to relatively move the object and the gripping section, and a control device adapted to control the moving device to move the gripping section relatively toward the object and dispose the pair of finger sections in the periphery of the object, and then control the gripping section to open and close the pair of finger sections in a plane parallel to a mounting surface on which the object is mounted, pinch the object between the pair of finger sections from a lateral side of the object, and grip the object with the gripping section at at least three contact points.03-15-2012
20120209429ROBOT APPARATUS, POSITION DETECTING DEVICE, POSITION DETECTING PROGRAM, AND POSITION DETECTING METHOD - A robot apparatus includes: an image pickup device; a goal-image storing unit that stores goal image data of a state in which the target is arranged, according to a sensitivity represented by the amount of change of a pixel value when a target aligned with a goal position on an image at the pixel level is displaced by a sub-pixel-level displacement amount; and a target detecting unit that calculates a coincidence evaluation value of the target by comparing image data including the target with the goal image data stored by the goal-image storing unit, and detects positional deviation of the target with respect to the goal position on the basis of the coincidence evaluation value.08-16-2012
20120072024TELEROBOTIC SYSTEM WITH DUAL APPLICATION SCREEN PRESENTATION - A robot system that includes a robot and a remote station. The remote station may be a personal computer coupled to the robot through a broadband network. A user at the remote station may receive both video and audio from a camera and a microphone of the robot, respectively. The remote station may include a visual display that displays both a first screen field and a second screen field. The first screen field may display a video image provided by a robot camera. The second screen field may display information such as patient records. The information from the second screen field may be moved to the first screen field and also transmitted to the robot for display by a robot monitor. The user at the remote station may annotate the information displayed by the robot monitor to provide a more active video-conferencing experience.03-22-2012
20120072023Human-Robot Interface Apparatuses and Methods of Controlling Robots - A method of controlling a robot using a human-robot interface apparatus in two-way wireless communication with the robot includes displaying on a display interface a two-dimensional image, an object recognition support tool library, and an action support tool library. The method further includes receiving a selected object image representing a target object, comparing the selected object image with a plurality of registered object shape patterns, and automatically recognizing a registered object shape pattern associated with the target object if the target object is registered with the human-robot interface. The registered object shape pattern may be displayed on the display interface, and a selected object manipulation pattern selected from the action support tool library may be received. Control signals may be transmitted to the robot from the human-robot interface. Embodiments may also include human-robot apparatuses (HRI) programmed to remotely control a robot.03-22-2012
20120109376CLEANER AND CONTROLLING METHOD OF THE SAME - Disclosed are a robot cleaner and a method for controlling the same. The robot cleaner may prevent repeated executions of a cleaning operation by recognizing its position through an absolute position recognition unit when the cleaning operation is resumed after being forcibly stopped for arbitrary reasons. In addition, the robot cleaner may correct a position recognition error of a relative position recognition unit by using an image detection unit, and may effectively perform a cleaning operation based on the similarity between an image detected by the image detection unit and an image of the cleaning region. This may improve system efficiency and stability, and enhance the user's convenience.05-03-2012
20110106313BRIDGE INSPECTION ROBOT CAPABLE OF CLIMBING OBSTACLE - Provided is a bridge inspection robot which is capable of climbing over an obstacle, the bridge inspection robot including: a climbing-over portion (05-05-2011
20110106312System and Method For Multiple View Machine Vision Target Location - A machine vision system for controlling the alignment of an arm in a robotic handling system. The machine vision system includes an optical imager aligned to simultaneously capture an image that contains a view of the side of an object, such as a test tube, along with a view of the top of the object provided by a mirror appropriately positioned on the robotic arm. The machine vision system further includes a microcontroller or similar device for interpreting both portions of the image. For example, the microcontroller may be programmed to determine the location of the object in the reflected portion of the image and transpose that information into the location of the object relative to the robotic arm. The microcontroller may also be programmed to decode information positioned on the object by interpreting visual information contained in the other portion of the captured image.05-05-2011
20120232697ROBOT CLEANER AND CONTROLLING METHOD THEREOF - Disclosed are a robot cleaner capable of performing a cleaning operation by selecting a cleaning algorithm suitable for the peripheral circumstances based on an analysis result of captured image information, and a controlling method thereof. The robot cleaner comprises an image sensor unit configured to capture image information when an operation instructing command is received, and a controller configured to analyze the image information captured by the image sensor unit, and configured to control a cleaning operation based on a first cleaning algorithm selected from a plurality of pre-stored cleaning algorithms based on a result of the analysis.09-13-2012
20120221145MASTER INPUT DEVICE AND MASTER-SLAVE MANIPULATOR - A master input device operates a slave manipulator which includes joints corresponding to a plurality of degrees of freedom. The device includes an operating unit and detection units of two or more systems. The operating unit is capable of being changed in position and orientation by an operator's operation. The operating unit provides command values of the position and orientation of the slave manipulator as its own position and orientation change. The detection units individually detect different physical quantities related to the operating unit in order to detect the position and orientation of the operating unit.08-30-2012
20120165986ROBOTIC PICKING OF PARTS FROM A PARTS HOLDING BIN - A robot system (06-28-2012
20120165984MOBILE ROBOT APPARATUS, DOOR CONTROL APPARATUS, AND DOOR OPENING AND CLOSING METHOD THEREFOR - A mobile robot apparatus includes a video recognition unit for recognizing a position of an opening button mounted around a door through video analysis after acquiring peripheral video information. Further, the mobile robot apparatus includes a mobile controller for performing an operation on the opening button at the position recognized by the video recognition unit to generate an opening selection signal, thereby allowing a door control apparatus to open the door according to the generated opening selection signal.06-28-2012
20120215358ROBOTIC ARM SYSTEM - A robotic arm for use with a robotic system and methods for making and using the same are described. The arm can have multiple joints and can have one or more articulating end effectors. The arm and end effectors can have safety releases to prevent over-rotation. The arm can have individual cooling.08-23-2012
20110184558Robot And Method For Controlling A Robot - The invention relates to a robot (R, 07-28-2011
20100179691Robotic Platform - The present invention is a robotic mobile platform vehicle that can be thrown into hostile or hazardous environments to gather information and transmit it to a remotely located control station, and a system comprising the robotic mobile platform. The system of the invention is adapted to provide its operator with significant information without exposing the operator directly to actual or potential danger. One of the key features of the invention is that at least four imaging assemblies are mounted on the robotic platform and that the system has the processing ability to stitch the views taken by the four imaging devices together into an omni-directional image, allowing simultaneous viewing of a 360 degree field of view surrounding the mobile platform. Another feature is that the system comprises a touch screen GUI and the robotic mobile platform is equipped with processing means and appropriate software. This combination enables the user to steer the robotic platform simply by touching, in one of the displayed images, an object that he wants to investigate. The robotic platform can then either point its sensors towards that object or, if so instructed, compute the direction to the object and travel to it without any further input from the user.07-15-2010
20100174409Robot slip detection apparatus and method - A technique of detecting a slip of a robot using a particle filter and feature information of a ceiling image is disclosed. A first position of the robot is computed using a plurality of particles, a second position of the robot is computed using the feature information of the ceiling image, and whether a slip has occurred is determined based on a distance between the first position and the second position.07-08-2010
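The slip test described above reduces to comparing two pose estimates, one from the particle filter and one from the ceiling-image features, and flagging a slip when they diverge; a minimal sketch follows, with the distance threshold chosen purely for illustration.

```python
import math

def slip_detected(particle_pose, ceiling_pose, threshold_m=0.15) -> bool:
    """Return True when the particle-filter position and the ceiling-feature
    position disagree by more than the threshold (both are (x, y) in metres)."""
    dx = particle_pose[0] - ceiling_pose[0]
    dy = particle_pose[1] - ceiling_pose[1]
    return math.hypot(dx, dy) > threshold_m


# Example: a 0.3 m disagreement exceeds the 0.15 m threshold, so a slip is flagged.
print(slip_detected((1.0, 2.0), (1.3, 2.0)))
```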
20100049368ROBOT - An exemplary robot includes an information collecting module, a controlling system and a driving module. The information collecting module comprises a voice identifying device, a detecting device and a motion sensing device. The information collecting module is configured for identifying the identities of robot users, detecting distances between the robot and objects located around it, and sensing motion states of the robot. The controlling system is configured for generating a controlling signal and sending the controlling signal to the driving module. The driving module is configured for receiving the controlling signal, driving the robot to move, and adjusting the movement of the robot based on the controlling signal.02-25-2010
20100049367METHOD OF CONTROLLING ROBOT FOR BRIDGE INSPECTION - The present invention relates to a method of controlling a robot for bridge inspection. In the present invention, whether a defect image is being received from a robot device is determined. As a result of the determination, when the defect image is being received, a current location of the robot device is stored. Whether a predetermined period of time has been elapsed after the storage of the current location is determined. When the predetermined period of time has elapsed, a control command for moving the robot device to a prestored location is output. Whether a defect image at a same location as the prestored location is being received is determined. When the defect image at the same location is being received, a defect image at a previous time is compared with a defect image at a current time. A result of the comparison is displayed.02-25-2010
20100298978MANIPULATOR WITH CAMERA - Provided is a manipulator with at least one camera capable of observing an end effector from a direction suitable for work. A rotating portion rotatable coaxially with the end effector is provided on a link adjacent to the link located at the manipulator tip end. At least one camera for recognizing a work piece as an object is arranged on the rotating portion through a camera platform. An actuator for controlling the rotation angle of the rotating portion is driven according to the rotation angle of the link located at the manipulator tip end, so that the camera is arranged in a direction perpendicular to the plane in which the end effector can move when the end effector performs a gripping operation. In an assembly operation, the rotating portion is rotated such that the camera is arranged in a direction parallel to the plane in which the end effector can move.11-25-2010
20100298977MOBILE ROBOT AND PATH PLANNING METHOD THEREOF FOR MANIPULATING TARGET OBJECTS - A mobile robot and a path planning method are provided for the mobile robot to manipulate the target objects in a space, wherein the space consists of a periphery area and a central area. With the present method, an initial position is defined and the mobile robot is controlled to move within the periphery area from the initial position. Next, the latest image is captured when the mobile robot moves, and a manipulating order is arranged according to the distances estimated between the mobile robot and each of the target objects in the image. The mobile robot is controlled to move and perform a manipulating action on each of the target objects in the image according to the manipulating order. The steps of obtaining the image, planning the manipulating order, and controlling the mobile robot to perform the manipulating action are repeated until the mobile robot returns to the initial position.11-25-2010
20090105882Medical Tele-Robotic System - A robotic system that includes a remote controlled robot. The robot may include a camera, a monitor and a holonomic platform all attached to a robot housing. The robot may be controlled by a remote control station that also has a camera and a monitor. The remote control station may be linked to a base station that is wirelessly coupled to the robot. The cameras and monitors allow a care giver at the remote location to monitor and care for a patient through the robot. The holonomic platform allows the robot to move about a home or facility to locate and/or follow a patient.04-23-2009
20100010672Docking system for a tele-presence robot - A remote controlled robot system that includes a mobile robot with a robot camera and a battery plug module, and a remote control station that transmits commands to control the mobile robot. The system also includes a battery charging module that mates with the mobile robot battery plug module, and an alignment system that aligns the battery plug module with the battery charging module. The battery modules may also be aligned with the aid of video images of the battery charging module provided to the remote station by a camera located within the battery plug module.01-14-2010
20100268385MOVING ROBOT AND OPERATING METHOD FOR SAME - There are provided a moving robot and a method of operating the same. A bottom surface is photographed to sense a moving distance and a moving direction based on the input image data. The amount of light radiated to photograph the bottom surface is sensed to feedback-control the light emission level of a light source unit. The light source unit is controlled when errors occur in sensing the image data. As a result, the sensing ratio of the photographed image is improved, so that the accuracy of the calculated position of the moving robot is improved.10-21-2010
20120226382ROBOT-POSITION DETECTING DEVICE AND ROBOT SYSTEM - A robot-position detecting device includes: a position-data acquiring unit that acquires position data indicating actual positions of a robot; a position-data input unit that receives the position data output from the position-data acquiring unit; and a position calculating unit that calculates a computational position of the robot through linear interpolation using first and second position data input to the position-data input unit at different times.09-06-2012
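The linear-interpolation step described above can be illustrated with a short sketch that estimates the robot's computational position at a query time from two timestamped samples; the function name and the data layout are assumptions for illustration.

```python
def interpolate_position(t0, p0, t1, p1, t_query):
    """Linearly interpolate between two position samples.

    p0 and p1 are (x, y) positions acquired at times t0 and t1 (t0 < t1);
    t_query is the time at which the computational position is wanted.
    """
    if t1 == t0:
        return p0
    a = (t_query - t0) / (t1 - t0)
    return (p0[0] + a * (p1[0] - p0[0]),
            p0[1] + a * (p1[1] - p0[1]))


# Example: halfway between the two samples the robot is estimated at (1.5, 0.5).
print(interpolate_position(0.0, (1.0, 0.0), 1.0, (2.0, 1.0), 0.5))
```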
20120259465CLEANING SYSTEM - A cleaning system including a first virtual wall, a second virtual wall and a cleaning robot is disclosed. The first virtual wall includes a first specific pattern. When light is emitted onto the first specific pattern, a first specific reflected light is generated. The second virtual wall includes a second specific pattern. When the light is emitted onto the second specific pattern, a second specific reflected light is generated. The cleaning robot, based on the first and the second specific reflected lights, obtains and records the positions of the first and the second virtual walls. The cleaning robot defines a first virtual line according to the recorded positions. A traveling path of the cleaning robot is limited by the first virtual line.10-11-2012
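One way to read "limited by the first virtual line" is that planned moves crossing the segment between the two recorded wall positions are rejected. The sketch below is that interpretation only, with hypothetical names, using a standard segment-intersection test.

```python
def _ccw(a, b, c):
    # Sign of the cross product: > 0 if a -> b -> c turns counter-clockwise.
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def crosses_virtual_line(start, end, wall1, wall2):
    """True if a straight move from start to end crosses the virtual line
    defined between the recorded positions of the two virtual walls.
    All points are (x, y) coordinates; a strict segment-intersection test."""
    d1 = _ccw(wall1, wall2, start)
    d2 = _ccw(wall1, wall2, end)
    d3 = _ccw(start, end, wall1)
    d4 = _ccw(start, end, wall2)
    return (d1 * d2 < 0) and (d3 * d4 < 0)

# A move that would pass between the two virtual walls is flagged
print(crosses_virtual_line((0, 0), (2, 2), (0, 2), (2, 0)))  # True
```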
20120265345ROBOT SYSTEM AND PROCESSED OBJECT MANUFACTURING METHOD - In this robot system, a control portion is configured to control a robot to grasp an object to be grasped by a grasping portion, and control a first imaging portion to examine the object to be grasped while driving a robot arm to change a posture of the object to be grasped multiple times.10-18-2012
20120265343AUTONOMOUS COVERAGE ROBOT SENSING - An autonomous coverage robot detection system includes an emitter configured to emit a directed beam, a detector configured to detect the directed beam and a controller configured to direct the robot in response to a signal detected by the detector. In some examples, the detection system detects a stasis condition of the robot. In some examples, the detection system detects a wall and can follow the wall in response to the detected signal.10-18-2012
20120265346AUTONOMOUS COVERAGE ROBOT SENSING - An autonomous coverage robot detection system includes an emitter configured to emit a directed beam, a detector configured to detect the directed beam and a controller configured to direct the robot in response to a signal detected by the detector. In some examples, the detection system detects a stasis condition of the robot. In some examples, the detection system detects a wall and can follow the wall in response to the detected signal.10-18-2012
20120323366MANIPULATOR WITH CAMERA - Provided is a manipulator with at least one camera capable of observing an end effector from a direction suitable for work. A rotating portion rotatable coaxially with the end effector is provided to a link adjacent to a link located at a manipulator tip end. At least one camera for recognizing a work piece as an object is arranged on the rotating portion through a camera platform. An actuator for controlling a rotation angle of the rotating portion is driven according to a rotation angle of the link located at the manipulator tip end, and thus the camera is arranged in a direction perpendicular to a plane in which the end effector can move when the end effector performs grip work. In assembly work, the rotating portion is rotated such that the camera is arranged in a direction parallel to the plane in which the end effector can move.12-20-2012
20120323365DOCKING PROCESS FOR RECHARGING AN AUTONOMOUS MOBILE DEVICE - Described herein are technologies pertaining to autonomously docking a mobile robot at a docking station for purposes of recharging batteries of the mobile robot. The mobile robot uses vision-based navigation and a known map of the environment to navigate toward the docking station. Once sufficiently proximate to the docking station, the mobile robot captures infrared images of the docking station, and granularly aligns itself with the docking station based upon the captured infrared images of the docking station. As the robot continues to drive towards the docking station, the robot monitors infrared sensors for infrared beams emitted from the docking station. If the infrared sensors receive the infrared beams, the robot continues to drive forward until the robot successfully docks with the docking station.12-20-2012
20110218675ROBOT SYSTEM COMPRISING VISUAL SENSOR - A robot system (09-08-2011
20100234998MOVING ROBOT AND OPERATING METHOD THEREOF - A moving robot and its operation method are disclosed. The moving robot includes a moving body/object sensing unit that senses a movement of a human body within a certain distance, a traveling unit that controls a traveling speed and direction, and a controller that outputs a control signal for controlling the traveling speed according to pre-set data to the traveling unit. While the moving robot performs a cleaning operation and moves between locations, when a movement of a human body is sensed by the moving body/object sensing unit, the traveling speed is reduced to allow the user to easily control the external operation, and efficiency can be increased by utilizing the moving body/object sensing unit for operations in different modes.09-16-2010
20120095597ROBOT CLEANER - Provided is a robot cleaner, and more particularly a robot cleaner which detects whether a foreign material storage unit is separated. The robot cleaner includes a main body including a suction motor, a foreign material storage unit separably disposed within the main body, the foreign material storage unit storing foreign materials contained in sucked air, a foreign material cover for selectively shielding one side of the foreign material storage unit, and a detection unit for detecting whether the foreign material cover is opened.04-19-2012
20100168918OBTAINING FORCE INFORMATION IN A MINIMALLY INVASIVE SURGICAL PROCEDURE - Methods of and a system for providing force information for a robotic surgical system. The method includes storing first kinematic position information and first actual position information for a first position of an end effector; moving the end effector via the robotic surgical system from the first position to a second position; storing second kinematic position information and second actual position information for the second position; and providing force information regarding force applied to the end effector at the second position utilizing the first actual position information, the second actual position information, the first kinematic position information, and the second kinematic position information. Visual force feedback is also provided via superimposing an estimated position of an end effector without force over an image of the actual position of the end effector. Similarly, tissue elasticity visual displays may be shown.07-01-2010
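A very loose sketch of the idea of deriving force from the stored position pairs: if the gap between the kinematic (commanded) and actual positions grows between the first and second poses, that extra deflection can be scaled by an assumed stiffness into a force estimate. The linear spring model, the stiffness value, and all names below are assumptions for illustration, not the calibration described in the application.

```python
import numpy as np

def estimate_force(kin1, act1, kin2, act2, stiffness=500.0):
    """Estimate force at the second pose from kinematic vs. actual positions.

    kin1, act1: kinematic and actual positions at the first (reference) pose
    kin2, act2: kinematic and actual positions at the second (loaded) pose
    stiffness:  assumed linear stiffness in N/m (illustrative value)
    Positions are 3-vectors; the change in deflection is scaled to a force.
    """
    deflection1 = np.asarray(kin1) - np.asarray(act1)   # baseline error, no load
    deflection2 = np.asarray(kin2) - np.asarray(act2)   # error under load
    return stiffness * (deflection2 - deflection1)

# 2 mm extra deflection along z at 500 N/m -> ~1 N estimated force
print(estimate_force([0, 0, 0], [0, 0, 0], [0, 0, 10e-3], [0, 0, 8e-3]))
```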
20130013112Constrained Resolved Acceleration Control - A system, method, and computer program product for controlling an articulated system are described. The system estimates kinematics of body segments of the articulated system and constructs a weighted pseudo-inverse matrix to enforce kinematic constraints as well as achieve dynamic consistency based on the estimated kinematics. The system converts task descriptors to joint commands using the weighted pseudo-inverse matrix and controls the articulated system at both the velocity level and the acceleration level and enforces kinematic constraints using the joint commands.01-10-2013
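The weighted pseudo-inverse mentioned above is commonly written as J# = W^-1 J^T (J W^-1 J^T)^-1. The sketch below shows that generic form mapping a task-space velocity to joint commands; it stands in for, rather than reproduces, the constrained, dynamically consistent formulation of the application, and all names are illustrative.

```python
import numpy as np

def weighted_pseudo_inverse(J, W):
    """Weighted (right) pseudo-inverse J# = W^-1 J^T (J W^-1 J^T)^-1.

    J: m x n task Jacobian, W: n x n symmetric positive-definite weight matrix.
    Larger weights penalize use of the corresponding joints.
    """
    W_inv = np.linalg.inv(W)
    return W_inv @ J.T @ np.linalg.inv(J @ W_inv @ J.T)

J = np.array([[1.0, 0.0, 0.5],
              [0.0, 1.0, 0.5]])      # 2 task dimensions, 3 joints
W = np.diag([1.0, 1.0, 10.0])        # discourage use of the third joint
xdot = np.array([0.1, 0.0])          # desired task-space velocity
qdot = weighted_pseudo_inverse(J, W) @ xdot   # resulting joint velocity command
print(qdot)
```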
20100161129SYSTEM AND METHOD FOR ADJUSTING AN IMAGE CAPTURING DEVICE ATTRIBUTE USING AN UNUSED DEGREE-OF-FREEDOM OF A MASTER CONTROL DEVICE - An image capturing device is robotically positioned and oriented in response to operator manipulation of a master control device. An unused degree-of-freedom of the master control device is used to adjust an attribute such as focusing of the image capturing device relative to a continually updated set-point. A deadband is provided to avoid inadvertent adjusting of the image capturing device attribute and haptic feedback is provided back to the master control device so that the operator is notified when adjusting of the attribute is initiated.06-24-2010
20130024025Autonomous Robot and A Positioning Method Thereof - An autonomous robot and a positioning method thereof are disclosed. The autonomous robot includes an environment information detection device, a map construction module, a setting module, a path planning module, and a driving module. The environment information detection device is for detecting environment information about an environment where the autonomous robot is situated. An environment map is constructed based on the environment information detected by the environment information detection device. The setting module is used for setting a working boundary on the environment map. The path planning module is for planning a moving path in a working zone and is electrically connected to the setting module. The driving module for driving the autonomous robot to move along the moving path is electrically connected to the path planning module.01-24-2013
20080221734Categorical Color Perception System - The present invention relates to a categorical color perception system which automatically judges a categorical color and aims to judge a categorical color name correctly under various ambient lights. A test color measured in an experiment is inputted to an input layer portion corresponding to test color components 09-11-2008
20130123987ROBOTIC SYSTEM, ROBOT CONTROL METHOD AND ROBOT CONTROL PROGRAM - A robotic system includes: a detection unit that detects at least one of a voice, light and an image of a content outputted by a content output device; a decision unit that assesses information detected by the detection unit on the basis of reference data so as to assess the content outputted by the content output device; and a control unit that controls a behavior or a state of the robotic system on the basis of the assessment made by the decision unit.05-16-2013
20130123986METHOD AND APPARATUS FOR TISSUE TRANSFER - A handheld tool is disclosed which may be used to transfer a plurality of plant tissue explants from a first container to a second container. The handheld tool may include a disposable tip member which couples the plurality of plant tissue explants through use of negative pressure. An automated system which transfers a plurality of plant tissue explants from a first container to a second container is also disclosed. The automated system may include a first presentment system which moves the first container to a region, a second presentment system which moves the second container to the region, and a robot system that transfers the plurality of plant tissue explants from the first container to the second container.05-16-2013
20130123985TRANSPARENT OBJECT DETECTION SYSTEM AND TRANSPARENT FLAT PLATE DETECTION SYSTEM - A disclosed transparent body detection system includes an image acquisition unit acquiring a vertical polarization image and a horizontal polarization image by acquiring an image of a first region, the image including a transparent body having characteristics in which a polarization direction of transmission light changes; a placing table on which the transparent body is to be placed; a polarization filter disposed opposite to the image acquisition unit across the placing table and at a position including a second region, an image of the second region including at least the transparent body in the first region and being acquired; and an image processing apparatus detecting the transparent body based on distribution of vertical/lateral polarization degree of a vertical/lateral polarization degree image based on the vertical polarization image and the horizontal polarization image.05-16-2013
20130131866Protocol for a Remotely Controlled Videoconferencing Robot - A robotic system that includes a robot and a remote station. The remote station can generate control commands that are transmitted to the robot through a broadband network. The control commands can be interpreted by the robot to induce action such as robot movement or focusing a robot camera. The robot can generate reporting commands that are transmitted to the remote station through the broadband network. The reporting commands can provide positional feedback or system reports on the robot.05-23-2013
20130144438OPTICAL POSITION DETECTING DEVICE, ROBOT HAND, AND ROBOT ARM - An optical position detecting device includes a plurality of light source sections which emits detection light, a light detection section which receives the detection light reflected by a target object located in an emitting space of the detection light, a light source driving section which turns on some light source sections among the plurality of light source sections in a first period and turns on, in a second period, light source sections different from the light source sections turned on in the first period, and a position detecting section which detects the position of the target object on the basis of a light-receiving result of the light detection section in the first period and the second period. Each of the light source sections includes a plurality of light-emitting elements arrayed in a direction intersecting the direction of the optical axis of the detection light.06-06-2013
20080201017Medical tele-robotic system - A robotic system that includes a remote controlled robot. The robot may include a camera, a monitor and a holonomic platform all attached to a robot housing. The robot may be controlled by a remote control station that also has a camera and a monitor. The remote control station may be linked to a base station that is wirelessly coupled to the robot. The cameras and monitors allow a care giver at the remote location to monitor and care for a patient through the robot. The holonomic platform allows the robot to move about a home or facility to locate and/or follow a patient.08-21-2008
20080201016Robot and Method of Registering a Robot - A robot has a controllable arm which carries an instrument or tool. The robot is provided with a camera to obtain an image of a work piece, including images of markers and an indicator present on the work piece. The robot processes the images to determine the position of the markers within a spatial frame of reference. The robot is controlled to effect predetermined movements of the instrument or tool relative to the work piece. The processor is further configured to determine the position of the indicator and to respond to movement of the indicator within the spatial frame of reference of the robot when the markers are concealed to determine a new position of the indicator and thus the new position of the work piece. Subsequently, the robot is controlled, relative to the new position of the work piece, to effect predetermined movements relative to the work piece.08-21-2008
20100286827ROBOT WITH VISION-BASED 3D SHAPE RECOGNITION - The invention relates to a method for processing video signals from a video sensor, in order to extract 3d shape information about objects represented in the video signals, the method comprising the following steps: 11-11-2010
20100312393ROBOT WITH CAMERA - The shutter timing of a camera mounted on a robot arm is optimized to improve assembling efficiency. An image of a point on a finger and a position of an alignment mark is taken, and a position of the camera is measured by image processing. When the position of the camera is at or below a preset position threshold value, and when a velocity of the camera measured by a velocity sensor is at or below a preset velocity threshold value, a shutter is released. Furthermore, a logical AND with another condition, that an acceleration of the camera measured as a differential value of the velocity sensor is at or below a preset acceleration threshold value, is taken for releasing the shutter. Thus, blur of an image for searching for a workpiece is prevented, a position error is reduced, and working efficiency is improved by releasing the shutter earlier.12-09-2010
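The release condition reads as a straightforward logical AND over three thresholds. The sketch below follows that reading; the threshold values and names are placeholders.

```python
def should_release_shutter(position_error, velocity, acceleration,
                           pos_threshold=1e-3, vel_threshold=5e-3, acc_threshold=1e-2):
    """Release the shutter only when the measured camera position error,
    velocity, and acceleration (a differential of the velocity signal) are
    all at or below their preset thresholds (logical AND).
    Threshold values are illustrative placeholders."""
    return (abs(position_error) <= pos_threshold and
            abs(velocity) <= vel_threshold and
            abs(acceleration) <= acc_threshold)

print(should_release_shutter(0.0005, 0.004, 0.008))  # True -> take the image
print(should_release_shutter(0.0005, 0.020, 0.008))  # False -> camera still moving
```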
20130158710TAKING OUT DEVICE HAVING FUNCTION FOR CORRECTING POSTURE OF AN ARTICLE - A taking out device capable of correcting a posture of an article to be taken out and taking out the article, while considering interference between a robot hand and a container for containing the article. Since the article is inclined to the left side, the hand approaches and contacts the article from the left side. Then, the hand pushes the article to the right side while claws of the hand engage a hole portion of the article, in order to correct the posture of the article such that the positional relationship between the article and the hand represents a reference position/posture. In this way, the hand is positioned at a second position/posture in which the posture of the article relative to the claws allows the article to be taken out.06-20-2013
20130158711ACOUSTIC PROXIMITY SENSING - An acoustic pretouch sensor or proximity sensor (06-20-2013
20110282492METHOD OF CONTROLLING A ROBOTIC TOOL - A method of controlling a robot system includes the steps of providing a tool supported by a moveable mechanism of the robot system, providing a workpiece supported by a holder, generating an image of the workpiece, extracting data from the image, the data relating to a feature of the workpiece, generating a continuous three-dimensional path along the workpiece using the data extracted from the image, and moving the tool along the path.11-17-2011
20120290134ESTIMATION OF A POSITION AND ORIENTATION OF A FRAME USED IN CONTROLLING MOVEMENT OF A TOOL - A robotic system includes a camera having an image frame whose position and orientation relative to a fixed frame is determinable through one or more image frame transforms, a tool disposed within a field of view of the camera and having a tool frame whose position and orientation relative to the fixed frame is determinable through one or more tool frame transforms, and at least one processor programmed to identify pose indicating points of the tool from one or more camera captured images, determine an estimated transform for an unknown one of the image and tool frame transforms using the identified pose indicating points and known ones of the image and tool frame transforms, update a master-to-tool transform using the estimated and known ones of the image and tool frame transforms, and command movement of the tool in response to movement of a master using the updated master-to-tool transform.11-15-2012
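The core estimation step can be illustrated with homogeneous transforms: when the chain from the fixed frame to the tool closes through one unknown link, that link is recovered by composing the known transforms with the camera-derived tool pose. The sketch assumes a single unknown on the right of the chain and uses hypothetical names; the application also relies on identified pose-indicating points and may place the unknown elsewhere in the chain.

```python
import numpy as np

def estimate_unknown_transform(T_fixed_tool, T_known):
    """Recover the unknown transform T_unknown in the chain
    T_fixed_tool = T_known @ T_unknown by closing the chain:
    T_unknown = inv(T_known) @ T_fixed_tool.
    All transforms are 4x4 homogeneous matrices (illustrative names)."""
    return np.linalg.inv(T_known) @ T_fixed_tool

# Known image-frame transform and a ground-truth unknown, for checking
T_known = np.eye(4); T_known[0, 3] = 0.5
T_true  = np.eye(4); T_true[1, 3] = 0.2
T_fixed_tool = T_known @ T_true            # pose the camera images would yield
print(np.allclose(estimate_unknown_transform(T_fixed_tool, T_known), T_true))  # True
```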
20120004775ROBOT APPARATUS AND CONTROL METHOD THEREFOR - A robot apparatus includes a robot mechanism having a plurality of joints, and actuators that drive joint axes of the robot mechanism. The robot apparatus includes a robot controller that controls the driving of the actuators based on a cost function that is a function of torque reference inputs for the actuators.01-05-2012
20120022691AUTOMATED POSITIONING OF AN ORGANIC POLARIZED OBJECT - A method, system and apparatus to position an organic polarized object at a predetermined orientation and a predetermined location are provided. In an embodiment, an image of the organic polarized object is captured through an image capture device. The image of the organic polarized object is converted to an image data set. This image data set is further converted to a dimension data set. A first location and a first orientation of the organic polarized object are determined through a processor. A pressure is applied to secure the organic polarized object. The organic polarized object is secured through a robotic end effector and may be moved to a predetermined location and a predetermined orientation. The organic polarized object is adjusted to the predetermined orientation. The organic polarized object is positioned at the predetermined location. The predetermined location and predetermined orientation may be selected by a user.01-26-2012
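As one illustrative way to obtain a first location and orientation from the image data, the principal axis of the segmented object's pixels (an image-moment / PCA computation) gives a centroid and an in-plane angle. This specific technique and the names below are assumptions, not details stated in the application.

```python
import numpy as np

def object_location_and_orientation(mask):
    """Estimate the in-plane location and orientation of a segmented object
    from the principal axis of its pixel coordinates (image-moment / PCA
    sketch). mask: 2D boolean array where True marks object pixels.
    Returns (centroid_row, centroid_col, angle_radians); the axis direction
    is sign-ambiguous, so the angle may differ by pi."""
    ys, xs = np.nonzero(mask)
    cy, cx = ys.mean(), xs.mean()
    cov = np.cov(np.vstack([xs - cx, ys - cy]))
    eigvals, eigvecs = np.linalg.eigh(cov)
    major = eigvecs[:, np.argmax(eigvals)]          # principal axis (x, y)
    return cy, cx, float(np.arctan2(major[1], major[0]))

mask = np.zeros((5, 9), dtype=bool)
mask[2, 1:8] = True                                 # a horizontal bar
print(object_location_and_orientation(mask))        # angle ~ 0 (or pi) radians
```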
20080312771Robots with Occlusion Avoidance Functionality - A method for controlling a robot having at least one visual sensor. A target for a motion of the robot is defined. A motion control signal adapted for the robot reaching the target is calculated. A collision avoidance control signal based on the closest points of segments of the robot and a virtual object between the visual sensing means and the target is calculated. The motion control signal and the collision avoidance control signal are weighted and combined. The weight of the motion control signal is higher when a calculated collision risk is lower. The motion of the robot is controlled according to the combined signal so that no segment of the robot enters the space defined by the virtual object.12-18-2008
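The weighting rule lends itself to a simple blend in which the motion command dominates at low collision risk and the avoidance command dominates at high risk. The linear blend and the [0, 1] risk scale below are illustrative assumptions, not the application's exact weighting.

```python
import numpy as np

def blend_controls(u_motion, u_avoid, collision_risk):
    """Weighted combination of a target-reaching control signal and a
    collision-avoidance control signal. The weight on the motion command is
    higher when the calculated collision risk is lower.
    collision_risk is assumed to lie in [0, 1]."""
    w = 1.0 - np.clip(collision_risk, 0.0, 1.0)
    return w * np.asarray(u_motion) + (1.0 - w) * np.asarray(u_avoid)

print(blend_controls([1.0, 0.0], [-0.2, 0.5], collision_risk=0.1))  # mostly motion
print(blend_controls([1.0, 0.0], [-0.2, 0.5], collision_risk=0.9))  # mostly avoidance
```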
20130197696ROBOT APPARATUS, ASSEMBLING METHOD, AND RECORDING MEDIUM - A robot apparatus includes a gripping unit configured to grip a first component, a force sensor configured to detect, as detection values, a force and a moment acting on the gripping unit, a storing unit having stored therein contact states of the first component and a second component and transition information in association with each other, a selecting unit configured to discriminate, on the basis of the detection values, a contact state of the first component and the second component and select, on the basis of a result of the discrimination, the transition information stored in the storing unit, and a control unit configured to control the gripping unit on the basis of the transition information selected by the selecting unit.08-01-2013
20080281470AUTONOMOUS COVERAGE ROBOT SENSING - An autonomous coverage robot detection system includes an emitter configured to emit a directed beam, a detector configured to detect the directed beam and a controller configured to direct the robot in response to a signal detected by the detector. In some examples, the detection system detects a stasis condition of the robot. In some examples, the detection system detects a wall and can follow the wall in response to the detected signal.11-13-2008
20130204436APPARATUS FOR CONTROLLING ROBOT AND CONTROL METHOD THEREOF - An apparatus for controlling a robot capable of controlling the motion of the arm of the robot, and a control method thereof, the apparatus including an image obtaining unit configured to obtain a three-dimensional image of a user, a driving unit configured to drive an arm of the robot that is composed of a plurality of segments, and a control unit configured to generate a user model that corresponds to a motion of the joint of the user based on the three-dimensional image, to generate a target model having a length of the segment that varies based on the user model, and to allow the arm of the robot to be driven based on the target model.08-08-2013
20130204437AGRICULTURAL ROBOT SYSTEM AND METHOD - An agricultural robot system and method of harvesting, pruning, culling, weeding, measuring and managing of agricultural crops. The system uses autonomous and semi-autonomous robot(s) comprising machine vision using cameras that identify and locate the fruit on each tree, points on a vine to prune, etc., and may also be utilized in measuring agricultural parameters or aiding in managing agricultural resources. The cameras may be coupled with an arm or other implement to allow views from inside the plant when performing the desired agricultural function. A robot moves through a field first to “map” the plant locations, number and size of fruit and approximate positions of fruit, or to map the cordons and canes of grape vines. Once the map is complete, a robot or server can create an action plan that a robot may implement. An action plan may comprise operations and data specifying the agricultural function to perform.08-08-2013
20130211594Proxy Robots and Remote Environment Simulator for Their Human Handlers - A system for controlling a human-controlled proxy robot surrogate is presented. The system includes a plurality of motion capture sensors for monitoring and capturing all movements of a human handler, including each change in joint angle, body posture or position; the motion capture sensors are similar in operation to sensors utilized in motion picture animation, suitably modified to track critical handler movements in near real time. Also presented is a plurality of controls attached to the proxy robot surrogate that relays the monitored and captured movements of the human handler as “follow me” data to the proxy robot surrogate, the plurality of controls being configured such that the proxy robot surrogate emulates the movements of the human handler.08-15-2013

Patent applications in class Vision sensor (e.g., camera, photocell)