ECOVACS ROBOTICS (SUZHOU) CO., LTD. Patent applications
Patent application number | Title | Published |
20150278889 | GUIDE ROBOT FOR SHOPPING GUIDING SYSTEM AND METHOD - An intelligent shopping guiding system comprising a guide robot and a workstation computer is disclosed. The guide robot is coupled to the workstation computer, preferably over a wireless link, and information between the guide robot and the workstation computer is transferred via wireless communication. As the front-end equipment of the shopping guiding system, the guide robot interacts with the customer and provides assistance and guidance to enhance the customer's shopping experience. As the back-end equipment of the shopping guiding system, the workstation computer builds, stores, and maintains a customized message associated with a customer's unique ID number. The unique ID number is associated with the customer's personal characteristics, such as biometric, physiological, or other information suitable for identifying the customer. | 10-01-2015 |
20150089768 | VERTICAL VACUUM CLEANER - The present invention discloses an upright vacuum cleaner which comprises a cleaner body (…) | 04-02-2015 |
20140202495 | GLASS WIPING DEVICE AND CONTROL METHOD THEREOF - A glass wiping device and a control method thereof are provided. The glass wiping device comprises a driver mechanism (…) | 07-24-2014 |
20130291334 | CYCLONE SEPARATION DEVICE AND CYCLONE VACUUM CLEANER MOUNTED WITH SAME - A cyclone separation device (…) | 11-07-2013 |
20130221908 | INTELLIGENT ROBOT SYSTEM AND DOCKING METHOD FOR CHARGING SAME - An intelligent robot system comprising an intelligent robot (…) | 08-29-2013 |
20120232696 | Autonomous Moving Floor-Treating Robot and Control Method Thereof for Edge-Following Floor-Treating - An autonomous moving floor-treating robot and a control method thereof for edge-following floor treatment are provided. The control method includes the following steps: the floor-treating robot collides with an obstacle and, after the collision, deflects away from the obstacle by a basic angle; after the deflection, it measures an initial signal strength value with a side-looking sensor, then moves on and treats the floor; a real-time signal strength value is acquired by the side-looking sensor after the robot runs for a predetermined time; the difference between the two signal strength values is computed and judged against a predetermined range: if it falls within the range, the robot keeps moving and treating the floor; if not, the robot is deflected by an adjusting angle and acquires the current real-time signal strength value; the difference between the current and the last real-time signal strength values is then judged against the predetermined range in the same way, and the deflection and comparison steps repeat until the difference is back in range. The method is unaffected by the material of the obstacle and can effectively treat the edge region around the obstacle. (An illustrative sketch of this control loop follows the table.) | 09-13-2012 |
20120103367 | CLEANING ROBOT, DIRT RECOGNITION DEVICE THEREOF AND CLEANING METHOD OF ROBOT - A cleaning robot, a dirt recognition device thereof, and a cleaning method of the robot are disclosed. The recognition device includes an image collecting module and an image processing module. The image collecting module collects image information of the surface to be treated by the cleaning robot and sends it to the image processing module. The image processing module divides the collected image information into N blocks, extracts and processes the image information of each block, and determines which of the N blocks corresponds to the dirtiest region of the surface to be treated. Through this solution, the cleaning robot can actively recognize dirt such as dust, so that it reaches the working area accurately and rapidly. (An illustrative block-scoring sketch follows the table.) | 05-03-2012 |
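
Application 20120232696 above describes its edge-following control loop only in prose. The Python sketch below restates those steps under stated assumptions: the basic angle, adjusting angle, tolerance range, sampling period, and the sensor/actuator functions (`side_sensor_strength`, `turn`, `move_and_treat`) are all hypothetical stand-ins, since the abstract specifies none of them, and the direction of each corrective turn is likewise an assumption.

```python
import random
import time

# Minimal sketch of the edge-following loop from application 20120232696.
# All constants and hardware functions below are assumed, not from the patent.
BASIC_ANGLE = 30.0    # degrees turned away from the obstacle after a collision (assumed)
ADJUST_ANGLE = 5.0    # adjusting angle for each correction step (assumed)
TOLERANCE = 0.1       # predetermined range for the signal-strength difference (assumed)
RUN_TIME = 0.5        # predetermined running time between samples, in seconds (assumed)

def side_sensor_strength():
    """Stand-in for the side-looking sensor; returns a signal strength value."""
    return random.uniform(0.4, 0.6)

def turn(angle_deg):
    """Stand-in actuator call; positive angles turn away from the obstacle."""

def move_and_treat(duration_s):
    """Stand-in actuator call; drive forward while treating the floor."""
    time.sleep(duration_s)

def follow_edge_after_collision(cycles=20):
    turn(BASIC_ANGLE)                     # deflect by the basic angle after the collision
    last = side_sensor_strength()         # initial signal strength after deflection
    for _ in range(cycles):               # bounded here only so the demo terminates
        move_and_treat(RUN_TIME)
        current = side_sensor_strength()  # real-time signal strength
        while abs(current - last) > TOLERANCE:
            # Out of range: deflect by the adjusting angle and re-sample.
            # Turning toward the obstacle when the signal weakens is an assumption;
            # the abstract only says the robot deflects and compares again.
            turn(-ADJUST_ANGLE if current < last else ADJUST_ANGLE)
            last = current                # compare against the last real-time value
            current = side_sensor_strength()
        last = current

if __name__ == "__main__":
    follow_edge_after_collision()
```

The one design point carried over directly from the abstract is that each correction compares the new reading against the last real-time value rather than the original reference, so the tracked set point drifts with the obstacle's contour.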
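
Application 20120103367 divides the camera image into N blocks and selects the dirtiest one. A minimal sketch of that block-scoring idea, assuming NumPy is available and using local intensity variance as a stand-in for whatever dirt feature the patent's image processing module actually extracts:

```python
import numpy as np  # assumed dependency; the abstract names no implementation

def dirtiest_block(image, n_rows=4, n_cols=4):
    """Split a grayscale image into N = n_rows * n_cols blocks and return
    the (row, col) index and score of the block judged dirtiest.

    The per-block score here is intensity variance, an illustrative
    assumption: high local texture is treated as likely dirt.
    """
    h, w = image.shape
    bh, bw = h // n_rows, w // n_cols
    best_idx, best_score = None, -1.0
    for r in range(n_rows):
        for c in range(n_cols):
            block = image[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw]
            score = float(block.var())  # dirt score: local variance (assumed feature)
            if score > best_score:
                best_idx, best_score = (r, c), score
    return best_idx, best_score

# Usage: feed a frame from the robot's camera; the robot would then drive
# to the floor region corresponding to the winning block.
frame = np.random.default_rng(0).random((240, 320))
print(dirtiest_block(frame))
```

Variance is only one plausible score; the abstract leaves the per-block feature extraction unspecified, so any texture or color statistic could take its place without changing the divide-score-select structure.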