Traditionally, vegetable and fruit production has relied on empirical and often ambiguous decisions made by human farmers. To reduce this uncertainty, smart farm robots have been widely studied in recent years. However, measuring growth information with robots remains challenging because the target plant looks very similar to the plants around it. We propose a smart farm robot that accurately measures the growth information of a target plant based on object detection, image fusion, and data augmentation with fused images. The robot uses an end-to-end, real-time, deep-learning-based object detector with state-of-the-art performance. To distinguish the target plant from surrounding plants with higher accuracy and robustness than existing methods, we exploit image fusion of RGB and depth images. In particular, data augmentation based on the fused RGB and depth information enables precise measurement of growth information even in densely planted smart farms. We propose and evaluate a real-time measurement system that obtains precise target-plant growth information for precision agriculture.
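The summary above does not spell out the exact fusion or augmentation scheme, so the following is only a minimal sketch of one common approach: early fusion by stacking a normalized depth channel onto the RGB image, with geometric augmentation applied identically to all four channels so the modalities stay aligned. The function names and the 4 m depth-normalization range are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fuse_rgbd(rgb, depth, max_depth_mm=4000.0):
    """Early fusion: normalize depth and stack it as a 4th channel onto RGB.

    rgb:   HxWx3 uint8 image
    depth: HxW   uint16 depth map in millimeters
    Returns an HxWx4 float32 array in [0, 1].
    """
    rgb_f = rgb.astype(np.float32) / 255.0
    depth_f = np.clip(depth.astype(np.float32) / max_depth_mm, 0.0, 1.0)
    return np.concatenate([rgb_f, depth_f[..., None]], axis=-1)

def augment_fused(fused, rng):
    """Apply the same geometric augmentation to all four channels so the
    RGB and depth information remains registered; jitter only the RGB part."""
    if rng.random() < 0.5:
        fused = fused[:, ::-1, :].copy()   # horizontal flip of RGB and depth together
    gain = rng.uniform(0.8, 1.2)
    fused[..., :3] = np.clip(fused[..., :3] * gain, 0.0, 1.0)  # photometric jitter on RGB only
    return fused

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    rgb = rng.integers(0, 256, (480, 640, 3), dtype=np.uint8)
    depth = rng.integers(300, 4000, (480, 640), dtype=np.uint16)
    sample = augment_fused(fuse_rgbd(rgb, depth), rng)
    print(sample.shape)  # (480, 640, 4) -- ready for a 4-channel detector backbone
```

The resulting 4-channel tensors can be fed to an object detector whose first convolution accepts four input channels; any such detector choice is an assumption here, since the original text does not name one.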
AIDBOT (’20-’21)
Dr. KangGeon Kim of the Artificial Intelligence and Robotics Institute developed the “AIDBOT,” an artificial intelligence (AI)-based disinfection robot, using available KIST technologies. The AIDBOT can move around inside a building contaminated with pathogens dangerous to humans and disinfect it entirely on its own. Unlike other disinfection robots, it uses disinfectants and ultraviolet (UV) rays simultaneously and moves autonomously.
A key feature of the AIDBOT is that it disinfects by emitting UV rays from its trunk and spraying disinfectant from its upper section, without requiring humans to track and manually control it. One can simply monitor it from a control center and check that it is functioning properly. The robot uses AI to build a three-dimensional map of a building’s interior and disinfects the building on its own. It also intensively disinfects objects that humans frequently touch, such as doorknobs. Its depth camera and LiDAR sensor allow it to sense human movement and disinfect without bumping into obstacles. Because direct exposure to UV rays at close range is harmful to humans, the robot stops UV disinfection when it senses an approaching person.
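Purely as an illustration of the person-proximity safety behavior described above, the logic could look roughly like the following sketch; the class, function names, and the 2 m safety radius are hypothetical and are not taken from the AIDBOT software.

```python
from dataclasses import dataclass

SAFETY_RADIUS_M = 2.0   # assumed safety distance, not an AIDBOT specification

@dataclass
class UvLamp:
    on: bool = False

    def set(self, enabled: bool) -> None:
        # Switch the lamp only when the requested state actually changes.
        if enabled != self.on:
            self.on = enabled
            print("UV lamp", "ON" if enabled else "OFF (person nearby)")

def update_uv_interlock(lamp: UvLamp, person_distances_m: list[float]) -> None:
    """Pause UV emission whenever any detected person is inside the safety radius.

    person_distances_m: distances (in meters) to people detected by the
    depth camera / LiDAR perception stack.
    """
    person_too_close = any(d < SAFETY_RADIUS_M for d in person_distances_m)
    lamp.set(not person_too_close)

if __name__ == "__main__":
    lamp = UvLamp()
    update_uv_interlock(lamp, [])          # no one detected  -> UV on
    update_uv_interlock(lamp, [1.4, 5.0])  # person within 2 m -> UV off
    update_uv_interlock(lamp, [3.2])       # area clear again  -> UV back on
```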
As its name suggests, the AIDBOT is an AI-based robot that aids humans with disinfection tasks. Although a human might disinfect a space faster, the AIDBOT eliminates the need for human labor, reduces the risk of exposure to infection, and can operate continuously. It can perform disinfection tasks in place of humans in multipurpose facilities such as hospitals, public facilities, and schools, thereby protecting the health and livelihood of the public.
Mahru-Z and Mahru-M (’08-’11)
Developed with highly advanced technology, Mahru-Z can walk on its own, find target objects, and operate various household machines using its two hands and feet. Mahru-Z can also imitate a human’s full-body movements through a motion-capture system.
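As a purely illustrative sketch (none of this reflects Mahru-Z’s actual control software), full-body imitation from motion capture can be reduced to mapping captured human joint angles onto the robot’s joints while respecting the robot’s joint limits; the joint names and limit values below are made up for the example.

```python
import math

# Hypothetical joint limits in radians; the names and values are illustrative only.
JOINT_LIMITS = {
    "r_shoulder_pitch": (-2.0, 2.0),
    "r_elbow": (0.0, 2.4),
    "torso_yaw": (-1.0, 1.0),
}

def retarget(human_angles: dict[str, float]) -> dict[str, float]:
    """Map captured human joint angles onto robot joint commands,
    clamping each angle to the robot's joint limits."""
    robot_cmd = {}
    for joint, angle in human_angles.items():
        lo, hi = JOINT_LIMITS.get(joint, (-math.pi, math.pi))
        robot_cmd[joint] = min(max(angle, lo), hi)
    return robot_cmd

if __name__ == "__main__":
    captured = {"r_shoulder_pitch": 2.5, "r_elbow": 1.1, "torso_yaw": -0.2}
    print(retarget(captured))  # shoulder pitch is clamped to the 2.0 rad limit
```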
Different robots can be assigned to different tasks: for example, Mahru-Z can put fruit in a basket and place it on the dinner table, while Mahru-M, which is more mobile, can locate the owner and bring the fruit directly to them.
Mahru-Z is compelling because it helped open the market for working service robots that can live and interact with humans. Using its real-time remote-control technology, Mahru-Z can also operate under conditions that are too difficult or dangerous for humans.