This paper presents a novel approach to visual localisation that uses a camera on the robot coupled wirelessly to an external RGB-D sensor. Unlike systems where an external sensor observes the robot directly, our approach merely assumes that the robot's camera and the external sensor share a portion of their field of view. Experiments were performed using a Microsoft Kinect as the external sensor and a small mobile robot. The robot carries a smartphone, which acts as its camera, sensor processor, control platform and wireless link. Computational effort is distributed between the smartphone and a host PC connected to the Kinect. Experimental results show that the approach is accurate and robust in dynamic environments with substantial object movement and occlusions. This work won the best student paper prize at ACRA 2011.
[ACRA 2011 paper]