Christian Colliander presents his Master's thesis
As digital cameras become cheaper and better, computers more powerful, and robots more abundant, the merging of these three technologies also becomes more common and more capable. The combination is often inspired by the human visual system and typically strives to give machines the same capabilities that humans already have, such as object identification, navigation, limb coordination, and event detection.
One such field that is particularly popular is that of SLAM, or Simultaneous Localization and Mapping, which has some pretty high-profile applications in self-driving cars and delivery drones.
This thesis suggests an online SLAM algorithm for a specific scenario: that of a robot with a downward-facing camera exploring a flat surface (e.g. a floor). The method is based on building homographies from robot odometry data, which are then used to rectify the images so that the tilt of the camera w.r.t. the floor is eliminated, thereby moving the problem from 3D to 2D. The 2D pose of the robot in the plane is estimated using registrations of SURF features, and then a bundle adjustment algorithm is used to consolidate the most recent measurements with the older ones in order to optimize the map. The algorithm is implemented and tested with an AR.Drone 2.0 quadcopter. The results are mixed, but hardware seems to be the limiting factor: the algorithm performs well and runs at 5–10 Hz on an i5 desktop computer, but the poor quality, high compression, and low resolution of the drone's bottom camera make the algorithm unstable, and this cannot be overcome even with several tiers of outlier filtering.
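The tilt-rectification step described above can be sketched in a few lines. The idea is that a pure camera rotation induces a homography H = K R Kᵀ⁻¹ on the image, so applying the inverse of the odometry-estimated tilt rotation undoes it. The intrinsic matrix and tilt angles below are illustrative assumptions, not values from the thesis:

```python
import numpy as np

def tilt_rectifying_homography(K, roll, pitch):
    """Build a homography H = K * R^T * K^-1 that undoes the camera's
    roll/pitch tilt w.r.t. the floor plane (angles in radians, e.g. from
    the robot's odometry/IMU)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])  # roll about x
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])  # pitch about y
    R = Ry @ Rx  # tilt of the camera relative to a fronto-parallel view
    # R.T is the inverse rotation, so warping with H removes the tilt
    return K @ R.T @ np.linalg.inv(K)

# Example pinhole intrinsics (assumed for illustration)
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])

H = tilt_rectifying_homography(K, roll=0.05, pitch=-0.03)
# With zero tilt the homography reduces to the identity
H0 = tilt_rectifying_homography(K, 0.0, 0.0)
```

In practice H would be handed to an image-warping routine (e.g. OpenCV's `cv2.warpPerspective`) so that feature matching and pose estimation can proceed purely in the 2D floor plane.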
Examiner: Anders Heyden
Supervisor: Magnus Oskarsson