Correspondence of line segments between two perspective images: comparison between Epipolar, Bayesian and Neural approaches
Abstract
In order to permit the localization and navigation of a mobile robot within an indoor environment, we have built a stereoscopic sensor and implemented all the algorithms needed to obtain the 3D coordinates of real objects from image data. The sensor uses two miniature cameras in a vertical arrangement. Processing the images yields line segments in both images. To match segments, we have to answer the question: "Are two segments, one from the top image and one from the bottom image, two views of the same part of the robot's real world?" To choose the best matching algorithm, we compare three approaches: epipolar, Bayesian and neural techniques. In the epipolar method, we use only geometrical features. Two images of a single scene or object are related by the epipolar geometry, which can be described by a 3×3 singular matrix called the fundamental matrix. It captures all the geometric information contained in the two images, and its determination is very important in many applications such as scene modeling and the navigation of a mobile robot. With the other two approaches, we associate with each segment a set of 16 parameters including geometrical, gray-level, textural and neighborhood features. To compare these approaches, we have built a database of over 2500 segments. The results of this work are presented in our paper. The results obtained with the two classification-based techniques are very good. In order to increase the quality of the segment matching, it is possible to find a combination rule for these two approaches. However, even though 22% of the segments are not recognized with the epipolar method, we think it is worth introducing it, with the aim of obtaining better results in some difficult configurations, for example when the gray levels and textures in the scene are not very different, and also in order to initialize the classification algorithm.
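As a minimal illustration of the epipolar test mentioned above (not the authors' implementation), the sketch below assumes a known 3×3 fundamental matrix F mapping points of the top image to epipolar lines in the bottom image, and accepts a (top, bottom) segment pair only if each endpoint of the bottom segment lies close to one of the epipolar lines induced by the top segment's endpoints. The endpoint pairing and the pixel tolerance are hypothetical choices.

```python
import numpy as np

def epipolar_line(F, x):
    """Epipolar line l' = F x in the bottom image for a homogeneous point x in the top image."""
    return F @ x

def point_line_distance(line, x):
    """Euclidean distance of homogeneous point x to the line (a, b, c) with ax + by + c = 0."""
    a, b, c = line
    x_h = x / x[2]
    return abs(a * x_h[0] + b * x_h[1] + c) / np.hypot(a, b)

def segments_may_match(F, seg_top, seg_bottom, tol=2.0):
    """Accept the pair if each endpoint of the bottom segment lies within `tol` pixels
    of the epipolar line of some endpoint of the top segment.
    seg_top, seg_bottom: 2x2 arrays of (x, y) endpoints."""
    top_h = np.hstack([seg_top, np.ones((2, 1))])      # homogeneous endpoints, top image
    bot_h = np.hstack([seg_bottom, np.ones((2, 1))])   # homogeneous endpoints, bottom image
    lines = [epipolar_line(F, p) for p in top_h]
    return all(min(point_line_distance(l, q) for l in lines) < tol for q in bot_h)
```

Such a purely geometric test uses no gray-level or textural information, which is why the abstract proposes it mainly for difficult configurations and for initializing the classification-based methods.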
KEY WORDS: .
Global Jnl Pure & Applied Sciences Vol.10(2) 2004: 335-341