Kentaro Toyama and Eric Horvitz
Redmond, Washington 98052-6399
We describe a head-tracking system that harnesses Bayesian modality fusion, a technique for integrating the analyses of multiple visual tracking algorithms within a probabilistic framework. At the heart of the approach is a Bayesian network model that includes random variables serving as context-sensitive indicators of the reliability of the different tracking algorithms. Parameters of the Bayesian model are learned from data in an offline training phase using ground-truth data from a Polhemus tracking device. In our implementation for a real-time head-tracking task, algorithms centering on color, motion, and background-subtraction modalities are fused into a single estimate of head position in an image. Results demonstrate the effectiveness of Bayesian modality fusion in environments undergoing a variety of visual perturbations.
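The fusion step described in the abstract can be sketched in simplified form: each modality (color, motion, background subtraction) produces a head-position estimate, and the estimates are combined according to per-modality reliabilities. In the paper those reliabilities are inferred by a learned Bayesian network from context-sensitive indicator variables; in this minimal sketch they are supplied directly as weights, and the estimate values and function name are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fuse_estimates(estimates, reliabilities):
    """Fuse per-modality head-position estimates into one position.

    estimates     : list of (x, y) positions, one per tracking modality
    reliabilities : nonnegative weights; in the paper these would come
                    from Bayesian-network inference over context
                    indicators (simplifying assumption here)
    """
    w = np.asarray(reliabilities, dtype=float)
    w = w / w.sum()                       # normalize reliability weights
    x = np.asarray(estimates, dtype=float)
    return w @ x                          # reliability-weighted mean

# Hypothetical (x, y) estimates from color, motion, and background
# subtraction, with a higher weight on the color tracker.
color, motion, bgsub = (100.0, 80.0), (110.0, 84.0), (104.0, 82.0)
fused = fuse_estimates([color, motion, bgsub], [0.5, 0.2, 0.3])
```

A weighted mean like this is only the simplest possible combination rule; the point of the paper's probabilistic framework is that the weights adapt as imaging conditions change, so a modality degraded by, say, a lighting shift is automatically discounted.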
Keywords: Head tracking, probability, sensor fusion, reliability, vision algorithms.
In: Proceedings of ACCV '00, Fourth Asian Conference on Computer Vision, January 2000, Taipei, Taiwan.