We investigate a solution to the problem of multisensor perception and tracking by formulating it in the framework of Bayesian model selection. Humans robustly associate multi-sensory data as appropriate, but previous theoretical work has focused largely on purely integrative cases, leaving segregation unaccounted for and unexploited by machine perception systems. We illustrate a unifying Bayesian solution to multi-sensor perception and tracking that accounts for both integration and segregation by explicit probabilistic reasoning about data association in a temporal context. Unsupervised learning of such a model with EM is illustrated for a real-world audio-visual application.
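The core idea, treating integration and segregation as competing generative hypotheses and comparing them by their marginal likelihoods, can be sketched for a single pair of cues. This is an illustrative toy with assumed Gaussian noise parameters, not the paper's temporal, EM-learned model:

```python
import numpy as np

def gauss(x, mu, var):
    """Gaussian density N(x; mu, var)."""
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def posterior_common_cause(x_a, x_v, sig_a=1.0, sig_v=2.0, sig_p=10.0, prior_c1=0.5):
    """Posterior probability that an audio cue x_a and a visual cue x_v
    were generated by one common source (integration) rather than two
    independent sources (segregation).

    Marginal likelihoods are computed by numerical integration over the
    latent source position; sig_a, sig_v (sensor noise), sig_p (source
    prior width), and prior_c1 are all illustrative assumptions.
    """
    s = np.linspace(-50.0, 50.0, 2001)      # grid over latent source position
    ds = s[1] - s[0]
    prior_s = gauss(s, 0.0, sig_p ** 2)

    # C=1 (integrate): a single source s generates both cues.
    lik_c1 = np.sum(gauss(x_a, s, sig_a ** 2) * gauss(x_v, s, sig_v ** 2) * prior_s) * ds

    # C=2 (segregate): each cue has its own independent source.
    lik_a = np.sum(gauss(x_a, s, sig_a ** 2) * prior_s) * ds
    lik_v = np.sum(gauss(x_v, s, sig_v ** 2) * prior_s) * ds
    lik_c2 = lik_a * lik_v

    # Bayesian model selection: posterior over the association hypothesis.
    return prior_c1 * lik_c1 / (prior_c1 * lik_c1 + (1 - prior_c1) * lik_c2)
```

Nearby cues yield a high posterior for the common-cause model (fusion), while widely separated cues favor segregation; the paper's model extends this reasoning over time and learns the parameters with EM.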