Estimation of the number of signals in the presence of noise is an important
problem in several areas of statistical signal processing. A number of recent
works address the design of solutions to this problem that are optimal with
respect to various criteria. Each criterion generates a model order selection (MOS)
algorithm. However, the minimum error probability criterion has not received
significant attention, although errors in the estimation of the number of
signals might directly affect the performance of the signal processing system
as a whole. In this paper, we propose a new approach to the design of MOS
algorithms that is partially based on the minimum error probability criterion. We
also devote considerable attention to the performance and consistency analysis of the MOS
algorithms. In this study, an abridged error probability is used as a universal
performance measure of the MOS algorithms. We propose a theoretical framework
that makes it possible to obtain closed-form expressions for the abridged error
probabilities of a wide range of MOS algorithms. Moreover, a parametric
consistency analysis of the presented MOS algorithms is given. Building on these
results, we carry out a parametric optimization of the presented MOS
algorithms. Finally, we examine a quasilikelihood (QL) approach to the design
and analysis of the MOS algorithms. The proposed theoretical framework is used
to obtain the abridged error probabilities as functions of the unknown signal
parameter. These functions, in turn, allow us to determine the scope of
applicability of the QL approach.
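
To make the notion of a criterion-induced MOS algorithm concrete, the sketch below implements a classical eigenvalue-based example: the MDL criterion of Wax and Kailath for estimating the number of signals from a sample covariance matrix. This is not the minimum-error-probability approach proposed in the paper; it is a minimal, standard illustration of how a selection criterion yields a MOS algorithm, with the array size, snapshot count, and noise model chosen only for the example.

```python
import numpy as np

def mdl_num_signals(X):
    """Estimate the number of signals via the MDL criterion (Wax & Kailath).

    X: p x N array of sensor snapshots (rows = sensors, columns = samples).
    Returns the candidate model order k that minimizes the MDL cost.
    """
    p, N = X.shape
    R = X @ X.conj().T / N                      # sample covariance matrix
    eig = np.sort(np.linalg.eigvalsh(R))[::-1]  # eigenvalues, descending
    mdl = np.empty(p)
    for k in range(p):
        tail = eig[k:]                          # putative noise-subspace eigenvalues
        # log-likelihood term: log ratio of arithmetic to geometric mean
        L = N * (p - k) * (np.log(tail.mean()) - np.log(tail).mean())
        mdl[k] = L + 0.5 * k * (2 * p - k) * np.log(N)
    return int(np.argmin(mdl))

# Usage sketch: two Gaussian signals in white noise on a 6-sensor array.
rng = np.random.default_rng(0)
p, N, d = 6, 2000, 2
A = rng.standard_normal((p, d))                 # hypothetical mixing matrix
S = rng.standard_normal((d, N))                 # two signal waveforms
X = A @ S + 0.1 * rng.standard_normal((p, N))   # snapshots with weak noise
print(mdl_num_signals(X))
```

Errors of such an estimator (over- or underestimating the true order) are exactly the events whose probability the abridged-error-probability framework in the paper quantifies.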