20,711 research outputs found
A quasi-current representation for information needs inspired by Two-State Vector Formalism
Recently, a number of quantum theory (QT)-based information retrieval (IR) models have been proposed for modeling the session search task, in which users issue queries continuously to describe their evolving information needs (IN). However, the standard formalism of QT cannot provide a complete description of users' current IN, in the sense that it does not take "future" information into consideration. Therefore, to seek a more proper and complete representation of users' IN, we construct a representation of the quasi-current IN inspired by the emerging Two-State Vector Formalism (TSVF). Drawing on the completeness of TSVF, a "two-state vector" derived from the "future" (the current query) and the "history" (the previous query) is employed to describe users' quasi-current IN in a more complete way. Extensive experiments conducted on the session tracks of TREC 2013 & 2014 show that our model outperforms a series of compared IR models.
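For context, the TSVF expression the abstract alludes to can be sketched as follows (the session-search reading is this sketch's own gloss, not a formula from the abstract):

```latex
% Two-state vector (Aharonov et al.): between two measurements the system
% is described jointly by the pre-selected and post-selected states:
\langle \Phi \rvert \;\; \lvert \Psi \rangle
% In the session-search analogy, |\Psi> plays the role of the "history"
% (the previous query) and <Phi| the "future" (the current query), so the
% quasi-current IN is represented by the pair taken together.
```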
A Maximum Entropy Procedure to Solve Likelihood Equations
In this article, we provide initial findings regarding the problem of solving likelihood equations by means of a maximum entropy (ME) approach. Unlike standard procedures that require setting the score function of the maximum likelihood problem to zero, we propose an alternative strategy in which the score is instead used as an external informative constraint on the maximization of the convex Shannon entropy function. The problem involves reparameterizing the score parameters as expected values of discrete probability distributions whose probabilities need to be estimated. This leads to a simpler situation where parameters are searched in a smaller (hyper-)simplex space. We assessed our proposal by means of empirical case studies and a simulation study, the latter involving the most critical case of logistic regression under data separation. The results suggest that the maximum entropy reformulation of the score problem solves the likelihood equation problem. Similarly, when maximum likelihood estimation is difficult, as in the case of logistic regression under separation, the maximum entropy proposal achieved results (numerically) comparable to those obtained by Firth's bias-corrected approach. Overall, these first findings show that a maximum entropy solution can be considered an alternative technique for solving the likelihood equations.
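The procedure the abstract describes can be sketched numerically. The following is a minimal illustration under assumptions of our own (toy data, an arbitrary support grid, a single coefficient), not the authors' implementation: the logistic coefficient is reparameterized as the expected value of a discrete distribution on a fixed support, and Shannon entropy is maximized subject to the likelihood (score) equation holding as an equality constraint.

```python
import numpy as np
from scipy.optimize import minimize

# Toy one-predictor logistic data (hypothetical example).
rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-1.5 * x)))  # true beta = 1.5

# Reparameterize beta as E_p[support], with p on the probability simplex.
support = np.linspace(-5.0, 5.0, 21)  # assumed support grid

def neg_entropy(p):
    q = np.clip(p, 1e-12, 1.0)
    return np.sum(q * np.log(q))       # minimizing -H(p) maximizes entropy

def score(p):
    beta = support @ p                 # beta as an expected value
    mu = 1.0 / (1.0 + np.exp(-beta * x))
    return (x @ (y - mu)) / x.size     # normalized logistic score

constraints = [
    {"type": "eq", "fun": score},                    # likelihood equation
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},  # simplex constraint
]
p0 = np.full(support.size, 1.0 / support.size)
res = minimize(neg_entropy, p0, method="SLSQP",
               bounds=[(0.0, 1.0)] * support.size, constraints=constraints)
beta_me = support @ res.x  # ME solution of the likelihood equation
```

With well-behaved data this simply recovers the usual maximum likelihood solution; the appeal claimed in the abstract is that the same simplex-constrained search remains well posed when the plain score equations are hard to solve, as under separation.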
Symmetric measures via moments
Algebraic tools in statistics have recently been receiving special attention
and a number of interactions between algebraic geometry and computational
statistics have been rapidly developing. This paper presents another such
connection, namely, one between probabilistic models invariant under a finite
group of (non-singular) linear transformations and polynomials invariant under
the same group. Two specific aspects of the connection are discussed:
generalization of the (uniqueness part of the multivariate) problem of moments
and log-linear, or toric, modeling by expansion of invariant terms. A
distribution of minuscule subimages extracted from a large database of natural
images is analyzed to illustrate the above concepts.

Comment: Published at http://dx.doi.org/10.3150/07-BEJ6144 in Bernoulli (http://isi.cbs.nl/bernoulli/) by the International Statistical Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm).
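A toy numerical illustration of the moment side of this connection (a hypothetical sketch of our own, not the paper's construction): if a distribution is invariant under the finite group {x -> x, x -> -x}, then E[X^k] = E[(-X)^k] for every k, so all odd moments must vanish while even moments are unconstrained.

```python
import numpy as np

# The standard normal is invariant under the sign-flip group, so its
# odd moments are forced to zero by invariance; even moments are not.
rng = np.random.default_rng(1)
x = rng.standard_normal(200_000)

odd_moment = np.mean(x**3)    # should be near 0 by group invariance
even_moment = np.mean(x**2)   # near 1 for the standard normal
```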