496 research outputs found

    Quantum Continuum Mechanics Made Simple

    In this paper we further explore and develop the quantum continuum mechanics (CM) of Tao et al. [Phys. Rev. Lett. 103, 086401] with the aim of making it simpler to use in practice. Our simplifications relate to the non-interacting part of the CM equations, and primarily refer to practical implementations in which the ground-state stress tensor is approximated by its Kohn-Sham version. We use the simplified approach to directly prove the exactness of CM for one-electron systems via an orthonormal formulation. This proof sheds light on certain physical considerations contained in the CM theory and their implications for CM-based approximations. The one-electron proof then motivates an approximation to the CM (exact under certain conditions) expanded on the wavefunctions of the Kohn-Sham (KS) equations. Particular attention is paid to the relationships between transitions from occupied to unoccupied KS orbitals and their approximations under the CM. We also demonstrate the simplified CM semi-analytically on an example system.
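
    The abstract does not reproduce the CM equations themselves. Purely as a schematic orientation (the precise force-response operator and expansion basis below are assumptions for illustration, not taken from the paper), the linearized CM can be read as an eigenvalue problem for a displacement field that is expanded on occupied-to-unoccupied KS transitions:

```latex
% Schematic sketch only: \hat{K} stands for the CM force-response operator
% built from the ground-state stress tensor (here approximated by its
% Kohn-Sham version) plus Hartree and exchange-correlation contributions;
% u_{ia} are basis displacements attached to occupied (i) -> unoccupied (a)
% KS transitions.
\[
  m\, n_0(\mathbf{r})\, \omega^2\, \mathbf{u}(\mathbf{r})
  \;=\; \hat{K}\, \mathbf{u}(\mathbf{r}),
  \qquad
  \mathbf{u}(\mathbf{r}) \;\approx\; \sum_{i,a} c_{ia}\, \mathbf{u}_{ia}(\mathbf{r}).
\]
```

    In this schematic picture, approximating the ground-state stress tensor by its Kohn-Sham counterpart changes how \hat{K} is evaluated, not the structure of the eigenvalue problem.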

    Detecting stochastic dominance for poset-valued random variables as an example of linear programming on closure systems

    In this paper we develop a linear programming method for detecting stochastic dominance for random variables with values in a partially ordered set (poset), based on the upset characterization of stochastic dominance. The proposed detection procedure is based on a descriptively interpretable statistic, namely the maximal probability difference over upsets. We show how our method is related to the general task of maximizing a linear function on a closure system. Since closure systems are describable via their valid formal implications, we can use ingredients of formal concept analysis here. We also address the question of inference via resampling and via conservative bounds obtained from Vapnik-Chervonenkis theory, which additionally allows for an adequate pruning of the envisaged closure system and thereby for a regularization of the test statistic (at the price of less conceptual rigor). We illustrate the developed methods by applying them to a variety of data examples, concretely to multivariate inequality analysis, to item impact and differential item functioning in item response theory, and to the analysis of distributional differences in spatial statistics. The power of regularization is illustrated with a data example in the context of cognitive diagnosis models.
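
    As a minimal, purely illustrative sketch of the statistic named above (a brute-force enumeration of upsets on a made-up toy poset, rather than the paper's linear programming formulation over a closure system):

```python
from itertools import combinations

# Toy poset on {a, b, c, d} with Hasse relations a <= c, a <= d, b <= d.
elements = ["a", "b", "c", "d"]
leq = {("a", "c"), ("a", "d"), ("b", "d")}
leq |= {(x, x) for x in elements}  # reflexivity

def is_upset(subset):
    """A set U is an upset if x in U and x <= y imply y in U."""
    return all(y in subset for x in subset for (u, y) in leq if u == x)

def all_upsets(elements):
    """Enumerate all upward-closed subsets (feasible only for small posets)."""
    for r in range(len(elements) + 1):
        for combo in combinations(elements, r):
            s = set(combo)
            if is_upset(s):
                yield s

# Two (made-up) probability distributions over the poset elements.
p = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}
q = {"a": 0.1, "b": 0.2, "c": 0.3, "d": 0.4}

# Test statistic: maximal probability difference over upsets.  Here q
# stochastically dominates p (w.r.t. the partial order) iff this maximum <= 0.
stat = max(sum(p[x] for x in U) - sum(q[x] for x in U) for U in all_upsets(elements))
print(f"max over upsets U of p(U) - q(U): {stat:.2f}")
```

    For posets too large for explicit enumeration, the linear programming view over the associated closure system described in the abstract becomes the relevant computational route.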

    Concepts for Decision Making under Severe Uncertainty with Partial Ordinal and Partial Cardinal Preferences

    We introduce three different approaches for decision making under uncertainty when (I) there is only partial (both cardinally and ordinally scaled) information on an agent's preferences and (II) the uncertainty about the states of nature is described by a credal set (or some other imprecise probabilistic model). In particular, situation (I) is modeled by a pair of binary relations, one specifying the partial rank order of the alternatives and the other modeling partial information on the strength of preference. Our first approach relies on decision criteria that construct complete rankings of the available acts based on generalized expectation intervals. Subsequently, we introduce different concepts of global admissibility that construct partial orders between the available acts by comparing them all simultaneously. Finally, we define criteria that are induced by suitable binary relations on the set of acts and can therefore be understood as concepts of local admissibility. For certain criteria, we provide linear programming based algorithms for checking optimality/admissibility of acts. Additionally, the paper includes a discussion of a prototypical situation by means of a toy example.
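
    As a rough illustration of a generalized expectation interval (not the paper's criteria: the credal set, the acts, and the fully specified cardinal utilities below are assumptions made for this sketch), one can take lower and upper expectations over the extreme points of a finitely generated credal set and compare acts interval-wise:

```python
# Two states of nature; credal set given by two (made-up) extreme points.
credal_extremes = [(0.3, 0.7), (0.6, 0.4)]

# Cardinal utilities of two acts in each state (fully known here, unlike the
# merely partial cardinal information treated in the paper).
acts = {
    "a1": (10.0, 2.0),
    "a2": (5.0, 6.0),
}

def expectation(prob, util):
    return sum(p * u for p, u in zip(prob, util))

def expectation_interval(util):
    """Lower/upper expectation over the finitely generated credal set."""
    values = [expectation(p, util) for p in credal_extremes]
    return min(values), max(values)

for name, util in acts.items():
    lo, hi = expectation_interval(util)
    print(f"{name}: generalized expectation interval [{lo:.2f}, {hi:.2f}]")

# One conservative comparison: prefer a1 over a2 if the lower expectation of
# a1 exceeds the upper expectation of a2 (interval dominance).
lo1, hi1 = expectation_interval(acts["a1"])
lo2, hi2 = expectation_interval(acts["a2"])
print("a1 interval-dominates a2:", lo1 > hi2)
```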

    A Probabilistic Evaluation Framework for Preference Aggregation Reflecting Group Homogeneity

    Groups differ in the homogeneity of their members' preferences. Reflecting this, we propose a probabilistic criterion for evaluating and comparing the adequacy of preference aggregation procedures that takes into account information on the considered group's homogeneity structure. Further, we discuss two approaches for approximating our criterion if information is only imperfectly given, and show how to estimate these approximations from data. As a preparation, we elaborate some general minimal requirements for measuring homogeneity and discuss a specific proposal for a homogeneity measure. Finally, we investigate our framework by comparing aggregation rules in a simulation study.
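
    The abstract does not spell out the proposed homogeneity measure; as one conceivable, purely illustrative summary (an assumption for this sketch, not the paper's proposal), the mean pairwise Kendall tau of the members' rankings can serve as a homogeneity score:

```python
from itertools import combinations

# Hypothetical group: each member's preference as a strict ranking (best first)
# over alternatives A, B, C.
rankings = [
    ["A", "B", "C"],
    ["A", "C", "B"],
    ["B", "A", "C"],
]

def kendall_tau(r1, r2):
    """Kendall rank correlation between two strict rankings of the same items."""
    pos1 = {x: i for i, x in enumerate(r1)}
    pos2 = {x: i for i, x in enumerate(r2)}
    concordant = discordant = 0
    for x, y in combinations(r1, 2):
        if (pos1[x] - pos1[y]) * (pos2[x] - pos2[y]) > 0:
            concordant += 1
        else:
            discordant += 1
    n_pairs = len(r1) * (len(r1) - 1) / 2
    return (concordant - discordant) / n_pairs

# Mean pairwise Kendall tau: close to 1 for near-identical preferences,
# lower for more heterogeneous groups.
taus = [kendall_tau(r1, r2) for r1, r2 in combinations(rankings, 2)]
print("mean pairwise Kendall tau:", sum(taus) / len(taus))
```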

    A simple descriptive method for multidimensional item response theory based on stochastic dominance

    In this paper we develop a descriptive concept of a (partially) ordinal joint scaling of items and persons in the context of (dichotomous) item response analysis. The developed method has to be understood as a purely descriptive method describing relations among the data observed in a given item response data set; it is not intended to directly measure some presumed underlying latent traits. We establish a hierarchy of pairs of item difficulty and person ability orderings that empirically support each other. The ordering principles we use for the construction are essentially related to the concept of first-order stochastic dominance. Our method is able to avoid a paradoxical result of multidimensional item response theory models described in Hooker et al. (2009). We introduce our concepts in the language of formal concept analysis. This is because our method has some similarities with formal concept analysis and knowledge space theory: both our method and descriptive techniques used in knowledge space theory (concretely, item tree analysis) can be seen as two different stochastic generalizations of formal implications from formal concept analysis.
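
    As a small sketch in the spirit of the item-tree-analysis style stochastic implications mentioned above (the response matrix and the conditional-probability statistic are assumptions for illustration, not the paper's method), one can quantify how strongly solving one item empirically implies solving another:

```python
import numpy as np

# Hypothetical dichotomous response matrix: rows = persons, columns = items.
X = np.array([
    [1, 1, 0],
    [1, 1, 1],
    [0, 1, 0],
    [1, 1, 1],
    [0, 0, 0],
])

def implication_strength(X, i, j):
    """Empirical P(item j solved | item i solved)."""
    solvers_i = X[:, i] == 1
    if not solvers_i.any():
        return float("nan")
    return float(X[solvers_i, j].mean())

n_items = X.shape[1]
for i in range(n_items):
    for j in range(n_items):
        if i != j:
            print(f"P(item {j} solved | item {i} solved) = "
                  f"{implication_strength(X, i, j):.2f}")
```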
