
    Insights into information contained in multiplicative scatter correction parameters and the potential for estimating particle size from these parameters

    This paper investigates the nature of the information contained in scatter correction parameters. The study had two objectives. The first was to examine the nature and extent of the information contained in scatter correction parameters. The second was to examine whether this information can be effectively extracted, by proposing a method to obtain, in particular, the mean particle diameter from the scatter correction parameters. By using a combination of experimental data and simulated data generated from fundamental light propagation theory, a deeper and more fundamental insight is obtained into what information is removed by the multiplicative scatter correction (MSC) method. It was found that the MSC parameters are strongly influenced not only by particle size but also by particle concentration and the refractive index of the medium. The possibility of extracting particle size information in addition to particle concentration was considered by proposing a two-step method, which was tested using a 2-component and a 4-component data set. This method can, in principle, be used in conjunction with any scatter correction technique, provided that the scatter correction parameters exhibit a systematic dependence on particle size and concentration. It was found that the approach using the MSC parameters gave a better estimate of the particle diameter than partial least squares (PLS) regression for the 2-component data. For the 4-component data, PLS regression gave better results, but further examination indicated this was due to chance correlations of the particle diameter with two of the absorbing species in the mixture.
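
    Since the abstract centres on the MSC parameters themselves, a brief sketch of how those parameters arise may help: in standard MSC each spectrum is regressed on a reference (typically the mean) spectrum, and the fitted intercept and slope are the per-sample parameters whose particle-size and concentration dependence the paper analyses. The Python below is an illustrative implementation of this standard step only, not the authors' two-step method; the synthetic spectra are invented for the example.

```python
import numpy as np

def msc(spectra, reference=None):
    """Multiplicative scatter correction (MSC).

    Each spectrum x_i is regressed on a reference spectrum
    (here: the mean spectrum) as x_i ~ a_i + b_i * x_ref, and the
    corrected spectrum is (x_i - a_i) / b_i.  The per-sample
    parameters (a_i, b_i) are returned because they carry the
    scatter-related information discussed in the abstract.
    """
    X = np.asarray(spectra, dtype=float)           # shape (n_samples, n_wavelengths)
    x_ref = X.mean(axis=0) if reference is None else np.asarray(reference, dtype=float)

    corrected = np.empty_like(X)
    params = np.empty((X.shape[0], 2))             # columns: intercept a_i, slope b_i
    for i, x in enumerate(X):
        b, a = np.polyfit(x_ref, x, deg=1)         # ordinary least-squares fit of x on x_ref
        params[i] = (a, b)
        corrected[i] = (x - a) / b
    return corrected, params

# Example with synthetic spectra: scatter appears as additive and multiplicative offsets.
rng = np.random.default_rng(0)
base = np.sin(np.linspace(0, np.pi, 200))
spectra = np.array([0.1 * i + (1 + 0.2 * i) * base + rng.normal(scale=0.01, size=base.size)
                    for i in range(5)])
corrected, params = msc(spectra)
print(params)   # intercepts and slopes absorbed by the correction
```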

    Testing for Non-Normality in the Presence of One-Sided Slope Parameters

    In a recent paper, Hughes (1999) showed that the power of tests of linear regression parameters could be improved by utilizing one-sided information regarding the nuisance parameters in the testing problem. In this paper, we extend this principle to the problem of diagnosing departures from the assumption of normality in linear regression residuals. We show that the asymptotic theory of the popular normality test developed by Jarque and Bera (1987) is also applicable when inequality constraints are imposed on the slope parameters. Monte Carlo evidence is then presented which suggests that the size of tests based on inequality constrained residuals is roughly equivalent to the size of tests based on unconstrained residuals using both asymptotic and bootstrap critical values. We then demonstrate that significant improvements in the power of the Jarque-Bera test can be made via the application of one-sided information concerning the slope parameters in the model.
    Keywords: Jarque-Bera test; inequality constraints; power; bootstrap; Monte Carlo simulations
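
    For readers unfamiliar with the statistic under study, the Jarque-Bera statistic combines the skewness and excess kurtosis of the regression residuals. The Python sketch below computes it from unconstrained OLS residuals on simulated data; the paper's contribution, applying the same statistic to inequality-constrained residuals and bootstrapping its critical values, is not reproduced here, and the data-generating process is invented for the example.

```python
import numpy as np

def jarque_bera(residuals):
    """Jarque-Bera statistic JB = n/6 * (S^2 + (K - 3)^2 / 4),
    where S is the sample skewness and K the sample kurtosis of the
    residuals.  Under normality JB is asymptotically chi-squared
    with 2 degrees of freedom."""
    e = np.asarray(residuals, dtype=float)
    e = e - e.mean()
    n = e.size
    s2 = np.mean(e**2)
    skew = np.mean(e**3) / s2**1.5
    kurt = np.mean(e**4) / s2**2
    return n / 6.0 * (skew**2 + (kurt - 3.0)**2 / 4.0)

# Unconstrained OLS residuals for y = X @ beta + u (the paper compares these
# with residuals from an inequality-constrained fit of the slopes).
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(200), rng.normal(size=(200, 2))])
beta = np.array([1.0, 0.5, 0.8])
y = X @ beta + rng.normal(size=200)
resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
print(jarque_bera(resid))   # compare with the chi2(2) critical value, e.g. 5.99 at 5%
```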

    Utility versus Income-Based Altruism

    In Dictator Game experiments where the information status of the participants varies, we find that a certain type of proposer tends to reduce his offers when the recipient has incomplete information about the pie size. We also find that a certain type of recipient tends to reject offers that are too small in the Impunity Game when the proposer has incomplete information about the recipient's type. To explain these puzzling results, we reconsider Becker's [1974] theory of altruism, which assumes that externalities are caused by other people's utility. When incomplete information about the other person is introduced, it turns out that his approach, in contrast to other theories of altruism, predicts that some altruistic persons will change their behavior as observed in our experiments. Thus, a kind of utility-based altruism (and spite as its opposite form) can be assumed to be the main principle governing behavior in this class of games.
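
    The distinction in the title can be made concrete with a stylised pair of utility functions. This is an illustration in the spirit of Becker [1974], not the paper's own notation; the symbols below are assumptions made for the sketch.

```latex
% Stylised contrast between the two forms of altruism for a proposer P and a
% recipient R with payoffs x_P, x_R and recipient type \theta_R:
\begin{align*}
  \text{utility-based altruism:} \quad & U_P = u(x_P) + \alpha\, U_R(x_R, \theta_R), \\
  \text{income-based altruism:}  \quad & U_P = u(x_P) + \alpha\, v(x_R).
\end{align*}
% Under incomplete information the proposer maximises E[U_P], so only the
% utility-based form makes offers sensitive to beliefs about the recipient's
% type \theta_R (or about the pie size), which is the kind of behavioural
% shift the experiments pick up.
```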

    Protein Signaling Networks from Single Cell Fluctuations and Information Theory Profiling

    Protein signaling networks among cells play critical roles in a host of pathophysiological processes, from inflammation to tumorigenesis. We report on an approach that integrates microfluidic cell handling, in situ protein secretion profiling, and information theory to determine an extracellular protein-signaling network and the role of perturbations. We assayed 12 proteins secreted from human macrophages that were subjected to lipopolysaccharide challenge, which emulates the macrophage-based innate immune response against Gram-negative bacteria. We characterize the fluctuations in protein secretion of single cells, and of small cell colonies (n = 2, 3, ...), as a function of colony size. Measuring the fluctuations permits a validation of the conditions required for the application of a quantitative version of Le Chatelier's principle, as derived using information theory. This principle provides a quantitative prediction of the role of perturbations and allows a characterization of a protein-protein interaction network.
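
    As a rough illustration of the fluctuation-to-prediction step described above, the Python sketch below treats the data as a cells-by-proteins matrix, estimates the protein-protein covariance from single-cell fluctuations, and uses it for a first-order, covariance-weighted prediction of the response to a small perturbation. This reflects the generic maximum-entropy form of a quantitative Le Chatelier relation, not the paper's exact derivation; the data, perturbation vector, and dimensions are invented for the example.

```python
import numpy as np

# Minimal sketch of fluctuation profiling on a (cells x proteins) matrix of
# secretion levels.  The covariance of single-cell fluctuations is the central
# object: in a maximum-entropy / quantitative Le Chatelier picture, the shift
# in mean secretion under a small perturbation vector lam is predicted, to
# first order, by the covariance matrix acting on lam.
rng = np.random.default_rng(42)
n_cells, n_proteins = 500, 12
secretion = rng.lognormal(mean=1.0, sigma=0.5, size=(n_cells, n_proteins))

mean_secretion = secretion.mean(axis=0)
cov = np.cov(secretion, rowvar=False)      # protein-protein covariance (12 x 12)

# Hypothetical small perturbation (e.g. a challenge acting mainly on one
# protein), represented as a vector of conjugate "forces":
lam = np.zeros(n_proteins)
lam[0] = 0.1
predicted_shift = cov @ lam                # first-order predicted change in means
print(predicted_shift)
```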

    Raids and Imitation

    Many job changes occur without intervening spells of unemployment. A model is constructed in an attempt to understand this phenomenon. It implies that the best workers are hired away first because, with imperfect information, prices do not fully adjust for quality. Thus, a stigma develops, associated with failing to receive outside offers. The force of the stigma, which affects wages, depends upon the likelihood of discovering a worker's ability, the size of the market, and the speed of diffusion of information. In some occupations, it implies that pronounced differences in the treatment of raided and unraided workers develop quickly. A consequence is a theory of occupational wage dispersion. The Peter Principle, that workers are promoted to their level of incompetence, is a direct implication. The model can be applied to product markets as well, to explain the relationship between price and time on the shelf.

    Classical Equilibrium Thermostatistics, "Sancta sanctorum of Statistical Mechanics", From Nuclei to Stars

    Equilibrium statistics of Hamiltonian systems is correctly described by the microcanonical ensemble. Classically, this is the manifold of all points in the N-body phase space with the given total energy. By Boltzmann-Planck's principle, e^S = tr(\delta(E-H)), its geometrical size is related to the entropy S(E,N,V,...). This definition invokes no information theory, no thermodynamic limit, no extensivity, and no homogeneity assumption. It therefore describes the equilibrium statistics of extensive as well as of non-extensive systems, and for this reason it is the fundamental definition of any classical equilibrium statistics. It addresses nuclei and astrophysical objects as well. S(E,N,V,...) is multiply differentiable everywhere, even at phase transitions. All kinds of phase transitions can be distinguished sharply and uniquely, even for small systems. What is more important, in contrast to the canonical theory, the region of phase space which corresponds to phase separation, where the most interesting phenomena occur, is also accessible. No deformed q-entropy is needed for equilibrium. Boltzmann-Planck is the only appropriate statistics, independent of whether the system is small or large, and whether the system is ruled by short- or long-range forces.
    Comment: Invited paper for NEXT2003, 10 pages, 6 figures. Reference 1 corrected.
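
    For reference, the Boltzmann-Planck relation quoted above can be written out for a classical N-body system in its standard textbook form (with k_B = 1; the choice of units for the trace is an assumption of this sketch, not spelled out in the abstract):

```latex
\begin{align*}
  e^{S(E,N,V)} = \operatorname{tr}\,\delta(E - H)
               = \int \frac{d^{3N}p\, d^{3N}q}{N!\,(2\pi\hbar)^{3N}}\;
                 \delta\bigl(E - H(p,q)\bigr),
\end{align*}
% i.e. S(E,N,V) is the logarithm of the size of the energy shell in the
% N-body phase space; no thermodynamic limit, extensivity, or homogeneity
% assumption enters the definition.
```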

    Informational black holes in financial markets

    We study how efficient primary financial markets are in allocating capital when information about investment opportunities is dispersed across market participants. Paradoxically, the very fact that information is valuable for making real investment decisions destroys the efficiency of the market. To add to the paradox, as the number of market participants with useful information increases, a growing share of them fall into an "informational black hole," making markets even less efficient. Contrary to the predictions of standard theory, social surplus and the revenues of an entrepreneur seeking financing can be decreasing in the size of the financial market, the linkage principle of Milgrom and Weber (1982) may not hold, and collusion among investors may enhance efficiency.