
    Encryption of Covert Information into Multiple Statistical Distributions

    A novel strategy to encrypt covert information (code) via unitary projections into the null spaces of ill-conditioned eigenstructures of multiple host statistical distributions, inferred from incomplete constraints, is presented. The host pdf's are inferred using the maximum entropy principle. The projection of the covert information depends upon the pdf's of the host statistical distributions. The security of the encryption/decryption strategy is based on the extreme instability of the encoding process. A self-consistent procedure to derive keys for both symmetric and asymmetric cryptography is presented. The advantages of using a multiple-pdf model to achieve encryption of covert information are briefly highlighted. Numerical simulations exemplify the efficacy of the model.
    Comment: 18 pages, 4 figures. Three sentences expanded to emphasize detail. Typos corrected.
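
    The null-space idea can be illustrated with a toy sketch. This is not the paper's scheme (which uses maximum-entropy pdf's and ill-conditioned eigenstructures); it only shows why an embedding along a null-space direction is invisible to the host map. The 3-dimensional setup, matrix, and names below are all illustrative assumptions.

```python
import math

# Toy sketch: hide a scalar in the null space of a host matrix.
# Host "measurement" matrix A: R^3 -> R^2 (rows are the host constraints).
A = [[1.0, 1.0, 0.0],
     [0.0, 1.0, 1.0]]

# A unit vector n_hat with A @ n_hat = 0 (found by hand for this A).
n = [1.0, -1.0, 1.0]
norm = math.sqrt(sum(x * x for x in n))
n_hat = [x / norm for x in n]

def matvec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def encode(host_vec, secret):
    """Add the secret along the null direction; A cannot see it."""
    return [h + secret * c for h, c in zip(host_vec, n_hat)]

def decode(carrier):
    """Project onto n_hat; the host part (in the row space of A) drops out."""
    return sum(c * e for c, e in zip(carrier, n_hat))

host = [2.0, 3.0, 1.0]          # lies in the row space of A
carrier = encode(host, 0.75)

# The host measurements agree before and after embedding (up to rounding):
print(matvec(A, host), matvec(A, carrier))
print(round(decode(carrier), 6))   # -> 0.75
```

    The point of the sketch is the invariance A·carrier = A·host: anyone who only sees the host constraints learns nothing about the embedded value.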

    The PLC: a logical development

    Programmable Logic Controllers (PLCs) have been used to control industrial processes and equipment for over 40 years, having their first commercially recognised application in 1969. Since then there have been enormous changes in the design and application of PLCs, yet developments were evolutionary rather than radical. The flexibility of the PLC does not confine it to industrial use, and it has been used for disparate non-industrial control applications. This article reviews the history, development and industrial applications of the PLC.

    T violation and the unidirectionality of time

    An increasing number of experiments at the Belle, BNL, CERN, DAΦNE and SLAC accelerators are confirming the violation of time reversal invariance (T). The violation signifies a fundamental asymmetry between the past and future and calls for a major shift in the way we think about time. Here we show that processes which violate T symmetry induce destructive interference between different paths that the universe can take through time. The interference eliminates all paths except for two that represent continuously forwards and continuously backwards time evolution. Evidence from the accelerator experiments indicates which path the universe is effectively following. This work may provide fresh insight into the long-standing problem of modeling the dynamics of T violation processes. It suggests that T violation has previously unknown, large-scale physical effects and that these effects underlie the origin of the unidirectionality of time. It may have implications for the Wheeler-DeWitt equation of canonical quantum gravity. Finally, it provides a view of the quantum nature of time itself.
    Comment: 24 pages, 5 figures. Final version accepted for publication in Foundations of Physics. The final publication is available at http://www.springerlink.com/content/y3h4174jw2w78322

    Brane Interaction as the Origin of Inflation

    We reanalyze brane inflation with brane-brane interactions at an angle, which include the special case of brane-anti-brane interaction. If nature is described by a stringy realization of the brane world scenario today (with arbitrary compactification), and if some additional branes were present in the early universe, we find that an inflationary epoch is generically quite natural, ending with a big bang when the last branes collide. In an interesting brane inflationary scenario suggested by generic string model-building, we use the density perturbation observed in the cosmic microwave background and the coupling unification to find that the string scale is comparable to the GUT scale.
    Comment: 28 pages, 8 figures, 2 tables, JHEP format

    Does a computer have an arrow of time?

    In [Sch05a], it is argued that Boltzmann's intuition, that the psychological arrow of time is necessarily aligned with the thermodynamic arrow, is correct. Schulman gives an explicit physical mechanism for this connection, based on the brain being representable as a computer, together with certain thermodynamic properties of computational processes. [Haw94] presents similar, if briefer, arguments. The purpose of this paper is to critically examine the support for the link between thermodynamics and an arrow of time for computers. The principal arguments put forward by Schulman and Hawking will be shown to fail. It will be shown that any computational process that can take place in an entropy-increasing universe can equally take place in an entropy-decreasing universe. This conclusion does not automatically imply that a psychological arrow can run counter to the thermodynamic arrow. Some alternative possible explanations for the alignment of the two arrows will be briefly discussed.
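
    The independence of computation from a preferred time direction can be connected to logical reversibility. As a minimal sketch (my illustration, not an argument from [Sch05a] or [Haw94]), the Toffoli gate is universal for classical computation yet is its own inverse, so any circuit built from it can be run equally well forwards or backwards:

```python
from itertools import product

def toffoli(a, b, c):
    """Controlled-controlled-NOT: flip c iff a and b are both 1.
    Logically reversible and self-inverse."""
    return a, b, c ^ (a & b)

# Applying the gate twice restores every possible input state:
for state in product((0, 1), repeat=3):
    assert toffoli(*toffoli(*state)) == state

# AND can be computed reversibly: with c = 0 the output carries a & b
# alongside a copy of the inputs, so no information is erased.
print(toffoli(1, 1, 0))   # -> (1, 1, 1)
```

    Because no bits are erased, running such a circuit in reverse is as physically unobjectionable as running it forwards, which is the kind of symmetry the paper's argument trades on.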

    Gödel Incompleteness and the Black Hole Information Paradox

    Semiclassical reasoning suggests that the process by which an object collapses into a black hole and then evaporates by emitting Hawking radiation may destroy information, a problem often referred to as the black hole information paradox. Further, there seems to be no unique prediction of where the information about the collapsing body is localized. We propose that the latter aspect of the paradox may be a manifestation of an inconsistent self-reference in the semiclassical theory of black hole evolution. This suggests the inadequacy of the semiclassical approach or, at worst, that standard quantum mechanics and general relativity are fundamentally incompatible. One option for resolving the localization aspect of the paradox is to identify the Gödel-like incompleteness that corresponds to an imposition of consistency, and to introduce possibly new physics that supplies this incompleteness. Another option is to modify the theory in such a way as to prohibit self-reference. We discuss various possible scenarios to implement these options, including eternally collapsing objects, black hole remnants, black hole final states, and simple variants of semiclassical quantum gravity.
    Comment: 14 pages, 2 figures; revised according to journal requirements

    Breakdown of the Landauer bound for information erasure in the quantum regime

    A known aspect of the Clausius inequality is that an equilibrium system subjected to a squeezing $\Delta S$ of its entropy must release at least an amount $|\bar{d}Q| = T|\Delta S|$ of heat. This serves as a basis for the Landauer principle, which puts a lower bound $T\ln 2$ on the heat generated by erasure of one bit of information. Here we show that in the world of quantum entanglement this law is broken. A quantum Brownian particle interacting with its thermal bath can either generate less heat or even {\it absorb} heat during an analogous squeezing process, due to entanglement with the bath. The effect exists even for weak but fixed coupling with the bath, provided that the temperature is low enough. This invalidates the Landauer bound in the quantum regime, and suggests that quantum carriers of information can be much more efficient than assumed so far.
    Comment: 13 pages, revtex, 2 eps figures
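
    For scale, the classical bound $T\ln 2$ (restoring $k_B$) can be evaluated directly; the room-temperature number below is a standard textbook value, not a result of this paper:

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K (exact in the 2019 SI)

def landauer_bound(temperature_kelvin):
    """Minimum heat released when erasing one bit: k_B * T * ln 2."""
    return K_B * temperature_kelvin * math.log(2)

# At room temperature (300 K) the bound is a few zeptojoules:
print(f"{landauer_bound(300):.3e} J")   # -> 2.871e-21 J
```

    The quantum regime discussed in the abstract is precisely where this classical floor can be undercut.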

    Equation of state for Universe from similarity symmetries

    In this paper we propose to use group analysis of the symmetries of the dynamical system to describe the evolution of the Universe. This method is used in searching for the unknown equation of state. It is shown that the group of symmetries enforces the form of the equation of state for noninteracting scaling multifluids. We show that the symmetries give rise to the equation of state in the form $p=-\Lambda+w_{1}\rho(a)+w_{2}a^{\beta}+0$ and energy density $\rho=\Lambda+\rho_{01}a^{-3(1+w)}+\rho_{02}a^{\beta}+\rho_{03}a^{-3}$, which is commonly used in cosmology. The FRW model filled with scaling fluid (called homological) is confronted with the observations of distant type Ia supernovae. We find the class of model parameters admissible by the statistical analysis of the SNIa data. We show that the model with scaling fluid fits the supernovae data well. We find that $\Omega_{\text{m},0} \simeq 0.4$ and $n \simeq -1$ ($\beta = -3n$), which can correspond to (hyper)phantom fluid and to a high-density universe. However, if we assume the prior $\Omega_{\text{m},0}=0.3$, then the favoured model is close to the concordance $\Lambda$CDM model. Our results predict that in the considered model with scaling fluids distant type Ia supernovae should be brighter than in the $\Lambda$CDM model, while SNIa at intermediate distances should be fainter than in the $\Lambda$CDM model. We also investigate whether the model with scaling fluid is actually preferred by the data over the $\Lambda$CDM model. As a result we find that the Akaike model selection criterion prefers the model with noninteracting scaling fluid.
    Comment: accepted for publication version
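
    The quoted energy density is a sum of power laws in the scale factor $a$, so each component's scaling can be evaluated directly. A minimal sketch, with all parameter values chosen for illustration only (they are not the paper's fits; $\beta = 3$ corresponds to the quoted $n \simeq -1$ via $\beta = -3n$):

```python
def rho(a, Lam=0.7, rho01=0.25, w=0.0, rho02=0.05, beta=3.0, rho03=0.0):
    """Energy density from the abstract's form:
    rho(a) = Lambda + rho01*a^{-3(1+w)} + rho02*a^{beta} + rho03*a^{-3}.
    All parameter values here are illustrative assumptions."""
    return (Lam
            + rho01 * a ** (-3 * (1 + w))
            + rho02 * a ** beta
            + rho03 * a ** (-3))

# Today (a = 1) the components simply add up (here ~1.0 in critical units):
print(rho(1.0))

# At earlier times (a < 1) the a^{-3(1+w)} matter-like term dominates:
print(rho(0.5) > rho(1.0))   # -> True
```

    Writing the density this way makes the confrontation with SNIa data a matter of fitting the exponents and amplitudes of the individual power laws.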

    Effects of inhomogeneities on apparent cosmological observables: "fake" evolving dark energy

    Using the exact Lemaître-Tolman-Bondi solution with a non-vanishing cosmological constant $\Lambda$, we investigate how the presence of a local spherically-symmetric inhomogeneity can affect apparent cosmological observables, such as the deceleration parameter or the effective equation of state of dark energy (DE), derived from the luminosity distance under the assumption that the real space-time is exactly homogeneous and isotropic. The presence of a local underdensity is found to produce apparent phantom behavior of DE, while a locally overdense region leads to apparent quintessence behavior. We consider relatively small large-scale inhomogeneities which today are not linear and could be seeded by primordial curvature perturbations compatible with CMB bounds. Our study shows how observations in an inhomogeneous $\Lambda$CDM universe with initial conditions compatible with an inflationary beginning, if interpreted under the wrong assumption of homogeneity, can lead to the wrong conclusion about the presence of "fake" evolving dark energy instead of $\Lambda$.
    Comment: 22 pages, 19 figures. Final version to appear in European Physical Journal

    Emotion, Meaning, and Appraisal Theory

    According to psychological emotion theories referred to as appraisal theory, emotions are caused by appraisals (evaluative judgments). Borrowing a term from Jan Smedslund, it is the contention of this article that psychological appraisal theory is “pseudoempirical” (i.e., misleadingly or incorrectly empirical). In the article I outline what makes some scientific psychology “pseudoempirical,” distinguish my view on this from Jan Smedslund’s, and then go on to show why paying heed to the ordinary meanings of emotion terms is relevant to psychology, and how appraisal theory is methodologically off the mark in employing experiments, questionnaires, and the like, to investigate what follows from the ordinary meanings of words. The overarching argument of the article is that the scientific research program of appraisal theory is fundamentally misguided and that a more philosophical approach is needed to address the kinds of questions it seeks to answer.