
    A command and data subsystem for deep space exploration based on the RCA 1802 microprocessor in a distributed configuration

    The Command and Data Subsystem (CDS) is an RCA 1802 microprocessor-based subsystem that acts as the central nervous system of the Galileo Orbiter spacecraft. All communication between the ground and the spacecraft flows through the CDS. The CDS also distributes commands in real time, algorithmically expanded from a database loaded from the ground and in response to spacecraft alarms. The distributed microprocessor system is configured as a redundant set of hardware with three microprocessors on each half. The microprocessors are surrounded by a group of special-purpose hardware components that greatly enhance the ability of the software to perform its task. It is shown how the software architecture makes a distributed system of six microprocessors appear to each user as a single virtual machine, and collectively as a set of cooperating virtual machines that prevent the several simultaneous users from interfering destructively with one another.
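
    The abstract gives no implementation details, so the following is only a minimal, hypothetical Python sketch of the stated idea that six redundant processors can be hidden behind a single virtual-machine interface; all class and method names (Processor, RedundantHalf, VirtualMachine, dispatch) are invented for illustration and do not correspond to the actual Galileo CDS software.

```python
# Hypothetical sketch of a "virtual machine" facade over a redundant,
# distributed processor set, loosely inspired by the CDS description.
# Names and dispatch policy are assumptions, not the flight software's design.

class Processor:
    def __init__(self, name):
        self.name = name
        self.log = []

    def execute(self, command):
        # The real subsystem would run stored command sequences; here we
        # simply record the command.
        self.log.append(command)
        return f"{self.name}: executed {command}"


class RedundantHalf:
    """One half of the redundant configuration, holding three processors."""
    def __init__(self, label):
        self.processors = [Processor(f"{label}-cpu{i}") for i in range(3)]

    def dispatch(self, command):
        # Simple hash-based dispatch stands in for whatever scheduling the
        # flight software actually used.
        proc = self.processors[hash(command) % len(self.processors)]
        return proc.execute(command)


class VirtualMachine:
    """Presents the six processors to each user as a single machine."""
    def __init__(self):
        self.halves = [RedundantHalf("A"), RedundantHalf("B")]

    def send(self, command):
        # Commands go to both halves so either half can take over on failure.
        return [half.dispatch(command) for half in self.halves]


if __name__ == "__main__":
    vm = VirtualMachine()
    for result in vm.send("OPEN_THRUSTER_VALVE"):
        print(result)
```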

    Inferring the Rate-Length Law of Protein Folding

    We investigate the rate-length scaling law of protein folding, a key undetermined scaling law in the analytical theory of protein folding. We demonstrate that chain length is a dominant factor determining folding times, and that an unambiguous determination of how chain length correlates with folding times could provide key mechanistic insight into the folding process. Four specific proposed laws (power law, exponential, and two stretched exponentials) are tested against one another, and it is found that the power law best explains the data. At the same time, the fitted power law results in rates that are very fast, nearly unreasonably so in a biological context. We show that any of the proposed forms is viable, conclude that more data are necessary to unequivocally infer the rate-length law, and that such data could be obtained through a small number of protein folding experiments on large protein domains.
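
    The abstract names the four candidate laws without writing them out; a common reading (assumed here, not taken from the paper) is a power law tau ~ N^b, an exponential tau ~ exp(b N), and stretched exponentials tau ~ exp(b N^(1/2)) and tau ~ exp(b N^(2/3)). Since each form is linear in log tau, a minimal sketch of comparing them on synthetic data by ordinary least squares looks like this:

```python
# Minimal sketch: comparing candidate rate-length laws on synthetic data.
# The functional forms and the synthetic data set are assumptions for
# illustration, not the folding-time data analysed in the paper.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic (chain length, log folding time) data generated from a power law
# with noise, standing in for experimental measurements.
N = np.array([40, 60, 80, 110, 150, 200, 270, 350], dtype=float)
log_tau = 0.5 + 3.0 * np.log(N) + rng.normal(scale=0.4, size=N.size)

# Each candidate law is linear in log(tau): log tau = log a + b * f(N).
candidates = {
    "power law      (f = log N)": np.log(N),
    "exponential    (f = N)": N,
    "stretched 1/2  (f = N^0.5)": N ** 0.5,
    "stretched 2/3  (f = N^2/3)": N ** (2.0 / 3.0),
}

for name, f in candidates.items():
    design = np.column_stack([np.ones_like(f), f])
    coef, residuals, *_ = np.linalg.lstsq(design, log_tau, rcond=None)
    rss = residuals[0] if residuals.size else 0.0
    print(f"{name}: log a = {coef[0]:.2f}, b = {coef[1]:.3g}, RSS = {rss:.3f}")
```

    The form with the lowest residual sum of squares fits the data best under this simple criterion; the paper's point is that with the currently available range of chain lengths, several of these forms remain statistically viable.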

    Smooth, identifiable supermodels of discrete DAG models with latent variables

    We provide a parameterization of the discrete nested Markov model, which is a supermodel that approximates DAG models (Bayesian network models) with latent variables. Such models are widely used in causal inference and machine learning. We explicitly evaluate their dimension, show that they are curved exponential families of distributions, and fit them to data. The parameterization avoids the irregularities and unidentifiability of latent variable models. The parameters used are all fully identifiable and causally interpretable quantities.
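
    As a toy illustration of the kind of model being approximated (not the paper's nested Markov parameterization), the following computes the observed margin of a small discrete DAG with one binary latent variable H acting as a common cause of two observed binary variables X1 and X2; the probability tables are made up for the example.

```python
# Toy illustration: a discrete DAG  X1 <- H -> X2  with a binary latent H,
# and the distribution it induces on the observed pair (X1, X2).
# The probability tables are arbitrary illustrative values.
import numpy as np

p_h = np.array([0.3, 0.7])                  # P(H)
p_x1_given_h = np.array([[0.9, 0.1],        # P(X1 | H=0)
                         [0.2, 0.8]])       # P(X1 | H=1)
p_x2_given_h = np.array([[0.6, 0.4],        # P(X2 | H=0)
                         [0.1, 0.9]])       # P(X2 | H=1)

# Joint over (H, X1, X2), then sum out the latent H to get the observed margin.
joint = p_h[:, None, None] * p_x1_given_h[:, :, None] * p_x2_given_h[:, None, :]
observed_margin = joint.sum(axis=0)         # P(X1, X2)

print(observed_margin)
independent = np.allclose(
    observed_margin,
    observed_margin.sum(1, keepdims=True) * observed_margin.sum(0, keepdims=True),
)
print("X1 and X2 dependent in the observed margin:", not independent)
```

    Working directly with the latent-variable parameterization of such models is what leads to the unidentifiability the abstract mentions, since different latent distributions can induce the same observed margin; the paper's parameterization instead works with identifiable quantities defined on the observed variables.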