
    On CSP and the Algebraic Theory of Effects

    We consider CSP from the point of view of the algebraic theory of effects, which classifies operations as effect constructors or effect deconstructors; it also provides a link with functional programming, being a refinement of Moggi's seminal monadic point of view. There is a natural algebraic theory of the constructors whose free algebra functor is Moggi's monad; we illustrate this by characterising free and initial algebras in terms of two versions of the stable failures model of CSP, one more general than the other. Deconstructors are dealt with as homomorphisms to (possibly non-free) algebras. One can view CSP's action and choice operators as constructors and the rest, such as concealment and concurrency, as deconstructors. Carrying this programme out results in taking deterministic external choice as constructor rather than general external choice. However, binary deconstructors, such as the CSP concurrency operator, provide unresolved difficulties. We conclude by presenting a combination of CSP with Moggi's computational λ-calculus, in which the operators, including concurrency, are polymorphic. While the paper mainly concerns CSP, it ought to be possible to carry over similar ideas to other process calculi.
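    As an illustrative sketch (ours, not the paper's, and in Python rather than the computational λ-calculus), the constructor/deconstructor split can be modeled by building process terms from action prefix and deterministic external choice, then interpreting them with a homomorphism into trace sets:

```python
# Illustrative sketch: CSP-style effect constructors as a free term algebra,
# with a deconstructor given as a homomorphism from terms to trace sets.
# This is our toy model of the idea, not the paper's formal development.

class Stop:
    """The deadlocked process: performs no actions."""
    def traces(self):
        return {()}

class Prefix:
    """Action prefix a -> P (an effect constructor)."""
    def __init__(self, action, cont):
        self.action, self.cont = action, cont
    def traces(self):
        return {()} | {(self.action,) + t for t in self.cont.traces()}

class Choice:
    """Deterministic external choice between branches with distinct initial actions."""
    def __init__(self, left, right):
        self.left, self.right = left, right
    def traces(self):
        return self.left.traces() | self.right.traces()

# traces() interprets each constructor in the (non-free) algebra of trace sets,
# i.e. it is a homomorphism out of the free algebra of process terms.
p = Choice(Prefix("a", Prefix("b", Stop())), Prefix("c", Stop()))
print(sorted(p.traces()))  # [(), ('a',), ('a', 'b'), ('c',)]
```

    A deconstructor such as concealment would likewise be defined by structural recursion over the constructors; the paper's difficulty with binary deconstructors (e.g. concurrency) is precisely that they do not fit this one-term recursion pattern.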

    Atmospheric emissions from the Deepwater Horizon spill constrain air-water partitioning, hydrocarbon fate, and leak rate

    The fate of deepwater releases of gas and oil mixtures is initially determined by the solubility and volatility of individual hydrocarbon species; these attributes determine partitioning between air and water. Quantifying this partitioning is necessary to constrain simulations of gas and oil transport, to predict marine bioavailability of different fractions of the gas-oil mixture, and to develop a comprehensive picture of the fate of leaked hydrocarbons in the marine environment. Analysis of airborne atmospheric data shows massive amounts (∼258,000 kg/day) of hydrocarbons evaporating promptly from the Deepwater Horizon spill; these data, collected during two research flights, constrain the air-water partitioning, and thus the bioavailability and fate, of the leaked fluid. This analysis quantifies the fraction of surfacing hydrocarbons that dissolves in the water column (∼33% by mass), the fraction that does not dissolve, and the fraction that evaporates promptly after surfacing (∼14% by mass). We do not quantify the leaked fraction lacking a surface expression; therefore, calculation of atmospheric mass fluxes provides a lower limit to the total hydrocarbon leak rate of 32,600 to 47,700 barrels of fluid per day, depending on reservoir fluid composition information. This study demonstrates a new approach for rapid-response airborne assessment of future oil spills. Copyright 2011 by the American Geophysical Union.
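    The shape of the mass-balance argument can be sketched as back-of-the-envelope arithmetic. The barrel conversion factor and the way the evaporated fraction is applied below are our assumptions, so the number produced illustrates the structure of the calculation only and will not reproduce the paper's 32,600-47,700 barrel range:

```python
# Back-of-the-envelope sketch of the mass-balance reasoning in the abstract.
# KG_PER_BARREL (and the fluid density it implies) is an assumption of ours,
# as is applying the ~14% prompt-evaporation fraction directly to the
# surfacing mass -- this shows the shape of the calculation, not the paper's.

KG_PER_BARREL = 136.0  # assumed: ~0.159 m^3 per barrel at ~855 kg/m^3

def surfacing_mass(evaporated_kg_per_day, evaporated_fraction):
    """Total surfacing mass implied by the promptly evaporated share."""
    return evaporated_kg_per_day / evaporated_fraction

def leak_lower_bound_barrels(evaporated_kg_per_day, evaporated_fraction):
    """Barrels/day of surfacing fluid: a lower bound on the total leak,
    since fluid with no surface expression is not counted."""
    return surfacing_mass(evaporated_kg_per_day, evaporated_fraction) / KG_PER_BARREL

print(round(leak_lower_bound_barrels(258_000, 0.14)))  # ~13,550 under these assumptions
```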

    Public Evidence from Secret Ballots

    Elections seem simple---aren't they just counting? But they have a unique, challenging combination of security and privacy requirements. The stakes are high; the context is adversarial; the electorate needs to be convinced that the results are correct; and the secrecy of the ballot must be ensured. They also have practical constraints: time is of the essence, and voting systems need to be affordable, maintainable, and usable by voters, election officials, and pollworkers. It is thus not surprising that voting is a rich research area spanning theory, applied cryptography, practical systems analysis, usable security, and statistics. Election integrity involves two key concepts: convincing evidence that outcomes are correct, and privacy, which amounts to convincing assurance that there is no evidence about how any given person voted. These are obviously in tension. We examine how current systems walk this tightrope. Comment: To appear in E-Vote-Id '1

    Reduction in young male suicide in Scotland

    Background: Rates of suicide and undetermined death increased rapidly in Scotland in the 1980s and 1990s. The largest increases were in men, with a marked rise in rates in younger age groups; this was associated with an increase in hanging as a method of suicide. National suicide prevention work has identified young men as a priority group. Routinely collected national information suggested a decrease in suicide rates in younger men at the beginning of the 21st century. This study tested whether this was a significant change in trend, and whether it was associated with any change in hanging rates in young men.

    Methods: Joinpoint regression was used to estimate annual percentage changes in age-specific rates of suicide and undetermined-intent death, and to identify times when the trends changed significantly. Rates of deaths by method in 15-29-year-old males and females were also examined to assess whether there had been any significant changes in method use in this age group.

    Results: There was a 42% reduction in rates in 15-29-year-old men, from 42.5/100,000 in 2000 to 24.5/100,000 in 2004. A joinpoint analysis confirmed that this was a significant change in trend. There was also a significant change in trend in hanging in men in this age group, with a reduction in rates after 2000. No other male age group showed a significant change in trend over the period 1980-2004. There was a smaller reduction in suicide rates in women in the 15-29-year-old age group, with a reduction in hanging from 2002.

    Conclusion: There has been a reduction in suicide rates in men aged 15-29 years, and this is associated with a significant reduction in deaths by hanging in this age group. It is not clear whether this reflects a change in method preference or an overall reduction in suicidal behaviour; review of self-harm data will be required to investigate this further.
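    The quantity joinpoint regression estimates on each trend segment, the annual percentage change, can be sketched as the slope of a log-linear least-squares fit. The yearly rates below are invented to echo the reported fall from 42.5 to 24.5 per 100,000 between 2000 and 2004; real joinpoint software also searches for the change points themselves and tests their significance:

```python
# Sketch of the annual-percentage-change statistic behind joinpoint regression.
# The data are hypothetical, chosen only to echo the abstract's endpoints.
import math

def annual_percent_change(rates):
    """Least-squares slope of log(rate) vs. year index, as a percentage."""
    n = len(rates)
    xs = range(n)
    ys = [math.log(r) for r in rates]
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return 100 * (math.exp(slope) - 1)

# Hypothetical rates per 100,000 for 2000-2004.
rates_2000_2004 = [42.5, 37.0, 32.5, 28.0, 24.5]
print(round(annual_percent_change(rates_2000_2004), 1))  # roughly -13% per year
```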

    The glyoxal budget and its contribution to organic aerosol for Los Angeles, California, during CalNex 2010

    Recent laboratory and field studies have indicated that glyoxal is a potentially large contributor to secondary organic aerosol mass. We present in situ glyoxal measurements acquired with a recently developed, high-sensitivity spectroscopic instrument during the CalNex 2010 field campaign in Pasadena, California. We use three methods to quantify the production and loss of glyoxal in Los Angeles and its contribution to organic aerosol. First, we calculate the difference between steady-state sources and sinks of glyoxal at the Pasadena site, assuming that the remainder is available for aerosol uptake. Second, we use the Master Chemical Mechanism to construct a two-dimensional model for gas-phase glyoxal chemistry in Los Angeles, assuming that the difference between the modeled and measured glyoxal concentration is available for aerosol uptake. Third, we examine the nighttime loss of glyoxal in the absence of its photochemical sources and sinks. Using these methods we constrain the glyoxal loss to aerosol to be 0-5 × 10⁻⁔ s⁻¹ during clear days and (1 ± 0.3) × 10⁻⁔ s⁻¹ at night. Between 07:00-15:00 local time, the diurnally averaged secondary organic aerosol mass increases from 3.2 ÎŒg m⁻³ to a maximum of 8.8 ÎŒg m⁻³. The constraints on the glyoxal budget from this analysis indicate that it contributes 0-0.2 ÎŒg m⁻³, or 0-4%, of the secondary organic aerosol mass. Copyright 2011 by the American Geophysical Union.
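    The first method, closing the steady-state budget, can be sketched as follows. All rate values here are invented placeholders, not CalNex measurements; the point is only that at steady state production balances total loss, so the residual after the known gas-phase sinks is attributed to aerosol uptake:

```python
# Sketch of a steady-state budget closure: P = (k_gas_total + k_aerosol) * C,
# so k_aerosol = P / C - k_gas_total. All numbers below are invented
# placeholders, not CalNex 2010 data.

def aerosol_loss_rate(production, gas_phase_loss_rates, concentration):
    """First-order aerosol-uptake rate (s^-1) that closes the budget."""
    k_gas = sum(gas_phase_loss_rates)
    return production / concentration - k_gas

# Hypothetical values: production 2.4e5 molecules cm^-3 s^-1, photolysis and
# OH-reaction sinks of 1e-5 and 5e-6 s^-1, ambient glyoxal 1e10 molecules cm^-3.
k_aer = aerosol_loss_rate(2.4e5, [1e-5, 5e-6], 1e10)
print(f"{k_aer:.1e}")  # 9.0e-06 s^-1, within the 0-5e-5 daytime constraint
```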

    PocketMatch: A new algorithm to compare binding sites in protein structures

    Background: Recognizing similarities and deriving relationships among protein molecules is a fundamental requirement in present-day biology. Similarities can be present at various levels and can be detected through comparison of protein sequences or their structural folds. In some cases, similarities obscured at these levels may be present merely in the substructures at their binding sites. Inferring functional similarities between protein molecules by comparing their binding sites is still largely exploratory and not yet a routine protocol. One of the main reasons for this is the limited choice of analytical tools that can compare binding sites with high sensitivity. To benefit from the enormous amount of structural data that is rapidly accumulating, it is essential to have high-throughput tools that enable large-scale binding-site comparison.

    Results: Here we present a new algorithm, PocketMatch, for comparison of binding sites in a frame-invariant manner. Each binding site is represented by 90 lists of sorted distances capturing the shape and chemical nature of the site. The sorted arrays are then aligned using an incremental alignment method and scored to obtain PMScores for pairs of sites. A comprehensive sensitivity analysis and an extensive validation of the algorithm have been carried out. Perturbation studies, in which the geometry of a given site was retained but the residue types were changed randomly, indicated that chance similarities are virtually non-existent. Our analysis also demonstrates that shape information alone is insufficient to discriminate between diverse binding sites unless combined with the chemical nature of the amino acids.

    Conclusions: A new algorithm has been developed to compare binding sites in an accurate, efficient, and high-throughput manner. Though the representation used is conceptually simple, we demonstrate that, along with the new alignment strategy, it is sufficient to enable binding-site comparison with high sensitivity. A novel methodology has also been presented for validating the algorithm for accuracy and sensitivity with respect to the geometry and chemical nature of a site. The method is also fast, taking about 1/250th of a second per comparison on a single processor; a parallel version has been implemented on BlueGene.
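    A much-simplified sketch of the representation and scoring follows. The real PocketMatch uses 90 typed distance lists built from several points per residue, plus a specific incremental alignment and PMScore definition; the grouping, tolerance, and score below are our stand-ins:

```python
# Simplified sketch of sorted-distance-list site comparison, in the spirit of
# PocketMatch. The typing scheme, tolerance, and score are our assumptions.
from itertools import combinations
import math

def site_signature(atoms):
    """atoms: list of (chemical_type, (x, y, z)). Returns sorted distance
    lists keyed by unordered type pair -- a frame-invariant representation."""
    sig = {}
    for (t1, p1), (t2, p2) in combinations(atoms, 2):
        key = tuple(sorted((t1, t2)))
        sig.setdefault(key, []).append(math.dist(p1, p2))
    return {k: sorted(v) for k, v in sig.items()}

def pm_score(sig_a, sig_b, tol=0.5):
    """Fraction of distances matched greedily within tol (crude PMScore stand-in)."""
    matched = total = 0
    for key in set(sig_a) | set(sig_b):
        a, b = sig_a.get(key, []), sig_b.get(key, [])
        total += max(len(a), len(b))
        i = j = 0
        while i < len(a) and j < len(b):  # merge-style incremental alignment
            if abs(a[i] - b[j]) <= tol:
                matched += 1; i += 1; j += 1
            elif a[i] < b[j]:
                i += 1
            else:
                j += 1
    return matched / total if total else 0.0

site = [("N", (0, 0, 0)), ("O", (3, 0, 0)), ("C", (0, 4, 0))]
print(pm_score(site_signature(site), site_signature(site)))  # 1.0 for identical sites
```

    Because the signature depends only on pairwise distances and atom types, it is unchanged by rotation and translation of the site, which is what "frame invariant" buys.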

    Toward High-Precision Measures of Large-Scale Structure

    I review some results of estimation of the power spectrum of density fluctuations from galaxy redshift surveys and discuss advances that may be possible with the Sloan Digital Sky Survey. I then examine the realities of power spectrum estimation in the presence of Galactic extinction, photometric errors, galaxy evolution, clustering evolution, and uncertainty about the background cosmology. Comment: 24 pages, including 11 postscript figures. Uses crckapb.sty (included in submission). To appear in "Ringberg Workshop on Large-Scale Structure," ed. D. Hamilton (Kluwer, Amsterdam), p. 39

    CMASA: an accurate algorithm for detecting local protein structural similarity and its application to enzyme catalytic site annotation

    Background: The rapid development of structural genomics has resulted in many "unknown function" proteins being deposited in the Protein Data Bank (PDB); the functional prediction of these proteins has thus become a challenge for structural bioinformatics. Several sequence-based and structure-based methods have been developed to predict protein function, but these methods need further improvement in accuracy, sensitivity, and computational speed. Here, an accurate algorithm, CMASA (Contact MAtrix based local Structural Alignment algorithm), has been developed to predict unknown functions of proteins based on local protein structural similarity. The algorithm has been evaluated on a test set of 164 enzyme families and compared to other methods.

    Results: The evaluation shows that CMASA is highly accurate (0.96), sensitive (0.86), and fast enough to be used in large-scale functional annotation. Compared to both sequence-based and global structure-based methods, CMASA can not only find remote homologous proteins but also detect active-site convergence. Compared to other local structure comparison-based methods, CMASA performs better than both FFF (a method using geometry to predict protein function) and SPASM (a local structure alignment method); it is more sensitive than PINTS and more accurate than JESS (both local structure alignment methods). CMASA was applied to annotate the enzyme catalytic sites of the non-redundant PDB, and at least 166 putative catalytic sites were suggested; these sites are not recorded in the Catalytic Site Atlas (CSA).

    Conclusions: CMASA is an accurate algorithm for detecting local protein structural similarity, and it holds several advantages in predicting enzyme active sites. CMASA can be used in large-scale enzyme active-site annotation, and is available via a mail-based server (http://159.226.149.45/other1/CMASA/CMASA.htm).
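    A toy sketch of comparing local structures via contact matrices, in the spirit of (but not reproducing) CMASA, can look as follows; the distance cutoff and the agreement score are our assumptions, not the published parameters:

```python
# Toy contact-matrix comparison. The 8 A cutoff and entry-agreement score are
# illustrative assumptions; CMASA's actual alignment and scoring differ.
import math

def contact_matrix(coords, cutoff=8.0):
    """Binary residue-residue contact matrix from C-alpha coordinates."""
    n = len(coords)
    return [[1 if i != j and math.dist(coords[i], coords[j]) <= cutoff else 0
             for j in range(n)] for i in range(n)]

def matrix_agreement(m1, m2):
    """Fraction of identical entries between two equal-sized contact matrices."""
    n = len(m1)
    same = sum(m1[i][j] == m2[i][j] for i in range(n) for j in range(n))
    return same / (n * n)

# Two hypothetical three-residue sites: they share one contact (residues 0-1)
# but differ in whether residues 1 and 2 are in contact.
a = [(0, 0, 0), (5, 0, 0), (20, 0, 0)]
b = [(0, 0, 0), (5, 0, 0), (9, 0, 0)]
print(matrix_agreement(contact_matrix(a), contact_matrix(b)))
```

    Like the distance-list representation above it, a contact matrix is invariant under rotation and translation, which is why it suits local structural comparison.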

    Discriminative structural approaches for enzyme active-site prediction

    Background: Predicting enzyme active sites in proteins is an important issue not only for protein science but also for a variety of practical applications such as drug design. Because enzyme reaction mechanisms are based on the local structures of enzyme active sites, various template-based methods that compare local structures in proteins have been developed to date. In comparing such local sites, a simple measure, RMSD, has been used so far.

    Results: This paper introduces new machine learning algorithms that refine the similarity/deviation measure used for comparison of local structures. The refined measure is applied to two types of analysis: single-template analysis and multiple-template analysis. In single-template analysis, a single template is used as a query to search proteins for active sites, whereas in multiple-template analysis a protein structure is examined as a query to discover possible active sites using a set of templates.

    Conclusions: This paper experimentally illustrates that the machine learning algorithms effectively improve the similarity/deviation measurements for both analyses.
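    The RMSD baseline that the learned measures refine can be sketched directly. Here the coordinate sets are assumed already superposed (the usual Kabsch rotation step is omitted), and the coordinates are invented:

```python
# Minimal RMSD between corresponding atoms of two local sites, assuming the
# sites are already superposed (no Kabsch alignment). Coordinates are made up.
import math

def rmsd(site_a, site_b):
    """Root-mean-square deviation over corresponding atom positions."""
    assert len(site_a) == len(site_b)
    sq = sum(math.dist(p, q) ** 2 for p, q in zip(site_a, site_b))
    return math.sqrt(sq / len(site_a))

template = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (0.0, 2.0, 0.0)]
candidate = [(0.1, 0.0, 0.0), (1.4, 0.0, 0.0), (0.0, 2.2, 0.0)]
print(round(rmsd(template, candidate), 3))  # 0.141 -- small, so a likely match
```

    A single scalar like this treats every atom and direction of deviation equally, which is exactly the limitation the paper's learned similarity/deviation measures address.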
