
    Networked buffering: a basic mechanism for distributed robustness in complex adaptive systems

    A generic mechanism - networked buffering - is proposed for the generation of robust traits in complex systems. It requires two basic conditions to be satisfied: 1) agents are versatile enough to perform more than one functional role within a system and 2) agents are degenerate, i.e. there exists partial overlap in the functional capabilities of agents. Given these prerequisites, degenerate systems can readily produce a distributed systemic response to local perturbations. Reciprocally, excess resources related to a single function can indirectly support multiple unrelated functions within a degenerate system. In models of genome:proteome mappings for which localized decision-making and modularity of genetic functions are assumed, we verify that such distributed compensatory effects cause enhanced robustness of system traits. The conditions needed for networked buffering to occur are neither demanding nor rare, supporting the conjecture that degeneracy may fundamentally underpin distributed robustness within several biotic and abiotic systems. For instance, networked buffering offers new insights into systems engineering and planning activities that occur under high uncertainty. It may also help explain recent developments in understanding the origins of resilience within complex ecosystems.
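    The buffering argument can be made concrete with a toy sketch (not from the paper; the function labels and agent capabilities below are invented for illustration): a pool of degenerate agents with partially overlapping capabilities can absorb the loss of any single agent, while a pool that is merely redundant in one function cannot.

```python
def covers(pool, demand):
    """Can the pooled capabilities of the remaining agents serve every function?"""
    available = set().union(*pool) if pool else set()
    return demand <= available

def robust_to_any_single_loss(pool, demand):
    """Check coverage after knocking out each agent in turn (local perturbation)."""
    return all(covers(pool[:i] + pool[i + 1:], demand) for i in range(len(pool)))

# Hypothetical capability sets: degenerate agents partially overlap in function,
# while the redundant pool only duplicates function "A".
degenerate = [{"A", "B"}, {"B", "C"}, {"C", "A"}]
redundant = [{"A"}, {"A"}, {"B"}, {"C"}]
demand = {"A", "B", "C"}

print(robust_to_any_single_loss(degenerate, demand))  # True: overlaps buffer any loss
print(robust_to_any_single_loss(redundant, demand))   # False: losing the B or C specialist is fatal
```

    The degenerate pool survives every single-agent knockout because each function is covered by two partially overlapping agents, which is the distributed compensation the abstract describes.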

    A Formalism for the Systematic Treatment of Rapidity Logarithms in Quantum Field Theory

    Many observables in QCD rely upon the resummation of perturbation theory to retain predictive power. Resummation follows after one factorizes the cross section into the relevant modes. The class of observables which are sensitive to soft recoil effects are particularly challenging to factorize and resum since they involve rapidity logarithms. In this paper we present a formalism which allows one to factorize and resum the perturbative series for such observables in a systematic fashion through the notion of a "rapidity renormalization group". That is, a Collins-Soper-like equation is realized as a renormalization group equation, but has a more universal applicability to observables beyond the traditional transverse momentum dependent parton distribution functions (TMDPDFs) and the Sudakov form factor. This formalism has the feature that it allows one to track the (non-standard) scheme dependence which is inherent in any scenario where one performs a resummation of rapidity divergences. We present a pedagogical introduction to the formalism by applying it to the well-known massive Sudakov form factor. The formalism is then used to study observables of current interest. A factorization theorem for the transverse momentum distribution of Higgs production is presented along with the result for the resummed cross section at NLL. Our formalism allows one to define gauge invariant TMDPDFs which are independent of both the hard scattering amplitude and the soft function, i.e. they are universal. We present details of the factorization and resummation of the jet broadening cross section including a renormalization in pT space. We furthermore show how to regulate and renormalize exclusive processes which are plagued by endpoint singularities in such a way as to allow for a consistent resummation.

    Using a logical model to predict the growth of yeast

    Background: A logical model of the known metabolic processes in S. cerevisiae was constructed from iFF708, an existing Flux Balance Analysis (FBA) model, and augmented with information from the KEGG online pathway database. The use of predicate logic as the knowledge representation for modelling enables an explicit representation of the structure of the metabolic network, and enables logical inference techniques to be used for model identification/improvement. Results: Compared to the FBA model, the logical model has information on an additional 263 putative genes and 247 additional reactions. The correctness of this model was evaluated by comparison with iND750 (an updated FBA model closely related to iFF708) by evaluating the performance of both models on predicting empirical minimal medium growth data/essential gene listings. Conclusion: ROC analysis and other statistical studies revealed that use of the simpler logical form and larger coverage results in no significant degradation of performance compared to iND750.
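    The model comparison above rests on ROC analysis. A minimal rank-based AUC computation is sketched below; the gene-essentiality labels and the two models' scores are fabricated placeholders, standing in for the empirical essential-gene listings and each model's predictions.

```python
import numpy as np

def roc_auc(labels, scores):
    """Rank-based AUC (Mann-Whitney form): the probability that a randomly
    chosen positive (essential) gene is scored above a randomly chosen
    negative one. Assumes no tied scores."""
    labels, scores = np.asarray(labels), np.asarray(scores)
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    n_pos = labels.sum()
    n_neg = len(labels) - n_pos
    return (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Fabricated toy data: 1 = empirically essential gene; each model outputs a
# predicted essentiality score per gene.
essential = [0, 0, 1, 1]
model_a = [0.1, 0.4, 0.35, 0.8]
model_b = [0.6, 0.2, 0.7, 0.9]

print(roc_auc(essential, model_a))  # 0.75
print(roc_auc(essential, model_b))  # 1.0
```

    Comparing AUCs in this way is how "no significant degradation of performance" between the logical model and iND750 would be quantified, though the paper's actual gene sets are far larger.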

    Composite Higgs Search at the LHC

    The Higgs boson production cross-sections and decay rates depend, within the Standard Model (SM), on a single unknown parameter, the Higgs mass. In composite Higgs models where the Higgs boson emerges as a pseudo-Goldstone boson from a strongly-interacting sector, additional parameters control the Higgs properties, which then deviate from the SM ones. These deviations modify the LEP and Tevatron exclusion bounds and significantly affect the searches for the Higgs boson at the LHC. In some cases, all the Higgs couplings are reduced, which results in a deterioration of the Higgs searches, but the deviations of the Higgs couplings can also allow for an enhancement of the gluon-fusion production channel, leading to higher statistical significances. The search in the H to gamma gamma channel can also be substantially improved due to an enhancement of the branching fraction for the decay of the Higgs boson into a pair of photons.

    Network analyses in systems biology: new strategies for dealing with biological complexity

    The increasing application of network models to interpret biological systems raises a number of important methodological and epistemological questions. What novel insights can network analysis provide in biology? Are network approaches an extension of, or in conflict with, mechanistic research strategies? When and how can network and mechanistic approaches interact in productive ways? In this paper we address these questions by focusing on how biological networks are represented and analyzed in a diverse class of case studies. Our examples span from the investigation of organizational properties of biological networks using tools from graph theory to the application of dynamical systems theory to understand the behavior of complex biological systems. We show how network approaches support and extend traditional mechanistic strategies but also offer novel strategies for dealing with biological complexity.

    Most Networks in Wagner's Model Are Cycling

    In this paper we study a model of gene networks introduced by Andreas Wagner in the 1990s that has been used extensively to study the evolution of mutational robustness. We investigate a range of model features and parameters and evaluate the extent to which they influence the probability that a random gene network will produce a fixed-point steady-state expression pattern. Many variants of the model are used in the literature (discrete/continuous, sparse/dense, small/large networks), and we attempt to put some order into this diversity, motivated by the fact that many properties are qualitatively the same across all of them. Our main result is that random networks in all variants give rise to cyclic behavior more often than to fixed points. Although periodic orbits seem to dominate network dynamics, they are usually considered unstable and are not allowed to survive in previous evolutionary studies. Defining stability as the probability of reaching a fixed point, we show that the stability distribution of these networks is highly robust to changes in its parameters. We also find sparser networks to be more stable, which may help to explain why they seem to be favored by evolution. We unify several disconnected previous studies of this class of models under the framework of stability, in a way that had not been systematically explored before.
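    The dominance of cycles over fixed points can be probed with a minimal simulation of the discrete Wagner-type dynamics s(t+1) = sign(W s(t)), assuming ±1 expression states and a dense Gaussian interaction matrix; the network size and trial count here are illustrative choices, not the paper's.

```python
import numpy as np

def wagner_dynamics(w, s0, max_steps=1100):
    """Iterate s(t+1) = sign(W @ s(t)) until the state repeats; classify the
    attractor as a fixed point (period 1) or a cycle. max_steps exceeds the
    2**n possible states for n = 10, so a repeat is guaranteed."""
    seen = {}
    s = s0.copy()
    for t in range(max_steps):
        key = tuple(s)
        if key in seen:
            period = t - seen[key]
            return "fixed point" if period == 1 else "cycle"
        seen[key] = t
        s = np.sign(w @ s)
        s[s == 0] = 1  # break exact ties deterministically

rng = np.random.default_rng(0)
n = 10
results = []
for _ in range(500):
    w = rng.normal(size=(n, n))       # dense Gaussian interaction matrix
    s0 = rng.choice([-1, 1], size=n)  # random initial expression pattern
    results.append(wagner_dynamics(w, s0))

frac_fixed = sum(r == "fixed point" for r in results) / len(results)
print(f"fraction of trials reaching a fixed point: {frac_fixed:.2f}")
```

    The fraction of fixed points found this way is the "stability" statistic the paper studies as a function of network size, density, and the discrete/continuous choice.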

    Stable Isotopic Evidence for Methane Seeps in Neoproterozoic Postglacial Cap Carbonates

    The Earth's most severe glaciations are thought to have occurred about 600 million years ago, in the late Neoproterozoic era. A puzzling feature of glacial deposits from this interval is that they are overlain by 1–5-m-thick 'cap carbonates' (particulate deep-water marine carbonate rocks) associated with a prominent negative carbon isotope excursion. Cap carbonates have been controversially ascribed to the aftermath of almost complete shutdown of the ocean ecosystems for millions of years during such ice ages—the 'snowball Earth' hypothesis. Conversely, it has also been suggested that these carbonate rocks were the result of destabilization of methane hydrates during deglaciation and concomitant flooding of continental shelves and interior basins. The most compelling criticism of the latter 'methane hydrate' hypothesis has been the apparent lack of extreme isotopic variation in cap carbonates inferred locally to be associated with methane seeps. Here we report carbon isotopic and petrographic data from a Neoproterozoic postglacial cap carbonate in south China that provide direct evidence for methane-influenced processes during deglaciation. This evidence lends strong support to the hypothesis that methane hydrate destabilization contributed to the enigmatic cap carbonate deposition and strongly negative carbon isotopic anomalies following Neoproterozoic ice ages. This explanation requires less extreme environmental disturbance than that implied by the snowball Earth hypothesis.

    FinTech revolution: the impact of management information systems upon relative firm value and risk

    The FinTech or 'financial technology' revolution has been gaining increasing interest as technologies fundamentally change the business of financial services. Consequently, financial technology plays an increasingly important role in providing relative performance growth to firms. It is also well known that such relative performance can be observed through pairs-trading investment; pairs trading therefore has implications for understanding financial technology performance, yet the relationships between relative firm value and financial technology are not well understood. In this paper we investigate the impact of financial technology upon relative firm value in the banking sector. Firstly, using pairs-trade data we show that financial technologies reveal differences in the relative operational performance of firms, providing insight into the value of financial technologies. Secondly, we find that the contribution of financial technologies to relative firm value growth depends on the specific business characteristics of the technology, such as the business application and activity type. Finally, we show that financial technologies affect the operational risk of firms, so firms need to take into account both the value and risk benefits when implementing new technological innovations. This paper will be of interest to academics and industry professionals.
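    The pairs-trading observation of relative performance can be sketched as a z-score on the log-price spread of two paired firms; this is the standard textbook construction, not necessarily the paper's specific methodology, and the price series below are fabricated.

```python
import numpy as np

def pairs_signal(pa, pb, window=20, entry=2.0):
    """Z-score of the log-price spread over a rolling window; a deviation
    beyond the entry threshold flags a relative-value (pairs) trade.

    pa, pb: aligned price series of the two paired firms (hypothetical data).
    """
    spread = np.log(pa) - np.log(pb)
    mu = spread[-window:].mean()
    sd = spread[-window:].std(ddof=1)
    if sd == 0:
        return "no trade"  # flat spread: no dispersion to trade against
    z = (spread[-1] - mu) / sd
    if z > entry:
        return "short A / long B"  # A rich relative to B; bet on reversion
    if z < -entry:
        return "long A / short B"
    return "no trade"

# Fabricated example: A's price jumps away from its historically flat pair.
pa = np.full(21, 100.0)
pb = np.full(21, 100.0)
pa[-1] = 110.0
print(pairs_signal(pa, pb))  # short A / long B
```

    A persistent drift in such a spread, rather than mean reversion, is what the paper reads as a relative operational-performance difference attributable to one firm's technology.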

    Methicillin-resistant Staphylococcus aureus (MRSA) in rehabilitation and chronic-care facilities: what is the best strategy?

    BACKGROUND: The risk associated with methicillin-resistant Staphylococcus aureus (MRSA) has been decreasing for several years in intensive care departments, but is now increasing in rehabilitation and chronic-care facilities (R-CCF). The aim of this study was to use published data and our own experience to discuss the roles of screening for MRSA carriers, the type of isolation to be implemented and the efficiency of chemical decolonization. DISCUSSION: Screening identifies over 90% of patients colonised with MRSA upon admission to R-CCF, versus only 50% for intensive care units. Only totally dependent patients acquire MRSA. Thus, strict geographical isolation, as opposed to "social reinsertion", is clearly of no value. However, this should not lead to the abandoning of isolation, which remains essential during the administration of care. The use of chemicals to decolonize the nose and healthy skin appears to be of some value, and the application of this procedure could make technical isolation unnecessary in a non-negligible proportion of cases. SUMMARY: Given the increase in morbidity associated with MRSA observed in numerous hospitals, the emergence of a community-acquired disease associated with these strains and the evolution of glycopeptide-resistant strains, the voluntary application of a strategy combining screening, technical isolation and chemical decolonization in R-CCF appears to be an urgent priority.