368 research outputs found

    SBML models and MathSBML

    Get PDF
    MathSBML is an open-source, freely downloadable Mathematica package that facilitates working with Systems Biology Markup Language (SBML) models. SBML is a tool-neutral, computer-readable format for representing models of biochemical reaction networks; it is applicable to metabolic networks, cell-signaling pathways, genomic regulatory networks, and other modeling problems in systems biology, and it is widely supported by the systems biology community. SBML is based on XML, a standard medium for representing and transporting data that is widely supported on the Internet as well as in computational biology and bioinformatics. Because SBML is tool-independent, it enables model transportability, reuse, publication, and survival. In addition to MathSBML, a number of other tools that support SBML model examination and manipulation are provided on the sbml.org website, including libSBML, a C/C++ library for reading SBML models; an SBML Toolbox for MATLAB; file conversion programs; an SBML model validator and visualizer; and SBML specifications and schemas. MathSBML enables SBML file import to and export from Mathematica, as well as providing an API for model manipulation and simulation.
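
    MathSBML itself runs inside Mathematica, but the same import-and-inspect workflow can be illustrated with the Python bindings of libSBML, the companion library mentioned above. A minimal sketch, assuming a local file model.xml as a placeholder:

        # Read an SBML model and list its reactions using libSBML's
        # Python bindings (pip install python-libsbml). "model.xml"
        # is a placeholder file name.
        import libsbml

        doc = libsbml.readSBML("model.xml")      # parse the XML file
        if doc.getNumErrors() > 0:
            doc.printErrors()                    # report parse/validation problems

        model = doc.getModel()
        print("species:  ", model.getNumSpecies())
        print("reactions:", model.getNumReactions())
        for i in range(model.getNumReactions()):
            r = model.getReaction(i)
            reactants = [s.getSpecies() for s in r.getListOfReactants()]
            products = [s.getSpecies() for s in r.getListOfProducts()]
            print(r.getId(), reactants, "->", products)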

    Good Learning and Implicit Model Enumeration

    Get PDF

    High-Redshift Cosmography

    Get PDF
    We constrain the parameters describing the kinematical state of the universe using a cosmographic approach, which is fundamental in that it requires a very minimal set of assumptions (namely, specifying a metric) and does not rely on the dynamical equations for gravity. On the data side, we consider the most recent compilations of Supernovae and Gamma Ray Burst catalogues. This allows us to extend the cosmographic fit up to z = 6.6, i.e. up to redshifts at which one could start to resolve the degeneracy among competing cosmological models that is present at low z. In order to reliably control the cosmographic approach at high redshifts, we adopt the expansion in the improved parameter y = z/(1+z). This series has the great advantage of holding also for z > 1 and hence is the appropriate tool for handling data including non-nearby distance indicators. We find that Gamma Ray Bursts, probing higher redshifts than Supernovae, have constraining power and do require (and statistically allow) a cosmographic expansion at higher order than Supernovae alone. Exploiting the set of data from the Union and GRB catalogues, we show (for the first time in a purely cosmographic approach parametrized by deceleration q_0, jerk j_0, and snap s_0) a definitively negative deceleration parameter q_0 up to the 3σ confidence level. We also present forecasts for realistic data sets that are likely to be obtained in the next few years. Comment: 16 pages, 6 figures, 3 tables. Improved version matching the published one; additional comments and references.
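
    For reference, the deceleration, jerk, and snap named above are the standard dimensionless derivatives of the scale factor a(t), evaluated today; these definitions and the compactified variable are textbook cosmography rather than anything specific to this paper:

        % Cosmographic parameters, with H = \dot{a}/a and all
        % quantities evaluated at the present time t_0:
        q_0 = -\left.\frac{1}{H^2}\,\frac{\ddot{a}}{a}\right|_{t_0}, \qquad
        j_0 = \left.\frac{1}{H^3}\,\frac{\dddot{a}}{a}\right|_{t_0}, \qquad
        s_0 = \left.\frac{1}{H^4}\,\frac{\ddddot{a}}{a}\right|_{t_0}
        % The improved variable maps z \in [0,\infty) into y \in [0,1),
        % which is why a Taylor series in y can stay under control
        % even for z > 1:
        y = \frac{z}{1+z}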

    Jet disc coupling in black hole binaries

    Full text link
    In the last decade, multi-wavelength observations have demonstrated the importance of jets in the energy output of accreting black hole binaries. The observed correlations between the presence of a jet and the state of the accretion flow provide important information on the coupling between accretion and ejection processes. After a brief review of the properties of black hole binaries, I illustrate the connection between accretion and ejection through two particularly interesting examples. First, an INTEGRAL observation of Cygnus X-1 during a 'mini' state transition reveals disc-jet coupling on time scales of the order of hours. Second, the black hole XTE J1118+480 shows complex correlations between the X-ray and optical emission. These correlations are interpreted in terms of coupling between disc and jet on time scales of seconds or less. These observations are discussed in the framework of current models. Comment: Invited talk at the Fifth Stromlo Symposium: Disks, Winds & Jets - from Planets to Quasars. Accepted for publication in Astrophysics & Space Science.

    Modelling spectral and timing properties of accreting black holes: the hybrid hot flow paradigm

    Full text link
    The general picture that emerged by the end of the 1990s from a large set of optical and X-ray, spectral and timing data was that the X-rays are produced in the innermost hot part of the accretion flow, while the optical/infrared (OIR) emission is mainly produced by the irradiated outer thin accretion disc. Recent multiwavelength observations of Galactic black hole transients show that the situation is not so simple. Fast variability in the OIR band, OIR excesses above the thermal emission, and a complicated interplay between the X-ray and the OIR light curves imply that the OIR emitting region is much more compact. One popular hypothesis is that the jet contributes to the OIR emission and is even responsible for the bulk of the X-rays. However, this scenario is largely ad hoc and contradicts many previously established facts. Alternatively, the hot accretion flow, known to be consistent with the X-ray spectral and timing data, is also a viable candidate to produce the OIR radiation. The hot-flow scenario naturally explains the power-law-like OIR spectra, fast OIR variability, and its complex relation to the X-rays if the hot flow contains non-thermal electrons (even in energetically negligible quantities), which are required by the presence of the MeV tail in Cyg X-1. The presence of non-thermal electrons also lowers the equilibrium electron temperature in the hot flow model to <100 keV, making it more consistent with observations. Here we argue that any viable model should simultaneously explain a large set of spectral and timing data, and we show that the hybrid (thermal/non-thermal) hot flow model satisfies most of the constraints. Comment: 26 pages, 13 figures. To be published in Space Science Reviews and as a hard cover in the Space Sciences Series of ISSI - The Physics of Accretion on to Black Holes (Springer).

    The inverse problem of determining the filtration function and permeability reduction in flow of water with particles in porous media

    Get PDF
    The original publication can be found at www.springerlink.com. Deep bed filtration of particle suspensions in porous media occurs during water injection into oil reservoirs, drilling fluid invasion of reservoir production zones, fines migration in oil fields, industrial filtering, and the transport of bacteria, viruses, or contaminants in groundwater. The basic features of the process are particle capture by the porous medium and consequent permeability reduction. Models for deep bed filtration contain two quantities that represent rock and fluid properties: the filtration function, which is the fraction of particles captured per unit particle path length, and the formation damage function, which is the ratio between reduced and initial permeabilities. These quantities cannot be measured directly in the laboratory or in the field; therefore, they must be calculated indirectly by solving inverse problems. The practical petroleum and environmental engineering purpose is to predict injectivity loss and particle penetration depth around wells. Reliable prediction requires precise knowledge of these two coefficients. In this work we determine these quantities from pressure drop and effluent concentration histories measured in one-dimensional laboratory experiments. The recovery method consists of optimizing deviation functionals in appropriate subdomains; if necessary, a Tikhonov regularization term is added to the functional. The filtration function is recovered by optimizing a non-linear functional with box constraints; this functional involves the effluent concentration history. The permeability reduction is recovered likewise, taking into account the filtration function already found, with a functional that involves the pressure drop history. In both cases, the functionals are derived from least-squares formulations of the deviation between experimental data and quantities predicted by the model. Alvarez, A. C., Hime, G., Marchesin, D., Bedrikovetski, P.
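
    As a rough illustration of the recovery step described above, the sketch below fits the parameters of a hypothetical forward model to a synthetic effluent history by box-constrained least squares with a Tikhonov term; predict_effluent, the parametrization, and the weight alpha are illustrative stand-ins, not the authors' formulation:

        # Box-constrained least squares with an optional Tikhonov term,
        # in the spirit of the recovery method described above.
        import numpy as np
        from scipy.optimize import least_squares

        def predict_effluent(params, t):
            # Hypothetical forward model standing in for the deep bed
            # filtration solver: an exponential breakthrough curve.
            lam, level = params
            return level * (1.0 - np.exp(-lam * t))

        def residuals(params, t, c_obs, alpha):
            misfit = predict_effluent(params, t) - c_obs   # data deviation
            tikhonov = np.sqrt(alpha) * params             # regularization term
            return np.concatenate([misfit, tikhonov])

        t = np.linspace(0.0, 10.0, 50)
        c_obs = 0.8 * (1.0 - np.exp(-0.5 * t))             # synthetic "data"

        fit = least_squares(residuals, x0=[1.0, 0.5],
                            bounds=([0.0, 0.0], [10.0, 2.0]),  # box constraints
                            args=(t, c_obs, 1e-4))
        print("recovered parameters:", fit.x)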

    Achieving synergy: Linking an internet-based inflammatory bowel disease cohort to a community-based inception cohort and multicentered cohort in inflammatory bowel disease

    Get PDF
    Background: Traditional cohort studies are important contributors to our understanding of inflammatory bowel diseases, but they are labor intensive and often do not focus on patient-reported outcomes. Internet-based studies provide new opportunities to study patient-reported outcomes and can be efficiently implemented and scaled. If a traditional cohort study were linked to an Internet-based study, both studies could benefit from the added synergy. Existing cohort studies provide an opportunity to develop and test processes for cohort linkage. The Crohn's and Colitis Foundation of America's (CCFA) Partners study is an Internet-based cohort of more than 14,000 participants. The Ocean State Crohn's and Colitis Area Registry (OSCCAR) is an inception cohort. The Sinai-Helmsley Alliance for Research Excellence (SHARE) is a multicentered cohort of inflammatory bowel disease patients. Both of the latter cohorts include medical record abstraction, patient surveys, and biospecimen collection. Objective: Given the complementary nature of these existing cohorts, we sought to co-recruit and link data. Methods: Eligible OSCCAR and SHARE participants were invited to join the CCFA Partners study and provide consent for data sharing between the 2 cohorts. After informed consent, participants were directed to the CCFA Partners website to complete enrollment and a baseline Web-based survey. Participants were linked across the 2 cohorts by matching email addresses. We compared demographic and clinical characteristics between OSCCAR and SHARE participants who did and did not enroll in CCFA Partners and complete the data linkage. Results: Of 408 participants in the OSCCAR cohort, 320 were eligible for participation in the CCFA Partners cohort. Of these, 243 consented to participation; however, only 44 enrolled in CCFA Partners and completed the linkage. OSCCAR participants who enrolled in CCFA Partners were better educated (17% with doctoral degrees) than those who did not (3% with doctoral degrees; P=.01). In the SHARE cohort, 436 participants enrolled in and linked to the Partners cohort. More women (60% vs 50%) linked, and those who linked were predominantly white (96%; P<.01). Crohn's disease patients who linked had lower mean scores on the Harvey-Bradshaw Index (3.6 vs 4.4; P<.01). Ulcerative colitis patients who linked had less extensive disease than those who did not link (45% vs 60%; P<.01). Conclusions: Linkage of CCFA Partners with cohorts such as OSCCAR and SHARE may be a cost-effective way to expand the infrastructure for clinical outcomes and translational research. Although linkage is feasible from a technical, legal, and regulatory perspective, participant willingness appears to be a limiting factor. Overcoming this barrier will be needed to generate meaningful sample sizes for studies of biomarkers, natural history, and clinical effectiveness using linked data.
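
    The linkage step, matching participants across cohorts by email address, amounts to a join on a normalized key. A minimal sketch with pandas; the column names and records are hypothetical, and a real linkage would also track consent flags and handle identifiers under the governing protocol:

        # Link two cohort tables on a normalized email address.
        import pandas as pd

        osccar = pd.DataFrame({"email": ["A.Smith@x.org ", "b@y.org"],
                               "hbi": [3.6, 4.4]})
        partners = pd.DataFrame({"email": ["a.smith@x.org", "c@z.org"],
                                 "survey_score": [10, 12]})

        def normalize(col):
            return col.str.strip().str.lower()   # trim whitespace, lowercase

        osccar["key"] = normalize(osccar["email"])
        partners["key"] = normalize(partners["email"])

        linked = osccar.merge(partners, on="key",
                              suffixes=("_osccar", "_partners"))
        print(len(linked), "of", len(osccar), "participants linked")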

    The Biochemical Abstract Machine BIOCHAM

    Get PDF
    http://www.springerlink.com/index/NVWWRAN9W4RUA03N In this article we present the Biochemical Abstract Machine BIOCHAM and advocate its use as a formal modeling environment for network biology. BIOCHAM provides a precise semantics to biomolecular interaction maps. Based on this formal semantics, the BIOCHAM system offers automated reasoning tools for querying the temporal properties of the system under all its possible behaviors. We present the main features of BIOCHAM, provide details on a simple example of the MAPK signaling cascade, and prove some results on the equivalence of models w.r.t. their temporal properties.
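
    In BIOCHAM's boolean semantics, a reachability query explores the on/off states of species under the reaction rules, with each reactant nondeterministically consumed or not. The sketch below reimplements that idea in plain Python (it is not BIOCHAM syntax), with a toy two-step cascade standing in for the MAPK example:

        # CTL-style reachability over a boolean abstraction of reaction
        # rules. A rule fires when all its reactants are present; its
        # products become present, and each reactant may or may not be
        # consumed, so both outcomes are explored.
        from itertools import product

        RULES = [({"RAF*", "MEK"}, {"MEK*"}),   # active RAF activates MEK
                 ({"MEK*", "ERK"}, {"ERK*"})]   # active MEK activates ERK

        def successors(state):
            for reactants, products in RULES:
                if reactants <= state:
                    for consumed in product([False, True], repeat=len(reactants)):
                        nxt = set(state) | products
                        nxt -= {r for r, c in zip(sorted(reactants), consumed) if c}
                        yield frozenset(nxt)

        def reachable(init, goal):
            seen, stack = set(), [frozenset(init)]
            while stack:
                s = stack.pop()
                if s in seen:
                    continue
                seen.add(s)
                if goal <= s:
                    return True   # goal holds in some reachable state (EF goal)
                stack.extend(successors(s))
            return False

        print(reachable({"RAF*", "MEK", "ERK"}, {"ERK*"}))   # True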

    Search for direct production of charginos and neutralinos in events with three leptons and missing transverse momentum in √s = 7 TeV pp collisions with the ATLAS detector

    Get PDF
    A search for the direct production of charginos and neutralinos in final states with three electrons or muons and missing transverse momentum is presented. The analysis is based on 4.7 fb⁻¹ of proton–proton collision data delivered by the Large Hadron Collider and recorded with the ATLAS detector. Observations are consistent with Standard Model expectations in three signal regions that are either depleted or enriched in Z-boson decays. Upper limits at 95% confidence level are set in R-parity conserving phenomenological minimal supersymmetric models and in simplified models, significantly extending previous results.

    Jet size dependence of single jet suppression in lead-lead collisions at sqrt(s_NN) = 2.76 TeV with the ATLAS detector at the LHC

    Get PDF
    Measurements of inclusive jet suppression in heavy ion collisions at the LHC provide direct sensitivity to the physics of jet quenching. In a sample of lead-lead collisions at sqrt(s_NN) = 2.76 TeV corresponding to an integrated luminosity of approximately 7 inverse microbarns, ATLAS has measured jets with a calorimeter over the pseudorapidity interval |eta| < 2.1 and over the transverse momentum range 38 < pT < 210 GeV. Jets were reconstructed using the anti-kt algorithm with values of the distance parameter, which determines the nominal jet radius, of R = 0.2, 0.3, 0.4, and 0.5. The centrality dependence of the jet yield is characterized by the jet "central-to-peripheral ratio," Rcp. Jet production is found to be suppressed by approximately a factor of two in the 10% most central collisions relative to peripheral collisions. Rcp varies smoothly with centrality as characterized by the number of participating nucleons. The observed suppression is only weakly dependent on jet radius and transverse momentum. These results provide the first direct measurement of inclusive jet suppression in heavy ion collisions and complement previous measurements of dijet transverse energy imbalance at the LHC. Comment: 15 pages plus author list (30 pages total), 8 figures, 2 tables, submitted to Physics Letters B. All figures including auxiliary figures are available at http://atlas.web.cern.ch/Atlas/GROUPS/PHYSICS/PAPERS/HION-2011-02
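
    In the usual heavy-ion convention, Rcp compares the per-event jet yield in central collisions to that in peripheral collisions, each scaled by the mean number of binary nucleon-nucleon collisions <Ncoll> for its centrality class. A small numerical illustration with invented numbers (not ATLAS data):

        # Central-to-peripheral ratio R_cp with <Ncoll> scaling.
        def r_cp(yield_cent, ncoll_cent, yield_periph, ncoll_periph):
            return (yield_cent / ncoll_cent) / (yield_periph / ncoll_periph)

        # A factor-of-two suppression in the most central collisions
        # shows up as R_cp near 0.5:
        print(r_cp(yield_cent=800.0, ncoll_cent=1500.0,
                   yield_periph=15.0, ncoll_periph=14.0))   # ~0.5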