372 research outputs found

    A model checking approach to the parameter estimation of biochemical pathways

    Model checking has historically been an important tool for verifying models of a wide variety of systems. Typically a model has to exhibit certain properties to be classed ‘acceptable’. In this work we use model checking in a new setting: parameter estimation. We characterise the desired behaviour of a model in a temporal logic property and alter the model to make it conform to the property (determined through model checking). We have implemented a computational system called MC2(GA), which pairs a model checker with a genetic algorithm. To drive parameter estimation, the fitness of a set of parameters in a model is the inverse of the distance between its actual behaviour and the desired behaviour. The model checker used is the simulation-based Monte Carlo Model Checker for Probabilistic Linear-time Temporal Logic with numerical constraints, MC2(PLTLc). Numerical constraints, as well as the overall probability of the behaviour expressed in temporal logic, are used to minimise the behavioural distance. We define the theory underlying our parameter estimation approach in both the stochastic and continuous worlds. We apply our approach to biochemical systems and present an illustrative example in which we estimate the kinetic rate constants in a continuous model of a signalling pathway.
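
    The fitness definition above (the inverse of the behavioural distance) can be sketched in a few lines. Here `simulate` and `distance_to_property` are hypothetical placeholders standing in for a model run and the model-checker-derived distance to the temporal-logic specification; they are not the actual MC2(GA) interface.

```python
# Minimal sketch of the MC2(GA) fitness idea: a parameter set scores
# higher the closer its simulated behaviour is to the desired behaviour
# expressed as a temporal-logic property. All names are illustrative.

def fitness(params, simulate, distance_to_property):
    """Return a fitness in (0, 1]; 1.0 means the property is satisfied."""
    trace = simulate(params)            # e.g. a stochastic or ODE run
    d = distance_to_property(trace)     # 0 when behaviour matches the spec
    return 1.0 / (1.0 + d)              # inverse of the behavioural distance
```

    A genetic algorithm would then select, cross over, and mutate parameter sets ranked by this score; adding 1 to the distance keeps the fitness bounded even when the property is exactly satisfied.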

    Variable length-based genetic representation to automatically evolve wrappers

    The final publication is available at Springer via http://dx.doi.org/10.1007/978-3-642-12433-4_44. Proceedings of the 8th International Conference on Practical Applications of Agents and Multiagent Systems. The Web has been the star service on the Internet; however, the outsized amount of information available and its decentralized nature make it intrinsically difficult to locate, extract and compose information. An automatic approach is required to handle this huge amount of data. In this paper we present a machine learning algorithm based on Genetic Algorithms which generates a set of complex wrappers able to extract information from the Web. The paper presents the experimental evaluation of these wrappers over a set of basic data sets. This work has been partially supported by the Spanish Ministry of Science and Innovation under the Castilla-La Mancha project PEII09-0266-6640, COMPUBIODIVE (TIN2007-65989), and by V-LeaF (TIN2008-02729-E/TIN).

    Evaluation of a general practice based Hepatitis C virus screening intervention

    In 2003 an estimated 37,500 of Scotland's population were chronically infected with HCV; 44% were undiagnosed former injecting drug users (IDUs), a priority group for antiviral therapy. Aims: to evaluate a hepatitis C virus (HCV) screening intervention. Outcome measures among two similar general practice populations in an area of high HCV and drug use prevalence, one of which was exposed to an HCV screening intervention, were compared. Thirty- to fifty-four-year-old attendees of the intervention practice were opportunistically offered testing and counselling, where clinically appropriate (November 2003 - April 2004). Outcomes: HCV test uptake, case detection, referral and treatment administration rates. Of 584 eligible attendees in the intervention practice, 421 (72%) were offered testing and 117 (28%) accepted; no testing was undertaken in the comparison practice. Prevalences of HCV antibody were 13% (15/117), 75% (3/4) and 91% (10/11) among all tested persons, current IDUs and former IDUs, respectively. For 4/15 (27%), evidence of binge drinking following receipt of their positive result was available. Of the 11 referred to specialist care because they were HCV RNA positive, nine attended at least one appointment. Two received treatment; one had achieved a sustained viral response as of February 2008. While non-targeted HCV screening in the general practice setting can detect infected former IDUs, the low diagnostic yield among non-IDUs limited the effectiveness of the intervention. A more targeted approach to identifying former IDUs is recommended. Additionally, the low uptake of treatment among chronically infected persons four years after diagnosis demonstrates the difficulties in clinically managing such individuals. Strategies to improve treatment uptake, including support for those with a history of problem alcohol use, are required.

    Flow Faster: Efficient Decision Algorithms for Probabilistic Simulations

    Strong and weak simulation relations have been proposed for Markov chains, while strong simulation and strong probabilistic simulation relations have been proposed for probabilistic automata. However, the known decision algorithms for strong and weak simulation over Markov chains, and for strong simulation over probabilistic automata, are not efficient, which makes it as yet unclear whether they can be used as effectively as their non-probabilistic counterparts. This paper presents drastically improved algorithms to decide whether one (discrete- or continuous-time) Markov chain strongly or weakly simulates another, or whether a probabilistic automaton strongly simulates another. The key innovation is the use of parametric maximum flow techniques to amortize computations. We also present a novel algorithm for deciding strong probabilistic simulation preorders on probabilistic automata, which has polynomial complexity via a reduction to an LP problem. When extending the algorithms for probabilistic automata to their continuous-time counterparts, we retain the same complexity for both strong and strong probabilistic simulations. Comment: LMC
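
    The one-step check underlying strong simulation can be phrased as a maximum-flow feasibility problem: a probability distribution mu can be matched against nu iff a bipartite network built from the candidate relation carries a flow equal to the total probability mass. The sketch below is a minimal, stdlib-only illustration of that flow formulation using plain Edmonds-Karp; the paper's actual contribution, parametric maximum flow to amortize repeated checks across the fixed-point iteration, is deliberately not reproduced here, and all names are hypothetical.

```python
from collections import deque

def max_flow(cap, s, t):
    """Edmonds-Karp on a dict-of-dicts capacity graph (modified in place)."""
    flow = 0.0
    while True:
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:
            u = queue.popleft()
            for v, c in cap.get(u, {}).items():
                if c > 1e-12 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            return flow
        # Trace the augmenting path back from t and find its bottleneck.
        path = []
        v = t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        b = min(cap[u][v] for u, v in path)
        for u, v in path:
            cap[u][v] -= b
            cap.setdefault(v, {}).setdefault(u, 0.0)
            cap[v][u] += b                 # residual edge for later paths
        flow += b

def weight_function_exists(mu, nu, R):
    """One-step condition of strong simulation: can the mass of mu be
    split across nu, moving probability only along pairs in relation R?"""
    cap = {'src': {}}
    for u, p in mu.items():
        cap['src'][('L', u)] = p           # source edge carries mu(u)
        cap[('L', u)] = {}
    for v, p in nu.items():
        cap[('R', v)] = {'snk': p}         # sink edge carries nu(v)
    for u, v in R:
        if u in mu and v in nu:
            cap[('L', u)][('R', v)] = 1.0  # relation edge; 1.0 = total mass
    return abs(max_flow(cap, 'src', 'snk') - 1.0) < 1e-9
```

    Deciding the full simulation preorder iterates this feasibility test while pruning the relation; since only the source and sink capacities change between iterations, the parametric maximum-flow machinery of the paper can reuse work across tests instead of solving each flow from scratch as this toy version does.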

    Benchmark Parameters for CMB Polarization Experiments

    The recently detected polarization of the cosmic microwave background (CMB) holds the potential for revealing the physics of inflation and gravitationally mapping the large-scale structure of the universe, if so-called B-mode signals below 10^{-7}, or tenths of a uK, can be reliably detected. We provide a language for describing systematic effects which distort the observed CMB temperature and polarization fields and so contaminate the B-modes. We identify 7 types of effects, described by 11 distortion fields, and show their association with known instrumental systematics such as common-mode and differential gain fluctuations, line cross-coupling, pointing errors, and differential polarized beam effects. Because of aliasing from the small-scale structure in the CMB, even uncorrelated fluctuations in these effects can affect the large-scale B-modes relevant to gravitational waves. Many of these problems are greatly reduced by having an instrumental beam that resolves the primary anisotropies (FWHM << 10'). To reach the ultimate goal of an inflationary energy scale of 3 \times 10^{15} GeV, polarization distortion fluctuations must be controlled at the 10^{-2}-10^{-3} level and temperature leakage at the 10^{-4}-10^{-3} level, depending on the effect. For example, pointing errors must be controlled to 1.5'' rms for arcminute-scale beams, or to a percent of the Gaussian beam width for larger beams; low-spatial-frequency differential gain fluctuations or line cross-coupling must be eliminated at the level of 10^{-4} rms. Comment: 11 pages, 5 figures, submitted to PR

    How spiking neurons give rise to a temporal-feature map

    A temporal-feature map is a topographic neuronal representation of temporal attributes of phenomena or objects that occur in the outside world. We explain the evolution of such maps by means of a spike-based Hebbian learning rule in conjunction with a presynaptically unspecific contribution: if a synapse changes, then all other synapses connected to the same axon also change by a small fraction. The learning equation is solved for the case of an array of Poisson neurons. We discuss the evolution of a temporal-feature map and the synchronization of the single cells’ synaptic structures as a function of the strength of presynaptic unspecific learning. We also give an upper bound for the magnitude of the presynaptic interaction by estimating its impact on the noise level of synaptic growth. Finally, we compare the results with those obtained from a learning equation for nonlinear neurons and show that synaptic structure formation may profit from the nonlinearity.
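
    The presynaptically unspecific contribution described above can be caricatured in a few lines: each axon's synapses receive, on top of their own Hebbian change, a small fraction of the total change on that axon. This is a rate-based toy under illustrative parameters (`eta`, `eps`), not the paper's spike-based learning equation for Poisson neurons.

```python
import numpy as np

def hebbian_step(W, pre, post, eta=0.01, eps=0.1):
    """One update of a Hebbian rule with a presynaptically unspecific term.

    W[i, j] is the weight from presynaptic axon j to postsynaptic cell i;
    pre and post are activity vectors. Illustrative caricature only.
    """
    dW = eta * np.outer(post, pre)                 # specific Hebbian change
    spread = eps * dW.sum(axis=0, keepdims=True)   # total change per axon j
    return W + dW + spread                         # every synapse of axon j
                                                   # shares a fraction eps
```

    The `spread` term is what couples synapses on the same axon: active axons pull all of their synapses up together, which is the mechanism the abstract credits for synchronizing the single cells' synaptic structures.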

    On derivation of Euler-Lagrange Equations for incompressible energy-minimizers

    We prove that any distribution $q$ satisfying the equation $\nabla q=\operatorname{div}\mathbf{f}$ for some tensor $\mathbf{f}=(f^i_j)$, $f^i_j\in h^r(U)$ ($1\leq r<\infty$), the local Hardy space, is in $h^r$ and is locally represented by the sum of singular integrals of the $f^i_j$ with Calder\'on-Zygmund kernel. As a consequence, we prove the existence and the local representation of the hydrostatic pressure $p$ (modulo a constant) associated with an incompressible elastic energy-minimizing deformation $\mathbf{u}$ satisfying $|\nabla\mathbf{u}|^2,\,|\operatorname{cof}\nabla\mathbf{u}|^2\in h^1$. We also derive the system of Euler-Lagrange equations for incompressible local minimizers $\mathbf{u}$ that are in the space $K^{1,3}_{\mathrm{loc}}$, partially resolving a long-standing problem. For H\"older continuous pressure $p$, we obtain partial regularity of area-preserving minimizers. Comment: 23 pages

    Maternal distress and perceptions of infant development following extracorporeal membrane oxygenation and conventional ventilation for persistent pulmonary hypertension

    Neurodevelopmental outcome and concurrent maternal distress were examined for infants who suffered persistent pulmonary hypertension at birth and were treated with either extracorporeal membrane oxygenation (ECMO) (n = 19) or conventional ventilation (CV) (n = 15). Mothers were asked to complete inventories assessing their infant's (mean age 8.74 months) developmental growth as well as their own psychological health. Relevant sociodemographic and treatment parameters were also entered into the analysis. The results indicated that ECMO and CV infants did not differ on developmental indices, and impairment rates were 15–23% respectively, similar to previous reports. In addition, ECMO and CV mothers did not differ in their reports of psychological distress. Correlational analyses revealed that length of treatment for ECMO but not CV infants significantly predicted developmental delay and maternal distress. For CV mothers, maternal distress was associated with the perception of delayed language. The results are discussed in terms of the limited morbidity associated with ECMO and CV interventions and the possible role of a ‘vulnerable child syndrome’ in understanding the maternal-infant relationship following ECMO therapy. Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/73367/1/j.1365-2214.1995.tb00410.x.pd

    Search for the standard model Higgs boson decaying into two photons in pp collisions at sqrt(s)=7 TeV

    A search for a Higgs boson decaying into two photons is described. The analysis is performed using a dataset recorded by the CMS experiment at the LHC from pp collisions at a centre-of-mass energy of 7 TeV, which corresponds to an integrated luminosity of 4.8 inverse femtobarns. Limits are set on the cross section of the standard model Higgs boson decaying to two photons. The expected exclusion limit at 95% confidence level is between 1.4 and 2.4 times the standard model cross section in the mass range between 110 and 150 GeV. The analysis of the data excludes, at 95% confidence level, the standard model Higgs boson decaying into two photons in the mass range 128 to 132 GeV. The largest excess of events above the expected standard model background is observed for a Higgs boson mass hypothesis of 124 GeV with a local significance of 3.1 sigma. The global significance of observing an excess with a local significance greater than 3.1 sigma anywhere in the search range 110-150 GeV is estimated to be 1.8 sigma. More data are required to ascertain the origin of this excess. Comment: Submitted to Physics Letters
