
    Modelling thirty-day mortality in the acute respiratory distress syndrome (ARDS) in an adult ICU

    Publisher's copy made available with the permission of the publisher © Australian Society of Anaesthetists.
    Variables predicting thirty-day outcome from acute respiratory distress syndrome (ARDS) were analysed using Cox regression structured for time-varying covariates. Over a three-year period, 1996-1998, consecutive patients with ARDS (bilateral chest X-ray opacities, PaO₂/FiO₂ ratio <200 and an acute precipitating event) were identified using a prospective computerized database in a university teaching hospital ICU. The cohort of 106 mechanically ventilated patients had a mean (SD) age of 63.5 (15.5) years and was 37% female. Primary lung injury occurred in 45% and 24% were postoperative. ICU-admission-day APACHE II score was 25 (8); ARDS onset time from ICU admission was 1 day (median; range 0-16) and thirty-day mortality was 41% (95% CI: 33%-51%). At ARDS onset, PaO₂/FiO₂ ratio was 92 (31), 81% had four-quadrant chest X-ray opacification and lung injury score was 2.75 (0.45). Average mechanical ventilator tidal volume was 10.3 ml/kg predicted weight. Cox model mortality predictors (hazard ratio, 95% CI) were: APACHE II score, 1.15 (1.09-1.21); ARDS lag time (days), 0.72 (0.58-0.89); direct versus indirect injury, 2.89 (1.45-5.76); PaO₂/FiO₂ ratio, 0.98 (0.97-0.99); operative versus non-operative category, 0.24 (0.09-0.63). Time-varying effects were evident for PaO₂/FiO₂ ratio, operative versus non-operative category, and ventilator tidal volume assessed as a categorical predictor with a cut-point of 8 ml/kg predicted weight (mean tidal volumes, 7.1 (1.9) vs 10.7 (1.6) ml/kg predicted weight). Thirty-day survival was improved for patients ventilated with lower tidal volumes. Survival predictors in ARDS were multifactorial, related to patient-injury-time interaction and the level of mechanical ventilator tidal volume.
    J. L. Moran, P. J. Solomon, V. Fox, M. Salagaras, P. J. Williams, K. Quinlan, A. D. Bersten
    http://www.aaic.net.au/Article.asp?D=200332
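Cox hazard ratios act multiplicatively per unit of each predictor; a small arithmetic sketch (pure Python, using only figures quoted in the abstract) shows how a per-unit ratio compounds over a larger change and how it relates to the underlying log-hazard coefficient:

```python
import math

# Per-unit hazard ratios reported in the Cox model above
hr_apache = 1.15   # per APACHE II point
hr_pf = 0.98       # per unit of PaO2/FiO2 ratio

# Multiplicative effect of a hypothetical 5-point rise in APACHE II score
effect_apache = hr_apache ** 5

# The equivalent log-hazard coefficient: beta = ln(HR)
beta_apache = math.log(hr_apache)

print(round(effect_apache, 2))  # 2.01: hazard roughly doubles
print(round(beta_apache, 3))    # 0.14
```

The 5-point increment is illustrative only; the model treats the ratio as per-unit, so any increment compounds the same way.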

    History of clinical transplantation

    How transplantation came to be a clinical discipline can be pieced together by perusing two volumes of reminiscences collected by Paul I. Terasaki in 1991-1992 from many of the persons who were directly involved. One volume was devoted to the discovery of the major histocompatibility complex (MHC), with particular reference to the human leukocyte antigens (HLAs) that are widely used today for tissue matching [1]. The other focused on milestones in the development of clinical transplantation [2]. All the contributions described in both volumes can be traced back in one way or another to the demonstration in the mid-1940s by Peter Brian Medawar that the rejection of allografts is an immunological phenomenon [3,4].
    © 2008 Springer New York

    Passive Q-switching and mode-locking for the generation of nanosecond to femtosecond pulses


    Workflow activity monitoring using dynamics of pair-wise qualitative spatial relations

    We present a method for real-time monitoring of workflows in a constrained environment. The monitoring system should not only recognise the current step but also provide instructions about the possible next steps in an ongoing workflow. In this paper, we address this issue with a robust approach (HMM-pLSA) that relies on a Hidden Markov Model (HMM) and a generative model, probabilistic Latent Semantic Analysis (pLSA). The proposed method exploits the dynamics of the qualitative spatial relation between pairs of objects involved in a workflow. The novel view-invariant relational feature is based on distance and its rate of change in 3D space. The multiple pair-wise relational features are represented in a multi-dimensional relational state space using an HMM. The workflow monitoring task is inferred from the relational state space using pLSA on datasets consisting of workflow activities such as 'hammering nails' and 'driving screws'. The proposed approach is evaluated in both 'off-line' (complete observation) and 'on-line' (partial observation) settings. The evaluation of the novel approach demonstrates the robustness of the technique in overcoming noise arising from object tracking and occlusions.
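The view-invariant relational feature described above, pair-wise 3D distance and its rate of change, can be sketched in a few lines; the object names and tracks below are invented for illustration:

```python
import math

def distance(p, q):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def relational_features(track_a, track_b, dt=1.0):
    """Per-frame (distance, rate-of-change) pairs for two object tracks.

    track_a, track_b: lists of (x, y, z) positions, one per frame.
    The rate of change is 0 for the first frame (no previous sample).
    """
    dists = [distance(p, q) for p, q in zip(track_a, track_b)]
    rates = [0.0] + [(d1 - d0) / dt for d0, d1 in zip(dists, dists[1:])]
    return list(zip(dists, rates))

# Hypothetical example: a hammer approaching a nail along the x-axis
hammer = [(3.0, 0.0, 0.0), (2.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
nail = [(0.0, 0.0, 0.0)] * 3
feats = relational_features(hammer, nail)
print(feats)  # distance shrinks 3 -> 2 -> 1; rate of change is -1 per frame
```

Because the feature uses only inter-object distance, it is unchanged by rotating or translating the camera, which is what makes it view-invariant.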

    Blind separation of sparse sources using Jeffreys inverse prior and the EM algorithm

    In this paper we study the properties of the Jeffreys inverse prior for blind separation of sparse sources. This very sparse prior was previously used for wavelet-based image denoising. In this paper we consider separation of 3×3 and 2×3 noisy mixtures of audio signals, decomposed on an MDCT basis. The hierarchical formulation of the inverse prior allows for EM-based computation of MAP estimates. This procedure proves fast compared to a standard, more complex Markov chain Monte Carlo method using the flexible Student t prior, with competitive results obtained. Blind Source Separation (BSS) consists of estimating n signals (the sources) from the sole observation of m mixtures of them (the observations). While many efficient approaches exist for the (over)determined (m ≥ n), non-noisy, linear instantaneous case, in particular within the field of Independent Component Analysis, the general linear instantaneous case, with mixtures possibly noisy and/or underdetermined (m < n), remains a very challenging problem.
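The linear instantaneous mixing model underlying BSS can be written x(t) = A s(t) + n(t); a minimal sketch, with a hypothetical 2×3 mixing matrix for the underdetermined (m < n) case discussed above:

```python
import random

def mix(A, sources, noise_std=0.0, rng=None):
    """Linear instantaneous mixture: x(t) = A s(t) + n(t).

    A: m x n mixing matrix (list of rows); sources: n signals of equal length.
    Returns the m observed mixtures, with optional additive Gaussian noise.
    """
    rng = rng or random.Random(0)
    T = len(sources[0])
    mixtures = []
    for row in A:
        obs = [sum(a * s[t] for a, s in zip(row, sources))
               + rng.gauss(0.0, noise_std)
               for t in range(T)]
        mixtures.append(obs)
    return mixtures

# Underdetermined case: m = 2 observations of n = 3 sources (m < n).
# Matrix entries and source samples are invented for illustration.
A = [[1.0, 0.5, 0.2],
     [0.3, 1.0, 0.7]]
s = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
x = mix(A, s)  # noiseless: e.g. x[0][0] = 1*1 + 0.5*0 + 0.2*1 = 1.2
```

With m < n the matrix A is not invertible, which is why a sparsity prior on the MDCT coefficients of the sources is needed to make the separation well-posed.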

    Agent-Based Parsimonious Decision Support Paradigm Employing Bayesian Belief Networks

    This paper outlines the application of Bayesian technologies to CSF (Critical Success Factor) assessment for parsimonious military decision making using an agent-based decision support system. The research referred to in this paper is part of a funded project concerned with Smart Decision Support Systems (SDSS) within the General Dynamics led Data and Information Fusion Defence Technology Centre Consortium in the UK. An important factor for successful military missions is information superiority (IS). However, IS is not solely about minimising information-related needs to avoid information overload and reduce bandwidth. It is concerned with creating information-related capabilities that are aligned with achieving operational effects and raising operational tempo. Moreover, good military decision making, agent-based or otherwise, should take into account the uncertainty inherent in operational situations. While efficient information fusion may be achieved through the deployment of CSFs, Bayesian Belief Networks (BBNs) are employed to model uncertainty. This paper illustrates the application of CSF-enabled BBN technology through an agent-based paradigm for assessing the likelihood of success of military missions. BBNs are composed of two parts, the qualitative and the quantitative. The former models the dependencies between the various random variables; the latter encodes the prior domain knowledge embedded in the network in the form of conditional probability tables (CPTs). Modelling prior knowledge in a BBN is a complex and time-consuming task, and sometimes intractable as the number of nodes and states of the network increases. This paper describes a method that enables the automated configuration of conditional probability tables from hard data generated from simulations of military operational scenarios using a computer generated forces (CGF) synthetic environment.
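How CPT entries (the quantitative part) combine over the network structure (the qualitative part) can be illustrated by enumeration on a toy two-node network; the node names and probabilities below are invented for illustration, not taken from the project:

```python
# Hypothetical two-node BBN: GoodIntel -> MissionSuccess.
# Quantitative part: the prior and the CPT.
p_intel = 0.8                              # P(GoodIntel = true)
p_success_given = {True: 0.9, False: 0.4}  # CPT: P(Success | GoodIntel)

# Marginal likelihood of success, by enumerating the parent's states:
# P(S) = P(S|I) P(I) + P(S|~I) P(~I)
p_success = (p_intel * p_success_given[True]
             + (1 - p_intel) * p_success_given[False])
print(round(p_success, 2))  # 0.8
```

A real mission-assessment network has many more nodes, which is exactly why the CPT tables grow intractable and the paper proposes learning them automatically from simulation data.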

    Evolution of Digital Filters Using a Gate Array Model

    The traditional paradigm for digital filter design is based on the concept of a linear difference equation, with the output response being a weighted sum of signal samples with usually floating-point coefficients. Unfortunately such a model is necessarily expensive in terms of hardware, as it requires many large-bit additions and multiplications. In this paper it is shown how it is possible to evolve a small rectangular array of logic gates to perform low-pass FIR filtering. The circuit is evolved by assessing its response to digitised pure sine waves. The evolved circuit is demonstrated to possess nearly linear properties, which means that it is capable of filtering composite signals which it has never seen before.

    1 Introduction

    The difference equation is a fundamental concept employed in the construction and analysis of digital filters [8]. Formally this is represented in the following way. The output of the filter at time n, y(n), may be a function of N samples of the signal x(n-i..
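The FIR difference equation referred to above, y(n) = Σᵢ bᵢ·x(n−i), can be sketched directly; the 4-tap moving-average coefficients are a standard illustrative low-pass choice, not the paper's evolved gate-array circuit:

```python
def fir_filter(x, b):
    """FIR difference equation: y(n) = sum_i b[i] * x(n - i).

    Samples before the start of the signal are taken as zero.
    """
    y = []
    for n in range(len(x)):
        acc = 0.0
        for i, coeff in enumerate(b):
            if n - i >= 0:
                acc += coeff * x[n - i]
        y.append(acc)
    return y

# 4-tap moving average: a simple low-pass response
b = [0.25, 0.25, 0.25, 0.25]
step = [1.0] * 8
print(fir_filter(step, b))
# ramps 0.25, 0.5, 0.75, then settles at 1.0 once the filter window is full
```

Each output sample costs N multiplications and additions, which is the hardware expense the paper's gate-array approach is designed to avoid.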