
    Modeling and designing control chart for monitoring time-between events data

    Ph.D. (Doctor of Philosophy)

    The Secondary Use of Longitudinal Critical Care Data

    Aims: To examine the strengths and limitations of a novel United Kingdom (UK) critical care data resource that repurposes routinely collected physiological data for research. Exemplar clinical research studies will be developed to explore the unique longitudinal nature of the resource.
    Objectives:
    - To evaluate the suitability of the National Institute for Health Research (NIHR) Critical Care theme of the Health Informatics Collaborative (CC-HIC) data model as a representation of the Electronic Health Record (EHR) for secondary research use.
    - To conduct a data quality evaluation of the data stored within the CC-HIC research database.
    - To use the CC-HIC research database to conduct two clinical research studies that make use of its longitudinal data: the association between cumulative exposure to excess oxygen and outcomes in the critically ill, and the association between different morphologies of longitudinal physiology (in particular organ dysfunction) and outcomes in sepsis.
    The CC-HIC: The EHR is now routinely used for the delivery of patient care throughout the UK. This presents the opportunity to learn from a large volume of routinely collected data. The CC-HIC data model represents 255 distinct clinical concepts, including demographics, outcomes and granular longitudinal physiology, and is used to harmonise the EHR data of 12 contributing Intensive Care Units (ICUs). This thesis evaluates the suitability of the CC-HIC data model in this role and the quality of the data within it. While it represents an important first step in this field, the CC-HIC data model lacks the normalisation and semantic expressivity needed to excel in this role. The quality of the CC-HIC research database varied between contributing sites: high levels of missing data, missing metadata, non-standardised units and temporal drop-out of submitted data are among the most challenging features to tackle.
    It is the principal finding of this thesis that the CC-HIC should transition towards implementing internationally agreed standards for interoperability.
    Exemplar Clinical Studies: Two exemplar studies are presented, each designed to make use of the longitudinal data made available by the CC-HIC and to address domains that are both contemporary and of importance to the critical care community.
    Exposure to Excess Oxygen: Longitudinal data from the CC-HIC cohort were used to explore the association between cumulative exposure to excess oxygen and outcomes in the critically ill. A small (likely less than 1% absolute risk reduction) dose-independent association was found between exposure to excess oxygen and mortality. The lack of dose-dependency challenges a causal interpretation of these findings.
    Physiological Morphologies in Sepsis: The joint modelling paradigm was applied to explore the different longitudinal profiles of organ failure in sepsis, while accounting for informative censoring from patient death. The rate of change of organ failure was found to play a more significant role in outcomes than the absolute value of organ failure at a given moment. This has important implications for how the critical care community views the evolution of physiology in sepsis.
    DECOVID: The Decoding COVID-19 (DECOVID) project is presented as future work. DECOVID is a collaborative data-sharing project that pools clinical data from two large NHS trusts in England. Many of the lessons learnt from the prior work with the CC-HIC fed into the development of the DECOVID data model and its quality evaluation
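A minimal sketch of the kind of longitudinal feature the sepsis study above contrasts with point-in-time severity: a per-patient rate of change of an organ-failure score. The data, field names and use of a simple least-squares slope are all invented for illustration; the thesis itself uses a joint modelling framework, not this simplification.

```python
import numpy as np

# Hypothetical illustration: per-patient rate of change of an organ-failure
# score (e.g. SOFA) over time. Patients and values are invented.
patients = {
    "p01": {"hours": [0, 24, 48, 72], "sofa": [4, 6, 9, 11]},   # worsening
    "p02": {"hours": [0, 24, 48, 72], "sofa": [8, 7, 7, 6]},    # improving
}

def sofa_slope(hours, sofa):
    """Least-squares slope of the score, converted to points per day."""
    slope_per_hour = np.polyfit(hours, sofa, deg=1)[0]
    return slope_per_hour * 24

for pid, rec in patients.items():
    print(pid, round(sofa_slope(rec["hours"], rec["sofa"]), 2))
```

Under the thesis's finding, a trajectory like p01's (rising at 2.4 points/day) would carry more prognostic weight than the same patient's absolute score at any single moment.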

    Hazard rate models for early warranty issue detection using upstream supply chain information

    This research presents a statistical methodology for constructing an early automotive warranty issue detection model based on upstream supply chain information. This contrasts with extant methods, which are mostly reactive and rely only on data available from the OEMs (original equipment manufacturers). For upstream supply chain information with a direct warranty claims history, the research proposes hazard rate models that link the upstream information as explanatory covariates for early detection of warranty issues. For upstream supply chain information without a direct warranty claims history, we introduce Bayesian hazard rate models to account for uncertainty in the explanatory covariates. In doing so, the methodology improves both the accuracy of warranty issue detection and the lead time for detection. The proposed methodology is illustrated and validated using real-world data from a leading global Tier-one automotive supplier
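The idea of linking upstream information as a covariate of the claim hazard can be sketched with a Weibull proportional-hazards form. This is a generic textbook construction, not the paper's actual model; the parameter values and the "supplier defect rate" covariate are hypothetical.

```python
import math

# Hedged sketch: Weibull proportional-hazards model in which an upstream
# supply-chain indicator (a hypothetical supplier defect rate x) scales the
# baseline warranty-claim hazard: h(t | x) = h0(t) * exp(beta * x).
# shape/scale/beta values are illustrative only.
def hazard(t, shape=1.5, scale=24.0, beta=2.0, defect_rate=0.0):
    """Claim hazard at t months in service given an upstream defect rate."""
    h0 = (shape / scale) * (t / scale) ** (shape - 1)  # Weibull baseline
    return h0 * math.exp(beta * defect_rate)

# A higher upstream defect rate raises the claim hazard at every age,
# which is what lets upstream data flag an issue before claims accumulate:
print(hazard(12, defect_rate=0.0))
print(hazard(12, defect_rate=0.1))
```

In the Bayesian variant described above, one would additionally place a prior on the covariate itself when no direct claims history ties it to failures.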

    4th International Probabilistic Workshop: 12th-13th October 2006, Berlin, BAM (Federal Institute for Materials Research and Testing)

    (From the preface:) Today's world is shaped by great dynamism. A multitude of processes unfolds in parallel, partly connected in invisible ways. Take globalisation, for example: here we observe exponential growth of international connections, from the level of individual people up to the level of cultures. Such connections lead us to the notion of complexity, which is often understood as the product of the number of elements in a system and the extent of the connections within it. In other words, the world is becoming more complex as connections increase. Complexity, in turn, is a term for something not fully understood, partly uncontrollable, something indeterminate: exactly as with a human being. A human grows from a single cell into a person whose behaviour we can hardly predict in detail; after all, the brain consists of some 10^11 elements (cells). If these dynamic social processes lead to greater complexity, we must also expect greater indetermination; one can only hope that this indetermination does not affect the foundations of existence. In technology, by contrast with social uncertainty, the aim is to capture uncertainties and deal with them deliberately. This holds across fields, whether in natural-hazard management, in the construction and operation of nuclear power plants, in civil engineering or in shipping. However different the disciplines taking part in this symposium may seem, they share one insight: the responsible use of technology requires that indetermination and uncertainty be taken into account. The social sciences have not yet reached this point. It is the organisers' wish that in a few years not only civil engineers, mechanical engineers, mathematicians and shipbuilders will take part in such a probabilistic symposium, but also sociologists, managers and even politicians; in that respect the symposium still has great room to grow.
    Indetermination does not have to be negative: it can also be seen as an opportunity.
    NOTE: The full-text document consists of individual contributions with separate pagination

    A systematic study on time between events control charts

    Ph.D. (Doctor of Philosophy)

    A Quality Systems Economic-Risk Design Theoretical Framework

    Quality systems, including control chart theory and sampling plans, have become essential tools for developing business processes. Since 1928, research has been conducted on economic-risk designs for specific types of control charts or sampling plans. However, no theoretical or applied research has attempted to combine these related theories into a synthesised theoretical framework of quality systems economic-risk design. This research proposes to develop such a theoretical framework through a qualitative research synthesis of economic-risk design models for sampling plans and control charts. The framework will be useful in guiding future research into economic-risk quality systems design theory and application

    Application of Optimization in Production, Logistics, Inventory, Supply Chain Management and Block Chain

    Industrial development, evolving since the 18th century, is now experiencing the fourth industrial revolution, whose effects have propagated into almost every sector of industry. From inventory management to the circular economy, technology has been fruitful for industry. This book collects recent research trends, with new ideas and methodologies. Several new ideas and business strategies are developed in the areas of supply chain management, logistics, optimization, and forecasting, for the benefit of the economy, society and the environment. The proposed technologies and ideas are either novel or help modify several other new ideas. Real-life problems of different dimensions are discussed so that readers may connect with recent issues in society and industry. The collection of articles provides a glimpse into new research trends in technology, business, and the environment

    Beurteilung der Resttragfähigkeit von Bauwerken mit Hilfe der Fuzzy-Logik und Entscheidungstheorie

    Whereas the design of new structures is almost completely regulated by codes, there are no objective ways to evaluate existing facilities. Experts are often not familiar with the new tasks in system identification and try to retrieve at least some information from available documents; the compromises they make are, for many stakeholders, not satisfying. Consequently, this publication presents a more objective and more realistic method for condition assessment. The necessary basics for this task are fracture mechanics combined with computational analysis, methods and techniques for geometry recording and material investigation, ductility and energy dissipation, and risk analysis with uncertainty consideration. Present evaluation tools investigate how to analytically conceptualise a structure directly from given loads and measured response. Since defects are not necessarily visible or directly detectable, several damage indices are combined and integrated into a model of the real system. Fuzzy sets are ideally suited to representing parametric (data) uncertainty as well as system or model uncertainty, and trapezoidal membership functions can represent the condition state of structural components very well as a function of damage extent or performance. The residual load-bearing capacity can be determined by successively performing analyses in three steps. The "screening assessment" shall eliminate the large majority of structures from detailed consideration and advise on immediate precautions to save lives and high economic values. Here, the defects have to be explicitly defined and located. If this is impossible, an "approximate evaluation" should follow, describing system geometry, material properties and failure modes in detail. Here, a fault tree helps investigate defects in a systematic way, avoiding random search and the neglect of important features or damage indices.
    Knowledge of the structural system is deemed essential not only for its conceptual clarity but also for its simplicity of application; it therefore represents an important prerequisite in condition assessment, though special circumstances might require "further investigations" to consider the actual material parameters and unaccounted reserves due to spatial or other secondary contributions. Here, uncertainties with respect to geometry, material, loading or modelling should in no case be neglected, but explicitly quantified. Postulating a limited set of expected failure modes is not always sufficient, since detectable signature changes are seldom directly attributable, and every defect might, together with other unforeseen situations, become decisive. A determination of all possible scenarios would therefore be required to consider every imaginable influence. Risk is produced by a combination of various ill-defined failure modes, and due to the interaction of many variables there is no simple and reliable way to predict which failure mode is dominant. Risk evaluation therefore comprises the estimation of the prognostic factor with respect to undesirable events, component importance and the expected damage extent.
    (German abstract, translated:) While the design of structures is generally regulated by codes, there are still no objective guidelines for the condition assessment of existing buildings. Many experts are not yet familiar with the new problem (system identification from loading and the resulting structural response) and therefore content themselves with compromise solutions. For many owners this is unsatisfactory, which is why a more objective and more realistic condition assessment is presented here. Important for this are the theoretical foundations of damage analysis, methods and techniques for geometry and material investigation, ductility and energy absorption, risk analysis and the description of uncertainties.
    Since not all damage is obvious, current practice combines several condition indicators, processes the recorded data in a targeted way, and integrates them into a validated model before a final assessment. If deterministic verification methods are combined with probabilistic ones, only random errors can be minimised without difficulty; systematic errors due to inaccurate modelling or vague knowledge remain. It is therefore unavoidable that decision-makers judge subjectively on the basis of uncertain, often even contradictory, information. This work shows how, by means of a three-step assessment procedure, structural members can be classified into quality classes. Their risk of failure follows from their mean damage extent, their structural importance I (which in turn depends on their significance and on the consequences of their damage) and their prognostic factor L. The risk of failure of the overall structure is determined from the topology. If the mean damage extent cannot be fixed unambiguously, or if the material, geometry or load information is vague, a mathematical procedure based on fuzzy logic is proposed within the framework of "further investigations". Even for complex cause-effect relationships it filters out the dominant damage cause and prevents parameters afflicted with uncertainty from being mistaken for reliable absolute values. To compute the mean damage index, and from it the risk, the individual damage indices (depending on the failure mode) are assigned weighting factors according to their importance and are additionally divided by Gamma according to the type, importance and reliability of the information obtained. This constitutes a new procedure for the analysis of complex failure mechanisms that permits traceable conclusions
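The trapezoidal membership functions mentioned above can be sketched in a few lines. The state names, the 0-100 damage-extent scale and the breakpoints below are hypothetical; the thesis defines its own classes and parameters.

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: 0 outside (a, d), 1 on [b, c], linear ramps."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)   # rising ramp
    return (d - x) / (d - c)       # falling ramp

# Hypothetical condition states over a 0-100 damage-extent scale.
# Overlapping trapezoids let one observation belong partly to two states,
# which is how vague damage information is represented.
states = {
    "minor":    (0, 0, 15, 30),
    "moderate": (20, 35, 55, 70),
    "severe":   (60, 75, 100, 100),
}
damage = 40
memberships = {name: trapezoid(damage, *params) for name, params in states.items()}
print(memberships)
```

A fuzzy assessment would then weight each state's membership into a mean damage index rather than forcing a single crisp class.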

    Research Paper: Process Mining and Synthetic Health Data: Reflections and Lessons Learnt

    Analysing the treatment pathways in real-world health data can provide valuable insight for clinicians and decision-makers. However, the procedures for acquiring real-world data for research can be restrictive and time-consuming, and risk disclosing identifiable information. Synthetic data might enable representative analysis without direct access to sensitive data. In the first part of our paper, we propose an approach for grading synthetic data for process analysis based on its fidelity to relationships found in real-world data. In the second part, we apply our grading approach by assessing cancer patient pathways in a synthetic healthcare dataset (the Simulacrum, provided by the English National Cancer Registration and Analysis Service) using process mining. Visualisations of the patient pathways within the synthetic data appear plausible, showing relationships between events confirmed in the underlying non-synthetic data. Data quality issues are also present within the synthetic data, reflecting both real-world problems and artefacts of the synthetic dataset's creation. Process mining of synthetic data in healthcare is an emerging field with novel challenges. We conclude that researchers should be aware of the risks when extrapolating results produced from research on synthetic data to real-world scenarios, and should assess findings with analysts who are able to view the underlying data
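The pathway visualisations described above rest on a standard process-mining construction: the directly-follows graph over an event log. A minimal sketch with an invented toy log follows; the activity names are hypothetical and the Simulacrum's actual event types differ.

```python
from collections import Counter, defaultdict

# Toy event log of (case_id, activity, timestamp) rows - invented data.
log = [
    ("c1", "Diagnosis", 1), ("c1", "Surgery", 2), ("c1", "Chemo", 3),
    ("c2", "Diagnosis", 1), ("c2", "Chemo", 2),
    ("c3", "Diagnosis", 1), ("c3", "Surgery", 2),
]

# Group events into per-case traces ordered by timestamp.
traces = defaultdict(list)
for case, activity, ts in sorted(log, key=lambda row: (row[0], row[2])):
    traces[case].append(activity)

# Count directly-follows pairs: how often activity b immediately follows a.
dfg = Counter()
for events in traces.values():
    for a, b in zip(events, events[1:]):
        dfg[(a, b)] += 1

for (a, b), n in sorted(dfg.items()):
    print(f"{a} -> {b}: {n}")
```

Grading a synthetic dataset, in the spirit of the paper, would compare edge frequencies like these against the same graph mined from the real data.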