720 research outputs found

    Improved quantification of perfusion in patients with cerebrovascular disease.

    In recent years, measurements of cerebral perfusion using bolus-tracking MRI have become common clinical practice in the diagnosis and management of patients with stroke and cerebrovascular disease. An active area of research is the development of methods to identify brain tissue that is at risk of irreversible damage but amenable to salvage using reperfusion treatments such as thrombolysis. However, the specificity and sensitivity of these methods are limited by inaccuracies in the perfusion data. Accurate measurements of perfusion are difficult to obtain, especially in patients with cerebrovascular disease. In particular, if the bolus of MR contrast is delayed and/or dispersed due to cerebral arterial abnormalities, perfusion is likely to be underestimated by the standard analysis techniques. The potential for such underestimation is often overlooked when perfusion maps are used to assess stroke patients. Since thrombolysis can increase the risk of haemorrhage, misidentification of 'at-risk' tissue has potentially dangerous clinical implications. This thesis presents several methodologies that aim to improve the accuracy and interpretation of analysed bolus-tracking data. Two novel data analysis techniques are proposed which enable the identification of brain regions where delay and dispersion of the bolus are likely to bias the perfusion measurements. In this way, true hypoperfusion can be distinguished from erroneously low perfusion estimates. The size of the perfusion measurement errors is investigated in vivo, and a parameterised characterisation of the bolus delay and dispersion is obtained. Such information is valuable for the interpretation of in vivo data, and for further investigation into the effects of abnormal vasculature on perfusion estimates. Finally, methodology is presented to minimise the perfusion measurement errors prevalent in patients with cerebrovascular disease. The in vivo application of this method highlights the dangers of interpreting perfusion values independently of the bolus delay and dispersion.
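    The delay-induced underestimation described above can be reproduced in a toy simulation. The sketch below (all parameter values and curve shapes are invented for illustration, not taken from the thesis) convolves a gamma-variate arterial input function with an exponential residue function to make a tissue curve, then fits a perfusion scale factor while wrongly assuming zero bolus delay:

```python
# Toy bolus-tracking model: C(t) = CBF * (AIF (*) R)(t).
# A 3 s arrival delay shifts the AIF; an analysis that assumes
# simultaneous arrival then underestimates CBF.
import math

dt = 0.5                                  # seconds per sample
t = [i * dt for i in range(120)]          # 60 s acquisition

def aif(time, delay=0.0):
    # Gamma-variate bolus shape (illustrative parameters)
    s = time - delay
    return (s ** 3) * math.exp(-s / 1.5) if s > 0 else 0.0

cbf_true = 0.6                            # arbitrary units
mtt = 4.0                                 # mean transit time, seconds
residue = [math.exp(-ti / mtt) for ti in t]   # exponential residue function

def tissue_curve(delay):
    # Discretised convolution of the (possibly delayed) AIF with R, scaled by CBF
    a = [aif(ti, delay) for ti in t]
    return [cbf_true * dt * sum(a[k] * residue[n - k] for k in range(n + 1))
            for n in range(len(t))]

def cbf_estimate(curve):
    # Least-squares scale factor under a model that assumes NO bolus delay
    a0 = [aif(ti, 0.0) for ti in t]
    model = [dt * sum(a0[k] * residue[n - k] for k in range(n + 1))
             for n in range(len(t))]
    num = sum(c * m for c, m in zip(curve, model))
    den = sum(m * m for m in model)
    return num / den

est_no_delay = cbf_estimate(tissue_curve(0.0))   # recovers cbf_true
est_delayed = cbf_estimate(tissue_curve(3.0))    # 3 s delay -> underestimate
```

Because the misaligned model correlates imperfectly with the delayed tissue curve, the fitted scale drops below the true perfusion value, mimicking the erroneously low estimates the thesis warns about.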

    Establishment of a novel predictive reliability assessment strategy for ship machinery

    There is no doubt that in recent years the maritime industry has been moving towards novel and sophisticated inspection and maintenance practices. Nowadays, maintenance is regarded as an operational method that can be employed both as a profit-generating process and as a cost-reduction budget centre through an enhanced Operation and Maintenance (O&M) strategy. As a first step, a flexible framework applicable at the complex-system level of machinery can be introduced for ship maintenance scheduling of systems, subsystems and components. This holistic inspection and maintenance notion should be implemented by integrating different strategies, methodologies, technologies and tools, suitably selected to fulfil the requirements of the selected ship systems. In this thesis, an innovative maintenance strategy for ship machinery is proposed, namely the Probabilistic Machinery Reliability Assessment (PMRA) strategy, focused on enhancing the reliability and safety of main systems, subsystems and maintainable units and components. In this respect, a combination of data mining (k-means), manufacturer safety aspects, dynamic state modelling (Markov Chains), probabilistic predictive reliability assessment (Bayesian Belief Networks) and qualitative decision making (Failure Modes and Effects Analysis) is employed, encompassing the benefits of both qualitative and quantitative reliability assessment. PMRA is demonstrated in two case studies, applied to an offshore oil and gas platform and to selected ship machinery. The results are used to identify the most unreliable systems, subsystems and components, while advising suitable practical inspection and maintenance activities. The proposed PMRA strategy is also tested in a flexible sensitivity analysis scheme.
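    The dynamic state modelling ingredient can be sketched with a minimal three-state Markov degradation chain of the kind such strategies use; the transition probabilities below are illustrative assumptions, not values from the case studies:

```python
# Three-state Markov degradation model: healthy -> degraded -> failed.
# Each step corresponds to one inspection interval; "failed" is absorbing.
P = [
    [0.95, 0.04, 0.01],  # from healthy
    [0.00, 0.90, 0.10],  # from degraded
    [0.00, 0.00, 1.00],  # from failed (absorbing)
]

def step(state):
    # One Chapman-Kolmogorov update: state' = state @ P
    return [sum(state[i] * P[i][j] for i in range(3)) for j in range(3)]

state = [1.0, 0.0, 0.0]       # component starts healthy
history = [state]
for _ in range(50):           # 50 inspection intervals
    state = step(state)
    history.append(state)

p_fail = state[2]             # cumulative failure probability
```

Tracking how `p_fail` grows over intervals is what lets a maintenance planner schedule inspections before the failure probability crosses an acceptance threshold.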

    20th SC@RUG 2023 proceedings 2022-2023


    New Methods for Network Traffic Anomaly Detection

    In this thesis we examine the efficacy of applying outlier detection techniques to understand the behaviour of anomalies in communication network traffic, and we have identified several shortcomings. Our most significant finding is that known techniques focus on characterizing either the spatial or the temporal behaviour of traffic, but rarely both. For example, DoS attacks are anomalies which violate temporal patterns, while port scans violate the spatial equilibrium of network traffic. To address this weakness we have designed a new method for outlier detection based on spectral decomposition of the Hankel matrix. The Hankel matrix is a spatio-temporal correlation matrix and has been used in many other domains, including climate data analysis and econometrics. Using our approach we can seamlessly integrate the discovery of both spatial and temporal anomalies, and comparison with other state-of-the-art methods in the networks community confirms that our approach can discover both DoS and port scan attacks. The spectral decomposition of the Hankel matrix is closely tied to the problem of inference in Linear Dynamical Systems (LDS). We introduce a new problem, the Online Selective Anomaly Detection (OSAD) problem, to model the situation where the objective is to report new anomalies in the system while suppressing known faults. For example, in the network setting an operator may be interested in triggering an alarm for malicious attacks but not for faults caused by equipment failure. To solve OSAD we combine techniques from machine learning and control theory in a unique fashion: machine learning ideas are used to learn the parameters of an underlying data-generating system, while control theory techniques are used to model the feedback and modify the residual generated by the data-generating state model. Experiments on synthetic and real data sets confirm that the OSAD problem captures a general scenario and tightly integrates machine learning and control theory to solve a practical problem.
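    The core Hankel idea can be illustrated in a heavily simplified form (a univariate synthetic series, not the thesis's full spatio-temporal method): embed the traffic series into a Hankel matrix of lagged windows, estimate the dominant spectral direction by power iteration, and score each window by its residual outside that subspace.

```python
# Simplified Hankel-spectral anomaly detection on a synthetic traffic series.
import math

traffic = [10.0] * 100
traffic[70] = 120.0   # short DoS-like burst (synthetic)

w = 8                 # embedding window length
cols = [traffic[i:i + w] for i in range(len(traffic) - w + 1)]  # Hankel columns

# Power iteration on C = sum_c c c^T for the top singular direction,
# computed column-by-column without forming C explicitly.
v = [1.0] * w
for _ in range(50):
    u = [0.0] * w
    for c in cols:
        dot = sum(ci * vi for ci, vi in zip(c, v))
        for j in range(w):
            u[j] += dot * c[j]
    norm = math.sqrt(sum(x * x for x in u))
    v = [x / norm for x in u]

def residual(c):
    # Energy of the window outside the dominant subspace
    proj = sum(ci * vi for ci, vi in zip(c, v))
    return math.sqrt(sum((ci - proj * vi) ** 2 for ci, vi in zip(c, v)))

scores = [residual(c) for c in cols]
suspect = max(range(len(scores)), key=scores.__getitem__)  # window holding the burst
```

Normal windows lie almost entirely inside the dominant subspace and score near zero, while every window overlapping the burst carries a large residual, so the attack stands out by orders of magnitude.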

    Advancements in Real-Time Simulation of Power and Energy Systems

    Modern power and energy systems are characterized by the wide integration of distributed generation, storage and electric vehicles, the adoption of ICT solutions, the interconnection of different energy carriers, and consumer engagement, posing new challenges and creating new opportunities. Advanced testing and validation methods are needed to efficiently validate power equipment and controls in this complex contemporary environment and to support the transition to a cleaner and more sustainable energy system. Real-time hardware-in-the-loop (HIL) simulation has proven to be an effective method for validating and de-risking power system equipment under highly realistic, flexible, and repeatable conditions. Controller hardware-in-the-loop (CHIL) and power hardware-in-the-loop (PHIL) are the two main HIL simulation methods used in industry and academia; they enhance system-level testing by exploiting the flexibility of digital simulation in testing actual controllers and power equipment. This book addresses recent advances in real-time HIL simulation in several domains, including new and promising areas and technique improvements that promote its wider use. It comprises 14 papers dealing with advances in HIL testing of power electronic converters, power system protection, modeling for real-time digital simulation, co-simulation, geographically distributed HIL, and multiphysics HIL, among other topics.
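    At its core, a real-time digital simulator advances a plant model with a fixed-step solver, and in a HIL rig each update must also complete within the step time. A minimal sketch of that numerical loop, using an illustrative RC low-pass plant and explicit Euler (parameters assumed, not from the book):

```python
# Fixed-step plant update of the kind a real-time simulator executes.
# Plant: RC low-pass driven by a 5 V step input.
R, C = 1000.0, 1e-6      # 1 kOhm, 1 uF -> time constant tau = 1 ms
dt = 50e-6               # 50 us fixed step (typical real-time step size)
v_in, v_c = 5.0, 0.0     # step input, initial capacitor voltage

for _ in range(100):     # 100 steps of 50 us = 5 ms (5 time constants)
    i = (v_in - v_c) / R # resistor current
    v_c += dt * i / C    # explicit Euler update of capacitor voltage
```

After five time constants the simulated capacitor voltage approaches the 5 V input, as the analytic solution predicts; in CHIL/PHIL testing, this per-step computation runs against real controller or power hardware in the loop.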

    Time-Resolved Method for Spectral Analysis based on Linear Predictive Coding, with Application to EEG Analysis

    The EEG (electroencephalogram) signal is a biological signal used in BCI (Brain-Computer Interface) systems to realise the exchange of information between the brain and the external environment. It is characterised by a poor signal-to-noise ratio, is time-varying and intermittent, and contains multiple frequency components. This research has developed a new parameterised time-frequency method, the Linear Predictive Coding Pole Processing (LPCPP) method, which can be used to identify and track the dominant frequency components of an EEG signal. The LPCPP method further processes LPC (Linear Predictive Coding) poles to produce a series of reduced-order filter transfer functions from which the dominant frequencies are estimated. It is suited to processing high-noise multi-component signals and, unlike transform-based methods, directly yields the corresponding frequency estimates. Furthermore, a new EEG spectral analysis framework involving the LPCPP method is proposed to describe EEG spectral activity. The EEG signal has traditionally been divided into different frequency bands (Delta, Theta, Alpha, Beta and Gamma), but there is no consensus on the definitions of the band boundaries. This thesis therefore proposes a series of EEG centre frequencies instead of fixed frequency boundaries, as they are better suited to describing the dominant EEG spectral activity.
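    The link between LPC poles and dominant frequencies can be shown with a second-order toy case (this is not the LPCPP method itself; the sampling rate and test tone are assumed values): for a pure sinusoid, fitting x[n] ≈ a1·x[n-1] + a2·x[n-2] gives a2 ≈ -1, and the pole angle acos(a1/2) recovers the signal frequency.

```python
# Order-2 LPC fit to a synthetic alpha-band sinusoid; the pole angle of
# z^2 - a1*z - a2 gives the dominant frequency.
import math

fs = 256.0   # sampling rate, Hz (assumed, typical for EEG hardware)
f0 = 10.0    # synthetic alpha-band test tone, Hz
N = 512
x = [math.sin(2 * math.pi * f0 * n / fs) for n in range(N)]

# Least-squares solution of the order-2 normal equations (direct 2x2 solve)
s11 = s12 = s22 = b1 = b2 = 0.0
for n in range(2, N):
    x1, x2 = x[n - 1], x[n - 2]
    s11 += x1 * x1; s12 += x1 * x2; s22 += x2 * x2
    b1 += x[n] * x1; b2 += x[n] * x2
det = s11 * s22 - s12 * s12
a1 = (b1 * s22 - b2 * s12) / det
a2 = (b2 * s11 - b1 * s12) / det

# For a sinusoid the recursion is exact with a1 = 2*cos(w), a2 = -1,
# so the pole angle yields the dominant frequency.
omega = math.acos(max(-1.0, min(1.0, a1 / 2.0)))
f_est = omega * fs / (2 * math.pi)
```

Real EEG contains several components and heavy noise, which is why higher model orders and further pole processing (as in LPCPP) are needed; the toy case only shows why pole angles carry frequency information at all.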