
    Lossy and Lossless Video Frame Compression: A Novel Approach for the High-Temporal Video Data Analytics

    The smart city concept has attracted considerable research attention in recent years across diverse application domains, such as crime suspect identification, border security, transportation, and aerospace. Specific focus has been on increased automation using data-driven approaches, while leveraging remote sensing and real-time streaming of heterogeneous data from various sources, including unmanned aerial vehicles, surveillance cameras, and low-earth-orbit satellites. One of the core challenges in exploiting such high-temporal data streams, specifically videos, is the trade-off between the quality of video streaming and limited transmission bandwidth. A compromise is needed between video quality and the subsequent recognition, understanding, and efficient processing of large amounts of video data. This research proposes a novel unified approach to lossy and lossless video frame compression, which benefits the autonomous processing and enhanced representation of high-resolution video data in various domains. The proposed fast block-matching motion estimation technique, namely mean predictive block matching, is based on the principle that general motion in any video frame is usually coherent. This coherence implies a high probability that a macroblock has the same direction of motion as the macroblocks surrounding it. The technique employs the partial distortion elimination algorithm to reduce the search time: the partial sum of the matching distortion between the current macroblock and a candidate is used to abandon the candidate as soon as the distortion surpasses the current lowest error. Experimental results demonstrate the superiority of the proposed approach over state-of-the-art techniques, including the four-step search, three-step search, diamond search, and new three-step search.
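The predictive search with early-exit matching described in the abstract can be sketched as follows; a minimal illustration assuming greyscale frames stored as NumPy arrays, with invented function names and a simple full-window refinement in place of the paper's exact search pattern:

```python
import numpy as np

def sad_with_early_exit(current, candidate, best_so_far):
    """Partial distortion elimination: accumulate the sum of absolute
    differences (SAD) row by row and abort as soon as the partial sum
    reaches the best match found so far."""
    partial = 0
    for row_c, row_r in zip(current, candidate):
        partial += np.abs(row_c.astype(int) - row_r.astype(int)).sum()
        if partial >= best_so_far:       # cannot beat the current best
            return best_so_far, False    # aborted early
    return partial, True

def mean_predictive_search(cur_block, ref_frame, x, y, predicted_mv, search_range=7):
    """Evaluate the predicted motion vector first (motion coherence),
    then refine over a small window with early-exit SAD."""
    h, w = cur_block.shape
    best_cost, best_mv = np.inf, (0, 0)
    # Order candidates so the predicted vector is checked first:
    # a good early match tightens the early-exit bound for the rest.
    candidates = [predicted_mv] + [
        (dx, dy)
        for dx in range(-search_range, search_range + 1)
        for dy in range(-search_range, search_range + 1)
        if (dx, dy) != predicted_mv
    ]
    for dx, dy in candidates:
        cx, cy = x + dx, y + dy
        if not (0 <= cy <= ref_frame.shape[0] - h and 0 <= cx <= ref_frame.shape[1] - w):
            continue
        cand = ref_frame[cy:cy + h, cx:cx + w]
        cost, improved = sad_with_early_exit(cur_block, cand, best_cost)
        if improved:
            best_cost, best_mv = cost, (dx, dy)
    return best_mv, best_cost
```

The checked-first predicted vector typically yields a low bound immediately, so most other candidates are abandoned after one or two rows.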

    Convergence of Intelligent Data Acquisition and Advanced Computing Systems

    This book is a collection of published articles from the Sensors Special Issue on "Convergence of Intelligent Data Acquisition and Advanced Computing Systems". It includes extended versions of contributions to the 10th IEEE International Conference on Intelligent Data Acquisition and Advanced Computing Systems: Technology and Applications (IDAACS’2019), held in Metz, France, as well as external contributions.

    Modelling Immunological Memory

    Accurate immunological models offer the possibility of performing high-throughput experiments in silico that can predict, or at least suggest, in vivo phenomena. In this chapter, we compare various models of immunological memory. We first validate an experimental immunological simulator, developed by the authors, by simulating several theories of immunological memory with known results. We then use the same system to evaluate the predicted effects of a theory of immunological memory. The resulting model has not been explored before in artificial immune systems research, and we compare the simulated in silico output with in vivo measurements. Although the theory appears valid, we suggest a common set of reasons why immunological memory models are a useful support tool rather than conclusive in themselves.

    The Virtual Electrode Recording Tool for EXtracellular Potentials (VERTEX) Version 2.0: Modelling in vitro electrical stimulation of brain tissue [version 1; peer review: 2 approved]

    Neuronal circuits can be modelled in detail, allowing us to predict the effects of stimulation on individual neurons. Electrical stimulation of neuronal circuits in vitro and in vivo excites a range of neurons within the tissue, and measurements of neural activity, e.g. the local field potential (LFP), are likewise an aggregate of a large pool of cells. The previous version of our Virtual Electrode Recording Tool for EXtracellular Potentials (VERTEX) allowed for the simulation of the LFP generated by a patch of brain tissue. Here, we extend VERTEX to simulate the effect of electrical stimulation through a focal electric field. We observe both direct changes in neural activity and changes in synaptic plasticity. Testing our software in a model of a rat neocortical slice, we determine the currents contributing to the LFP, the effects of paired-pulse stimulation to induce short-term plasticity (STP), and the effect of theta burst stimulation (TBS) to induce long-term potentiation (LTP).
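A small illustration of the stimulation protocol mentioned above; the timing values below are typical defaults for a theta burst protocol, not VERTEX's own parameters:

```python
import numpy as np

def theta_burst_times(n_bursts=10, pulses_per_burst=4, intra_hz=100.0, theta_hz=5.0):
    """Pulse onset times (in seconds) for a theta burst stimulation (TBS)
    protocol: short high-frequency bursts (here 4 pulses at 100 Hz)
    repeated at theta rhythm (here 5 Hz)."""
    burst_starts = np.arange(n_bursts) / theta_hz          # one burst per theta cycle
    within_burst = np.arange(pulses_per_burst) / intra_hz  # pulse offsets inside a burst
    return (burst_starts[:, None] + within_burst[None, :]).ravel()
```

A paired-pulse protocol is the degenerate case `n_bursts=1, pulses_per_burst=2` with the inter-pulse interval set by `intra_hz`.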

    Deep learning-based methods for prostate segmentation in magnetic resonance imaging

    Magnetic Resonance Imaging-based prostate segmentation is an essential task for adaptive radiotherapy and for radiomics studies, whose purpose is to identify associations between imaging features and patient outcomes. Because manual delineation is a time-consuming task, we present three deep-learning (DL) approaches, namely UNet, efficient neural network (ENet), and efficient residual factorized convNet (ERFNet), which aim to tackle the fully automated, real-time, and 3D delineation of the prostate gland on T2-weighted MRI. While UNet is used in many biomedical image delineation applications, ENet and ERFNet are mainly applied in self-driving cars to compensate for limited hardware availability while still achieving accurate segmentation. We apply these models to a limited set of 85 manual prostate segmentations using the k-fold validation strategy and the Tversky loss function, and we compare their results. We find that ENet and UNet are more accurate than ERFNet, with ENet much faster than UNet. Specifically, ENet obtains a Dice similarity coefficient of 90.89% and a segmentation time of about 6 s using central processing unit (CPU) hardware, to simulate real clinical conditions where a graphics processing unit (GPU) is not always available. In conclusion, ENet could be efficiently applied for prostate delineation even with small image training datasets, with potential benefits for personalized patient management.
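The Tversky loss and Dice similarity coefficient mentioned above can be sketched as follows; a minimal NumPy version for binary masks, with illustrative default weights (the abstract does not state the values actually used):

```python
import numpy as np

def tversky_loss(y_true, y_pred, alpha=0.7, beta=0.3, eps=1e-7):
    """Tversky loss: a generalisation of the Dice loss that weights
    false positives (alpha) and false negatives (beta) separately,
    which helps on small, class-imbalanced structures like the prostate."""
    tp = np.sum(y_true * y_pred)
    fp = np.sum((1 - y_true) * y_pred)
    fn = np.sum(y_true * (1 - y_pred))
    tversky_index = (tp + eps) / (tp + alpha * fp + beta * fn + eps)
    return 1.0 - tversky_index

def dice_coefficient(y_true, y_pred, eps=1e-7):
    """Dice similarity coefficient (DSC); the Tversky index with
    alpha = beta = 0.5 up to a rescaling."""
    intersection = np.sum(y_true * y_pred)
    return (2.0 * intersection + eps) / (np.sum(y_true) + np.sum(y_pred) + eps)
```

In training, `y_pred` would be the network's soft output; both functions also accept hard 0/1 masks, as used for evaluation.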

    Model-Based Fault Detection and Identification for Prognostics of Electromechanical Actuators Using Genetic Algorithms

    Traditional hydraulic servomechanisms for aircraft control surfaces are being gradually replaced by newer technologies, such as Electro-Mechanical Actuators (EMAs). Since field data about the reliability of EMAs are not available due to their recent adoption, their failure modes are not fully understood yet; therefore, an effective prognostic tool could help detect incipient failures of the flight control system, in order to properly schedule maintenance interventions and replacement of the actuators. A twofold benefit would be achieved: safety would be improved by preventing the aircraft from flying with damaged components, and replacement of still-functional components would be avoided, reducing maintenance costs. However, EMA prognostics presents a challenge due to the complexity and multi-disciplinary nature of the monitored systems. We propose a model-based fault detection and isolation (FDI) method, employing a Genetic Algorithm (GA) to identify failure precursors before the performance of the system starts being compromised. Four different failure modes are considered: dry friction, backlash, partial coil short circuit, and controller gain drift. The method presented in this work meets the challenge by leveraging system design knowledge more effectively than data-driven strategies, and requires less experimental data. To test the proposed tool, a simulated test rig was developed. Two numerical models of the EMA were implemented with different levels of detail: a high-fidelity model provided the data of the faulty actuator to be analyzed, while a simpler one, computationally lighter but accurate enough to simulate the considered fault modes, was executed iteratively by the GA. The results showed good robustness and precision, allowing the early identification of a system malfunction with few false positives or missed failures.
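The GA-based parameter identification idea can be sketched as follows; a toy example with an invented one-dimensional actuator response and two hypothetical fault parameters, standing in for the paper's high-fidelity EMA model:

```python
import numpy as np

rng = np.random.default_rng(42)

def actuator_response(t, friction, backlash):
    """Toy stand-in for the lightweight monitored model: a step response
    whose shape depends on two hypothetical fault parameters
    (illustrative only, not the paper's EMA dynamics)."""
    return (1 - np.exp(-t / (0.2 + friction))) - backlash * np.sin(3 * t) * np.exp(-t)

def fitness(params, t, observed):
    """Negative mean squared error between model output and observed data."""
    friction, backlash = params
    return -np.mean((actuator_response(t, friction, backlash) - observed) ** 2)

def genetic_estimate(t, observed, pop_size=60, generations=80, mut=0.05):
    """Minimal real-coded GA: elitism, tournament selection, blend
    crossover, Gaussian mutation. Returns the best parameter estimate."""
    pop = rng.uniform([0.0, 0.0], [1.0, 0.5], size=(pop_size, 2))
    for _ in range(generations):
        scores = np.array([fitness(p, t, observed) for p in pop])
        new_pop = [pop[scores.argmax()].copy()]               # elitism
        while len(new_pop) < pop_size:
            i, j = rng.integers(0, pop_size, 2)
            a = pop[i] if scores[i] > scores[j] else pop[j]   # tournament
            i, j = rng.integers(0, pop_size, 2)
            b = pop[i] if scores[i] > scores[j] else pop[j]
            w = rng.uniform(size=2)
            child = w * a + (1 - w) * b                       # blend crossover
            child += rng.normal(0, mut, 2)                    # mutation
            new_pop.append(np.clip(child, [0, 0], [1, 0.5]))
        pop = np.array(new_pop)
    scores = np.array([fitness(p, t, observed) for p in pop])
    return pop[scores.argmax()]
```

The estimated parameters then act as the failure precursors: a drift away from their nominal values flags an incipient fault before performance degrades.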

    A Novel Coupled Reaction-Diffusion System for Explainable Gene Expression Profiling

    Machine learning (ML)-based algorithms are playing an important role in cancer diagnosis and are increasingly being used to aid clinical decision-making. However, these commonly operate as ‘black boxes’ and it is unclear how decisions are derived. Recently, techniques have been applied to help us understand how specific ML models work and to explain the rationale for their outputs. This study aims to determine why a given type of cancer has a certain phenotypic characteristic. Cancer results in cellular dysregulation, and a thorough consideration of cancer regulators is required. This would increase our understanding of the nature of the disease and help discover more effective diagnostic, prognostic, and treatment methods for a variety of cancer types and stages. Our study proposes a novel explainable analysis of potential biomarkers denoting tumorigenesis in non-small cell lung cancer. A number of these biomarkers are known to appear following various treatment pathways. An enhanced analysis is enabled through a novel mathematical formulation for the regulators of mRNA, the regulators of ncRNA, and the coupled mRNA–ncRNA regulators. Temporal gene expression profiles are approximated in a two-dimensional spatial domain for the transition states before converging to the stationary state, using a system of coupled reaction-diffusion partial differential equations. Simulation experiments demonstrate that the proposed mathematical gene-expression profile represents a best fit for the population abundance of these oncogenes. In the future, our proposed solution could lead to the development of alternative interpretable approaches, through the application of ML models to discover unknown dynamics in gene regulatory systems.
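The coupled reaction-diffusion machinery can be illustrated with a minimal explicit finite-difference sketch; the coupling terms below are invented for demonstration and are not the paper's exact mRNA-ncRNA system:

```python
import numpy as np

def laplacian(u):
    """5-point finite-difference Laplacian with zero-flux (Neumann)
    boundaries, approximated by edge replication."""
    padded = np.pad(u, 1, mode="edge")
    return (padded[:-2, 1:-1] + padded[2:, 1:-1] +
            padded[1:-1, :-2] + padded[1:-1, 2:] - 4 * u)

def step(m, n, d_m=0.10, d_n=0.05, k=0.8, dt=0.1):
    """One explicit Euler step of a coupled pair of reaction-diffusion
    equations on a 2D grid (illustrative coupling):
        dm/dt = d_m * lap(m) + m * (1 - m) - k * m * n
        dn/dt = d_n * lap(n) + k * m * n - n
    where m stands in for an mRNA regulator and n for an ncRNA regulator."""
    dm = d_m * laplacian(m) + m * (1 - m) - k * m * n
    dn = d_n * laplacian(n) + k * m * n - n
    return m + dt * dm, n + dt * dn
```

Iterating `step` from an initial spatial profile produces the transition states; the run is stopped once successive fields change by less than a tolerance, i.e. the stationary state is reached.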

    Optimization and Control of Agent-Based Models in Biology: A Perspective

    Agent-based models (ABMs) have become an increasingly important mode of inquiry for the life sciences. They are particularly valuable for systems that are not understood well enough to build an equation-based model. These advantages, however, are counterbalanced by the difficulty of analyzing and using ABMs, due to the lack of the type of mathematical tools available for more traditional models, which leaves simulation as the primary approach. As models become large, simulation becomes challenging. This paper proposes a novel approach to two mathematical aspects of ABMs, optimization and control, and it presents a few first steps outlining how one might carry out this approach. Rather than viewing the ABM as a model, it is to be viewed as a surrogate for the actual system. For a given optimization or control problem (which may change over time), the surrogate system is modeled instead, using data from the ABM and a modeling framework for which ready-made mathematical tools exist, such as differential equations, or for which control strategies can be explored more easily. Once the optimization problem is solved for the model of the surrogate, it is then lifted to the surrogate and tested. The final step is to lift the optimization solution from the surrogate system to the actual system. This program is illustrated with published work, using two relatively simple ABMs as a demonstration, Sugarscape and a consumer-resource ABM. Specific techniques discussed include dimension reduction and approximation of an ABM by difference equations as well as systems of PDEs, related to certain specific control objectives. This demonstration illustrates the very challenging mathematical problems that need to be solved before this approach can be realistically applied to complex and large ABMs, current and future. The paper outlines a research program to address them.
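The surrogate-modelling idea can be illustrated with a toy example; both the consumer-resource ABM and the linear difference-equation surrogate below are invented for demonstration and are much simpler than the models discussed in the paper:

```python
import numpy as np

rng = np.random.default_rng(7)

def run_abm(n_agents=50, capacity=1000.0, steps=50):
    """Tiny consumer-resource ABM: agents each draw a random demand on a
    shared resource, which regrows logistically. Returns the aggregate
    resource trajectory, which is the data the surrogate is fit to."""
    resource = capacity
    traj = [resource]
    for _ in range(steps):
        consumed = sum(min(rng.uniform(0.5, 1.5), resource / n_agents)
                       for _ in range(n_agents))
        resource = max(resource - consumed, 0.0)
        resource += 0.3 * resource * (1 - resource / capacity)  # regrowth
        traj.append(resource)
    return np.array(traj)

def fit_linear_surrogate(traj):
    """Approximate the aggregate dynamics by a difference equation
    x[t+1] = a * x[t] + b, fit by least squares to the ABM output.
    Optimization/control is then done on (a, b) instead of the ABM."""
    x, x_next = traj[:-1], traj[1:]
    A = np.vstack([x, np.ones_like(x)]).T
    (a, b), *_ = np.linalg.lstsq(A, x_next, rcond=None)
    return a, b
```

A control policy computed against the fitted `(a, b)` map would then be lifted back to the ABM and tested there, mirroring the two lifting steps described above.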

    Predictors of individual response to placebo or tadalafil 5mg among men with lower urinary tract symptoms secondary to benign prostatic hyperplasia: An integrated clinical data mining analysis

    Background: A significant percentage of patients with lower urinary tract symptoms (LUTS) secondary to benign prostatic hyperplasia (BPH) achieve clinically meaningful improvement when receiving placebo or tadalafil 5mg once daily. However, individual patient characteristics associated with treatment response are unknown. Methods: This integrated clinical data mining analysis was designed to identify factors associated with a clinically meaningful response to placebo or tadalafil 5mg once daily in an individual patient with LUTS-BPH. Analyses were performed on pooled data from four randomized, placebo-controlled, double-blind clinical studies, including about 1,500 patients, from which 107 baseline characteristics and 8 response criteria were selected. The split-set evaluation method (1,000 repeats) was used to estimate prediction accuracy, with the database randomly split into training and test subsets. Logistic regression (LR), decision tree (DT), support vector machine (SVM), and random forest (RF) models were then generated on the training subset and used to predict response in the test subset. Prediction models were generated for placebo and tadalafil 5mg once daily. Receiver operating characteristic (ROC) analysis was used to select optimal prediction models lying on the ROC surface. Findings: The International Prostate Symptom Score (IPSS) baseline group (mild/moderate vs. severe) for active treatment and placebo achieved the highest combined sensitivity and specificity, of 70% and approximately 50% respectively, across all analyses. This was below the sensitivity and specificity threshold of 80% that would enable reliable allocation of an individual patient to either the responder or non-responder group. Conclusions: This extensive clinical data mining study in LUTS-BPH did not identify baseline clinical or demographic characteristics that were sufficiently predictive of an individual patient response to placebo or once-daily tadalafil 5mg. However, the study reaffirms the efficacy of tadalafil 5mg once daily in the treatment of LUTS-BPH in the majority of patients, and the importance of evaluating individual patient needs in selecting the most appropriate treatment.
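The split-set evaluation procedure described in the Methods can be sketched as follows; a one-feature threshold rule stands in for the LR/DT/SVM/RF models, and all names and parameters are illustrative:

```python
import numpy as np

def split_set_evaluation(X, y, fit, predict, n_repeats=1000, train_frac=0.5, seed=0):
    """Split-set evaluation: repeatedly split the data at random into
    training and test subsets, fit a model on the training part, and
    record sensitivity and specificity on the held-out part."""
    rng = np.random.default_rng(seed)
    sens, spec = [], []
    n = len(y)
    for _ in range(n_repeats):
        idx = rng.permutation(n)
        cut = int(train_frac * n)
        tr, te = idx[:cut], idx[cut:]
        model = fit(X[tr], y[tr])
        pred = predict(model, X[te])
        tp = np.sum((pred == 1) & (y[te] == 1))
        tn = np.sum((pred == 0) & (y[te] == 0))
        fp = np.sum((pred == 1) & (y[te] == 0))
        fn = np.sum((pred == 0) & (y[te] == 1))
        sens.append(tp / max(tp + fn, 1))  # true positive rate
        spec.append(tn / max(tn + fp, 1))  # true negative rate
    return np.mean(sens), np.mean(spec)

def fit_stump(X, y):
    """Toy classifier: pick the threshold on the first feature that
    maximizes training accuracy (a stand-in for the real models)."""
    thresholds = np.unique(X[:, 0])
    return max(thresholds, key=lambda t: np.mean((X[:, 0] > t) == y))

def predict_stump(threshold, X):
    return (X[:, 0] > threshold).astype(int)
```

Averaging sensitivity and specificity over the repeats gives the operating points from which the ROC analysis selects the optimal models.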

    G-CSC Report 2010

    The present report gives a short summary of the research of the Goethe Center for Scientific Computing (G-CSC) of the Goethe University Frankfurt. The G-CSC aims at developing and applying methods and tools for modelling and numerical simulation of problems from empirical science and technology. In particular, fast solvers for partial differential equations (PDEs), such as robust, parallel, and adaptive multigrid methods, as well as numerical methods for stochastic differential equations, are developed. These methods are highly advanced and allow complex problems to be solved. The G-CSC is organised in departments and interdisciplinary research groups. Departments are located directly at the G-CSC, while the task of the interdisciplinary research groups is to bridge disciplines and to bring scientists from different departments together. Currently, the G-CSC consists of the department Simulation and Modelling and the interdisciplinary research group Computational Finance.