
    An accurate and time-efficient deep learning-based system for automated segmentation and reporting of cardiac magnetic resonance-detected ischemic scar

    Background and objectives: Myocardial infarction scar (MIS) assessment by cardiac magnetic resonance provides prognostic information and guides patients' clinical management. However, MIS segmentation is time-consuming and not performed routinely. This study presents a deep-learning-based computational workflow for the segmentation of left ventricular (LV) MIS, for the first time performed on state-of-the-art dark-blood late gadolinium enhancement (DB-LGE) images, and the computation of MIS transmurality and extent.
    Methods: DB-LGE short-axis images of consecutive patients with myocardial infarction were acquired at 1.5 T in two centres between Jan 1, 2019, and June 1, 2021. Two convolutional neural network (CNN) models based on the U-Net architecture were trained to sequentially segment the LV and MIS by processing an incoming series of DB-LGE images. A 5-fold cross-validation was performed to assess the performance of the models. Model outputs were compared with manual (LV endo- and epicardial border) and semi-automated (MIS, 4-standard-deviation technique) ground truth, respectively, to assess the accuracy of the segmentation. An automated post-processing and reporting tool was developed, computing MIS extent (expressed as relative infarcted mass) and transmurality.
    Results: The dataset included 1355 DB-LGE short-axis images from 144 patients (MIS in 942 images). High performance (> 0.85), as measured by the Intersection over Union metric, was obtained for both the LV and MIS segmentations on the training sets. The performance for both LV and MIS segmentations was 0.83 on the test sets. Compared to the 4-standard-deviation segmentation technique, our system was five times quicker (<1 min versus 7 ± 3 min) and required minimal user interaction.
    Conclusions: Our solution successfully addresses different issues related to automatic MIS segmentation, including accuracy, time-effectiveness, and the automatic generation of a clinical report.
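
    The reported metrics lend themselves to a short sketch. Below is a minimal illustration of the Intersection over Union measure and of MIS extent as relative infarcted mass, computed on binary segmentation masks; the array names and the assumption that pixel counts are proportional to mass are illustrative, not details taken from the paper.

```python
import numpy as np

def intersection_over_union(pred_mask: np.ndarray, true_mask: np.ndarray) -> float:
    """IoU between two binary segmentation masks (1 = tissue of interest)."""
    pred = pred_mask.astype(bool)
    true = true_mask.astype(bool)
    union = np.logical_or(pred, true).sum()
    if union == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return float(np.logical_and(pred, true).sum() / union)

def relative_infarcted_mass(scar_mask: np.ndarray, lv_mask: np.ndarray) -> float:
    """MIS extent as the percentage of LV myocardium pixels labelled as scar.

    Assumes uniform pixel spacing and slice thickness, so that pixel counts
    are proportional to mass; a real reporting tool may weight voxels.
    """
    lv = lv_mask.astype(bool)
    scar = np.logical_and(scar_mask.astype(bool), lv)
    return 100.0 * scar.sum() / lv.sum() if lv.sum() else 0.0
```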

    Surfing on fitness landscapes: A boost on optimization by Fourier surrogate modeling

    Surfing in rough waters is not always as fun as wave riding the "big one". Similarly, in optimization problems, fitness landscapes with a huge number of local optima make the search for the global optimum a hard and generally annoying game. Computational Intelligence optimization metaheuristics use a set of individuals that "surf" across the fitness landscape, sharing and exploiting pieces of information about local fitness values in a joint effort to find the global optimum. In this context, we designed surF, a novel surrogate modeling technique that leverages the discrete Fourier transform to generate a smoother, and possibly easier to explore, fitness landscape. The rationale behind this idea is that filtering out the high frequencies of the fitness function and keeping only its partial information (i.e., the low frequencies) can actually be beneficial to the optimization process. We test this idea by combining surF with a settings-free variant of Particle Swarm Optimization (PSO) based on Fuzzy Logic, called Fuzzy Self-Tuning PSO. Specifically, we introduce a new algorithm, named F3ST-PSO, which performs a preliminary exploration on the surrogate model followed by a second optimization using the actual fitness function. We show that F3ST-PSO can lead to improved performance, notably using the same budget of fitness evaluations.
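
    As a rough illustration of the low-pass idea behind surF, the sketch below builds a one-dimensional surrogate by sampling a rugged function on a regular grid, discarding its high-frequency Fourier components, and interpolating the filtered signal. The grid size, cutoff, and test function are assumptions for demonstration only; surF's actual construction for multi-dimensional landscapes is more involved.

```python
import numpy as np

def fourier_surrogate(fitness, lower, upper, n_samples=256, keep_frequencies=8):
    """Build a low-pass Fourier surrogate of a 1-D fitness function.

    The fitness is sampled on a regular grid, all but the lowest
    `keep_frequencies` Fourier components are discarded, and the surrogate
    is the inverse transform, evaluated by linear interpolation.
    """
    xs = np.linspace(lower, upper, n_samples, endpoint=False)
    values = np.array([fitness(x) for x in xs])
    spectrum = np.fft.rfft(values)
    spectrum[keep_frequencies:] = 0.0           # drop the high frequencies
    smooth = np.fft.irfft(spectrum, n=n_samples)
    return lambda x: np.interp(x, xs, smooth)

def rugged(x):
    """A rugged test function: a smooth bowl plus fast oscillations."""
    return x**2 + 10.0 * np.sin(5.0 * x) ** 2

surrogate = fourier_surrogate(rugged, -5.0, 5.0)
# A first, cheap exploration can now run on `surrogate`, followed by a
# refinement stage on `rugged` itself, mirroring the two-stage scheme above.
```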

    Method for determining an optimal inversion time for a magnetic resonance "Inversion Recovery" sequence usable for the acquisition of late images after administration of a paramagnetic contrast agent

    The inversion time (TI) is the measurement of the interval between radiofrequency and static (sampling) pulses that is needed to obtain the magnetic resonance (MR) signal and to detect, via images, the relaxation of the tissue under examination. To date, the TI is estimated by the MR operator, based on personal experience or often by trial and error, or by subjecting the patient to several inversion times and comparing the acquired images, whose quality is in any case not guaranteed. The time and cost of administering the test, and the quality and readability of the images, can now be improved through a machine learning model capable of calibrating itself on the patient's relevant data and on the specific parameters of the examination to be performed. The model was trained on a multitude of sample evaluations from myocardial examinations and is potentially applicable to other MR examinations.
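
    As a minimal sketch of the underlying idea, the snippet below regresses an optimal inversion time from patient- and exam-specific parameters. The feature set, model choice, and training values are hypothetical placeholders, not details from the patent.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Hypothetical training set: each row is
# [heart_rate_bpm, minutes_since_contrast, contrast_dose_mmol_per_kg, field_strength_T]
X_train = np.array([
    [62, 10, 0.15, 1.5],
    [75, 12, 0.20, 1.5],
    [58, 15, 0.15, 3.0],
    [80,  8, 0.10, 1.5],
])
y_train = np.array([260.0, 280.0, 310.0, 250.0])  # operator-validated TI values (ms)

model = GradientBoostingRegressor().fit(X_train, y_train)
suggested_ti = model.predict([[70, 11, 0.15, 1.5]])[0]
print(f"Suggested inversion time: {suggested_ti:.0f} ms")
```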

    On the automatic calibration of fully analogical spiking neuromorphic chips

    Nowadays, understanding the topology of biological neural networks and sampling their activity is possible thanks to various laboratory protocols that provide a large amount of experimental data, thus paving the way to accurate modeling and simulation. Neuromorphic systems were developed to simulate the dynamics of biological neural networks by means of electronic circuits, offering an efficient alternative to classic simulations based on systems of differential equations, in terms of both energy consumption and overall computational effort. Spikey is a configurable neuromorphic chip based on the Leaky Integrate-and-Fire model, which allows the user to model an arbitrary neural topology and simulate the temporal evolution of membrane potentials. To accurately reproduce the behavior of a specific biological network, a detailed parameterization of all neurons in the neuromorphic chip is necessary. Determining such parameters is a hard, error-prone, and generally time-consuming task. In this work, we propose a novel methodology for the automatic calibration of neuromorphic chips that exploits a given neural activity as target. Our results show that, in the case of small networks with low complexity, the method can estimate a vector of parameters capable of reproducing the target activity. Conversely, in the case of more complex networks, the simulations with Spikey can be highly affected by noise, which causes small variations in the outcome even when identical networks are simulated, hindering convergence to optimal parameterizations.
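
    A minimal sketch of the calibration-as-optimization idea follows: a black-box optimizer searches for the neuron parameters whose simulated activity best matches a target. The simulate_network placeholder, the distance measure, and the choice of differential evolution are illustrative assumptions; they stand in for runs on the Spikey chip and for whatever metaheuristic the methodology actually uses.

```python
import numpy as np
from scipy.optimize import differential_evolution

target_rates = np.array([12.0, 7.5, 20.0])  # target per-neuron firing rates (Hz)

def simulate_network(params: np.ndarray) -> np.ndarray:
    """Placeholder for a Spikey run: map neuron parameters to firing rates."""
    leak, threshold, refractory = params
    rate = max(threshold / leak - refractory, 0.0)
    return rate * np.ones_like(target_rates)

def calibration_error(params: np.ndarray) -> float:
    """Squared distance between simulated and target activity."""
    return float(np.sum((simulate_network(params) - target_rates) ** 2))

bounds = [(0.1, 5.0), (1.0, 50.0), (0.0, 5.0)]  # leak, threshold, refractory period
result = differential_evolution(calibration_error, bounds, seed=0)
print("Estimated parameters:", result.x)
```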

    If You Can't Beat It, Squash It: Simplify Global Optimization by Evolving Dilation Functions

    Optimization problems represent a class of pervasive and complex tasks in Computer Science, aimed at identifying the global optimum of a given objective function. Optimization problems are typically noisy, multi-modal, non-convex, non-separable, and often non-differentiable. Because of these features, they mandate the use of sophisticated population-based meta-heuristics to effectively explore the search space. Additionally, computational techniques based on the manipulation of the optimization landscape, such as Dilation Functions (DFs), can be effectively exploited to either "compress" or "dilate" some target regions of the search space, in order to improve the exploration and exploitation capabilities of any meta-heuristic. The main limitation of DFs is that they must be tailored to the specific optimization problem under investigation. In this work, we propose a solution to this issue, based on the idea of evolving the DFs. Specifically, we introduce a two-layered evolutionary framework, which combines Evolutionary Computation and Swarm Intelligence to solve the meta-problem of optimizing both the structure and the parameters of DFs. We evolved optimal DFs on a variety of benchmark problems, showing that this approach yields considerably simpler versions of the original optimization problems.
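
    To make the mechanism concrete, the sketch below shows how a Dilation Function changes the problem a meta-heuristic sees: candidates live in a transformed space and are mapped through the DF before the original fitness is evaluated. The power-law DF and the Rastrigin test function are illustrative choices, not DFs evolved by the framework described above.

```python
import numpy as np

def rastrigin(x: np.ndarray) -> float:
    """Highly multi-modal benchmark with its global optimum at the origin."""
    return float(10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x)))

def dilation(z: np.ndarray, gamma: float = 3.0) -> np.ndarray:
    """Power-law DF: the pre-image of a small neighbourhood of the origin
    becomes large, so the promising region is dilated in the optimizer's space."""
    return np.sign(z) * np.abs(z) ** gamma

def dilated_fitness(z: np.ndarray) -> float:
    """Fitness seen by the meta-heuristic: the original function evaluated
    on the dilated candidate."""
    return rastrigin(dilation(z))

# Any meta-heuristic can now be run on `dilated_fitness`; the best point z*
# it finds is mapped back through `dilation(z*)` to a solution of the
# original problem.
```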

    Local Bubble Dilation Functions: Hypersphere-bounded Landscape Deformations Simplify Global Optimization

    Solving optimization problems is one of the most complex and widespread tasks in Computer Science. In many scenarios, finding the global optimum of a function is hampered by several features that characterize fitness landscapes, such as noisiness, multi-modality, non-convexity, non-separability, and non-differentiability. In order to facilitate the optimization process, a variety of methods have been proposed to manipulate either the search space or the fitness landscape. Among these, Dilation Functions (DFs) were introduced to expand regions of the search space that are characterized by promising fitness values. In this work, we extend the family of DFs by introducing Local Bubble Dilation Functions (LBDFs), a novel approach that generates local distortions bounded by hyper-spheres. By performing an appropriate mapping of the search space, LBDFs can improve the optimization performance, since they expand and reveal the promising regions around the global optimum, while leaving the rest of the fitness landscape untouched. The additional advantage of LBDFs, with respect to DFs, is that different dilations can be applied to each dimension of the search space, which is useful in the case of asymmetric landscapes. In order to show the benefits of local dilations, we executed several tests on the Michalewicz benchmark function, with different settings for the LBDFs. Our results show that a properly designed LBDF can lead to statistically significantly better results than vanilla optimization. Finally, we investigated the use of LBDFs to facilitate the solution of the parameter estimation problem in Systems Biology by analyzing the landscape related to a stochastic model of enzyme kinetics.
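
    A minimal sketch of a hypersphere-bounded dilation is given below: points inside the bubble are radially remapped toward its centre, the boundary stays fixed, and everything outside is untouched. The centre, radius, and exponent are illustrative, and this isotropic version does not reproduce the per-dimension dilations mentioned above.

```python
import numpy as np

def lbdf(x: np.ndarray, centre: np.ndarray, radius: float, gamma: float = 3.0) -> np.ndarray:
    """Local Bubble Dilation: distort only inside a hypersphere.

    Inside the bubble, the radial distance d is remapped to
    radius * (d / radius) ** gamma, which leaves the boundary fixed and the
    mapping continuous; outside the bubble the point is returned unchanged.
    With gamma > 1, the pre-image of a small neighbourhood of `centre`
    grows, i.e. the promising region around the centre is dilated.
    """
    offset = x - centre
    d = np.linalg.norm(offset)
    if d >= radius or d == 0.0:
        return x.copy()
    remapped = radius * (d / radius) ** gamma
    return centre + offset * (remapped / d)

# Candidates proposed by a meta-heuristic are passed through `lbdf` before
# evaluating the fitness (e.g. the Michalewicz function); points outside the
# bubble are untouched, so the rest of the landscape is preserved.
```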

    Which random is the best random? A study on sampling methods in Fourier surrogate modeling

    Global optimization problems can be effectively solved by means of Computational Intelligence methods. However, there are several areas in which the effectiveness of these algorithms can be hampered by the computational cost of the fitness evaluations, or by specific features of the fitness landscape, which can be characterized by noise and by the presence of several (even infinite) local optima. These issues make it necessary to define specific techniques to replace the original problem with a surrogate representation. Fourier surrogate modeling represents a novel and effective approach to generate smoother, and possibly easier to explore, fitness landscapes, and to reduce the computational effort. Fourier surrogates require an initial sampling of the search space, which must be performed to calculate the Fourier transforms. In this paper we investigate the impact on the quality of the surrogate models of the hyper-parameters of the methodology, and of several methods that can be employed for the initial sampling of the fitness landscape (i.e., pseudo-random numbers, low-discrepancy sequences, a logistic map in chaotic regime, true random positions generated by a quantum computer, and point packing). Our results show that semi-structured approaches like quasi-random sequences and point packing can outperform the other sampling methods.
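
    The sketch below illustrates three of the sampling schemes compared in the paper, as they might be used to place the initial points from which a Fourier surrogate is computed: pseudo-random sampling, a Sobol low-discrepancy sequence, and a logistic map in chaotic regime. The dimensionality and sample size are illustrative; the quantum random and point-packing samplers are not reproduced here.

```python
import numpy as np
from scipy.stats import qmc

n_points, dims = 128, 2
rng = np.random.default_rng(seed=42)

# 1) Pseudo-random sampling in the unit hypercube.
pseudo_random = rng.random((n_points, dims))

# 2) Sobol low-discrepancy sequence (quasi-random, semi-structured).
sobol = qmc.Sobol(d=dims, seed=42).random(n_points)

# 3) Logistic map x_{k+1} = 4 x_k (1 - x_k) in fully chaotic regime.
logistic = np.empty((n_points, dims))
x = 0.3
for i in range(n_points):
    for j in range(dims):
        x = 4.0 * x * (1.0 - x)
        logistic[i, j] = x

# Each point set is rescaled to the search-space bounds, the fitness is
# evaluated at those points, and the Fourier transform of the samples yields
# the surrogate landscape.
```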

    A comparison of multi-objective optimization algorithms to identify drug target combinations

    Combination therapies represent one of the most effective strategies for inducing cancer cell death and reducing the risk of developing drug resistance. The identification of putative novel drug combinations, which typically requires the execution of expensive and time-consuming lab experiments, can be supported by the synergistic use of mathematical models and multi-objective optimization algorithms. The computational approach makes it possible to automatically search for potential therapeutic combinations and to test their effectiveness in silico, thus reducing time and monetary costs and driving the experiments toward the most promising therapies. In this work, we couple dynamic fuzzy modeling of cancer cells with different multi-objective optimization algorithms, and we compare their performance in identifying drug target combinations. Specifically, we perform batches of optimizations with 3 and 4 objective functions defined to achieve a desired behavior of the system (e.g., maximize apoptosis while minimizing necrosis and survival), and we compare the quality of the solutions included in the Pareto fronts. Our results show that both the choice of the multi-objective algorithm and the formulation of the optimization problem have an impact on the identified solutions, highlighting the strengths as well as the limitations of this approach.
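
    A minimal sketch of the multi-objective setting is given below: each candidate drug-target combination is scored on three objectives (apoptosis recast as a minimization, necrosis, survival) and the non-dominated candidates form the Pareto front. The evaluate_combination function is a hypothetical stand-in for a simulation of the dynamic fuzzy model, and the dominance filter is a plain non-dominated sort rather than any specific algorithm from the comparison.

```python
import numpy as np

def evaluate_combination(inhibition: np.ndarray) -> np.ndarray:
    """Toy objectives for a vector of per-target inhibition levels in [0, 1].

    Returns [-apoptosis, necrosis, survival] so that all three objectives
    are minimized; real values would come from simulating the fuzzy model.
    """
    apoptosis = inhibition.mean()
    necrosis = 0.5 * inhibition.max()
    survival = 1.0 - apoptosis
    return np.array([-apoptosis, necrosis, survival])

def pareto_front(objectives: np.ndarray) -> np.ndarray:
    """Boolean mask of non-dominated rows (one row per candidate)."""
    n = objectives.shape[0]
    non_dominated = np.ones(n, dtype=bool)
    for i in range(n):
        for j in range(n):
            dominates = (np.all(objectives[j] <= objectives[i])
                         and np.any(objectives[j] < objectives[i]))
            if i != j and dominates:
                non_dominated[i] = False
                break
    return non_dominated

candidates = np.random.default_rng(0).random((50, 4))   # 50 combinations, 4 targets
scores = np.array([evaluate_combination(c) for c in candidates])
front = candidates[pareto_front(scores)]
```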

    Large T cell clones expressing immune checkpoints increase during multiple myeloma evolution and predict treatment resistance

    Tumor recognition by T cells is essential for antitumor immunity. A comprehensive characterization of T cell diversity may be key to understanding the success of immunomodulatory drugs and the failure of PD-1 blockade in tumors such as multiple myeloma (MM). Here, we use single-cell RNA and T cell receptor sequencing to characterize bone marrow T cells from healthy adults (n = 4) and patients with precursor (n = 8) and full-blown MM (n = 10). Large T cell clones from patients with MM expressed multiple immune checkpoints, suggesting a potentially dysfunctional phenotype. Dual targeting of PD-1 + LAG3 or PD-1 + TIGIT partially restored their function in mice with MM. We identify phenotypic hallmarks of large intratumoral T cell clones, and demonstrate that the CD27− to CD27+ T cell ratio, measured by flow cytometry, may serve as a surrogate of clonal T cell expansions and an independent prognostic factor in 543 patients with MM treated with lenalidomide-based treatment combinations.