6,822 research outputs found

    Inferior Alveolar Canal Automatic Detection with Deep Learning CNNs on CBCTs: Development of a Novel Model and Release of Open-Source Dataset and Algorithm

    Get PDF
    Featured Application Convolutional neural networks can accurately identify the Inferior Alveolar Canal, rapidly generating precise 3D data. The datasets and source code used in this paper are publicly available, allowing the reproducibility of the experiments performed. Introduction: The need for accurate three-dimensional data of anatomical structures is increasing in the surgical field. The development of convolutional neural networks (CNNs) has been helping to fill this gap by providing efficient tools to clinicians. Nonetheless, the lack of fully accessible datasets and open-source algorithms is slowing progress in this field. In this paper, we focus on the fully automatic segmentation of the Inferior Alveolar Canal (IAC), which is of immense interest in dental and maxillofacial surgery. Conventionally, only a bidimensional annotation of the IAC is used in common clinical practice. A reliable convolutional neural network (CNN) could save time in daily practice and improve the quality of assistance. Materials and methods: Cone Beam Computed Tomography (CBCT) volumes obtained from a single radiological center using the same machine were gathered and annotated. The course of the IAC was annotated on the CBCT volumes. A secondary dataset with sparse annotations and a primary dataset with both dense and sparse annotations were generated. Three separate experiments were conducted to evaluate the CNN. The IoU and Dice scores of every experiment were recorded as the primary endpoint, while the time needed to produce the annotation was assessed as the secondary endpoint. Results: A total of 347 CBCT volumes were collected, then divided into primary and secondary datasets. Among the three experiments, an IoU score of 0.64 and a Dice score of 0.79 were obtained by pre-training the CNN on the secondary dataset, applying a novel deep label propagation model, and then training on the primary dataset. To the best of our knowledge, these results are the best ever published for the segmentation of the IAC. The datasets are publicly available, and the algorithm is published as open-source software. On average, the CNN could produce a 3D annotation of the IAC in 6.33 s, compared to the 87.3 s needed by a radiology technician to produce a bidimensional annotation. Conclusions: In summary, the following achievements have been reached. A new state of the art in terms of Dice score was achieved, overcoming the threshold of 0.75 commonly considered necessary for use in clinical practice. The CNN could fully automatically produce an accurate three-dimensional segmentation of the IAC rapidly, compared to the bidimensional annotations commonly used in clinical practice and generated in a time-consuming manner. We introduced an innovative deep label propagation method to optimize the performance of the CNN in the segmentation of the IAC. For the first time in this field, the datasets and the source code used were publicly released, granting reproducibility of the experiments and helping to improve IAC segmentation
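
    For readers re-deriving the endpoints, Dice and IoU on binary masks are related by IoU = Dice / (2 - Dice). A minimal NumPy sketch (our own, not the released code):

```python
import numpy as np

def dice_and_iou(pred: np.ndarray, target: np.ndarray):
    """Dice and IoU (Jaccard) scores for binary segmentation masks."""
    pred, target = pred.astype(bool), target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    iou = intersection / union if union else 1.0
    denom = pred.sum() + target.sum()
    dice = 2 * intersection / denom if denom else 1.0
    return dice, iou

# Sanity check of the reported numbers: a Dice of 0.79 maps to an IoU of
# 0.79 / (2 - 0.79) ~ 0.65; averaging scores over volumes breaks the exact
# identity, which is consistent with the reported IoU of 0.64.
```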

    Computational Geometry Contributions Applied to Additive Manufacturing

    Get PDF
    This Doctoral Thesis develops novel articulations of Computational Geometry for applications in Additive Manufacturing, as follows: (1) Shape Optimization in Lattice Structures. Implementation and sensitivity analysis of the SIMP (Solid Isotropic Material with Penalization) topology optimization strategy. Implementation of a method to transform density maps, resulting from topology optimization, into surface lattice structures. Procedure to integrate material homogenization and Design of Experiments (DOE) to estimate the stress/strain response of large surface lattice domains. (2) Simulation of Laser Metal Deposition. Finite Element Method implementation of a 2D nonlinear thermal model of the Laser Metal Deposition (LMD) process considering temperature-dependent material properties, phase change and radiation. Finite Element Method implementation of a 2D linear transient thermal model for a metal substrate that is heated by the action of a laser. (3) Process Planning for Laser Metal Deposition. Implementation of a 2.5D path planning method for Laser Metal Deposition. Conceptualization of a workflow for the synthesis of the Reeb Graph of a solid region in ℝ³ denoted by its Boundary Representation (B-Rep). Implementation of a voxel-based geometric simulator for the LMD process. Conceptualization, implementation, and validation of a tool for the minimization of material over-deposition at corners in LMD. Implementation of a 3D (non-planar) slicing and path planning method for the LMD manufacturing of overhanging features in revolute workpieces. The aforementioned contributions have been screened by the international scientific community via Journal and Conference submissions and publications
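
    For context on item (1), the standard SIMP strategy interpolates element stiffness as a penalized power law of a fictitious density field. A minimal sketch of the textbook interpolation and its compliance sensitivity (our own illustration, not the thesis implementation):

```python
import numpy as np

def simp_modulus(rho: np.ndarray, E0: float = 1.0,
                 Emin: float = 1e-9, p: float = 3.0) -> np.ndarray:
    """SIMP interpolation of Young's modulus: intermediate densities are
    penalized (p > 1) so the optimizer is pushed toward a 0/1 layout."""
    return Emin + rho**p * (E0 - Emin)

def simp_compliance_sensitivity(rho: np.ndarray, ue_ke_ue: np.ndarray,
                                E0: float = 1.0, Emin: float = 1e-9,
                                p: float = 3.0) -> np.ndarray:
    """Derivative of compliance w.r.t. element density, where ue_ke_ue is
    the per-element strain-energy term u_e^T k_e u_e from the FEM solve."""
    return -p * rho**(p - 1) * (E0 - Emin) * ue_ke_ue
```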

    Modelling, Monitoring, Control and Optimization for Complex Industrial Processes

    Get PDF
    This reprint includes 22 research papers and an editorial, collected from the Special Issue "Modelling, Monitoring, Control and Optimization for Complex Industrial Processes", highlighting recent research advances and emerging research directions in complex industrial processes. It aims to promote the research field and benefit readers from both academic communities and industrial sectors

    Exploring QCD matter in extreme conditions with Machine Learning

    Full text link
    In recent years, machine learning has emerged as a powerful computational tool and novel problem-solving perspective for physics, offering new avenues for studying the properties of strongly interacting QCD matter under extreme conditions. This review article aims to provide an overview of the current state of this intersection of fields, focusing on the application of machine learning to theoretical studies in high energy nuclear physics. It covers diverse aspects, including heavy ion collisions, lattice field theory, and neutron stars, and discusses how machine learning can be used to explore and facilitate the physics goals of understanding QCD matter. The review also provides a commonality overview from a methodological standpoint, ranging from the data-driven perspective to the physics-driven perspective. We conclude by discussing the challenges and future prospects of machine learning applications in high energy nuclear physics, underscoring the importance of incorporating physics priors into the purely data-driven learning toolbox. This review highlights the critical role of machine learning as a valuable computational paradigm for advancing physics exploration in high energy nuclear physics.
    Comment: 146 pages, 53 figures
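
    One common way to incorporate physics priors into an otherwise data-driven model, as the review's conclusion advocates, is to penalise the residual of a known equation at unlabeled collocation points (the physics-informed network recipe). A generic sketch, not tied to any specific study in the review (`residual_fn` is a placeholder):

```python
import torch

def physics_informed_loss(model, x_data, y_data, x_colloc, residual_fn,
                          lam: float = 1.0) -> torch.Tensor:
    """Data-fit loss plus a penalty on the residual of a known physical
    equation, evaluated at unlabeled collocation points."""
    data_loss = torch.mean((model(x_data) - y_data) ** 2)
    physics_loss = torch.mean(residual_fn(model, x_colloc) ** 2)
    return data_loss + lam * physics_loss
```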

    Affinity-Based Reinforcement Learning: A New Paradigm for Agent Interpretability

    Get PDF
    The steady increase in complexity of reinforcement learning (RL) algorithms is accompanied by a corresponding increase in opacity that obfuscates insights into their devised strategies. Methods in explainable artificial intelligence seek to mitigate this opacity by either creating transparent algorithms or extracting explanations post hoc. A third category exists that allows the developer to affect what agents learn: constrained RL has been used in safety-critical applications and prohibits agents from visiting certain states; preference-based RL agents have been used in robotics applications and learn state-action preferences instead of traditional reward functions. We propose a new affinity-based RL paradigm in which agents learn strategies that are partially decoupled from reward functions. Unlike entropy regularisation, we regularise the objective function with a distinct action distribution that represents a desired behaviour; we encourage the agent to act according to a prior while learning to maximise rewards. The result is an inherently interpretable agent that solves problems with an intrinsic affinity for certain actions. We demonstrate the utility of our method in a financial application: we learn continuous time-variant compositions of prototypical policies, each interpretable by its action affinities, that are globally interpretable according to customers’ financial personalities. Our method combines advantages from both constrained RL and preference-based RL: it retains the reward function but generalises the policy to match a defined behaviour, thus avoiding problems such as reward shaping and hacking. Unlike Boolean task composition, our method is a fuzzy superposition of different prototypical strategies to arrive at a more complex, yet interpretable, strategy
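
    The abstract does not spell out the exact regulariser, but one plausible minimal reading of "regularise the objective function with a distinct action distribution" is a KL penalty toward a fixed action prior in a policy-gradient objective. A sketch under that assumption (all names ours, not the paper's code):

```python
import torch
import torch.nn.functional as F

def affinity_regularised_loss(policy_logits: torch.Tensor,
                              actions: torch.Tensor,
                              advantages: torch.Tensor,
                              prior_probs: torch.Tensor,
                              lam: float = 0.1) -> torch.Tensor:
    """Policy-gradient surrogate regularised toward a fixed action prior
    pi_0 (the 'affinity') instead of an entropy bonus."""
    log_pi = F.log_softmax(policy_logits, dim=-1)            # (T, n_actions)
    taken = log_pi.gather(1, actions.unsqueeze(1)).squeeze(1)
    pg_loss = -(taken * advantages).mean()
    # KL(pi || pi_0): penalise deviation of the learned policy from the prior.
    kl = (log_pi.exp() * (log_pi - prior_probs.clamp_min(1e-8).log())).sum(-1)
    return pg_loss + lam * kl.mean()
```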

    Ambiguous Medical Image Segmentation using Diffusion Models

    Full text link
    Collective insights from a group of experts have always proven to outperform an individual's best diagnostic for clinical tasks. For the task of medical image segmentation, existing research on AI-based alternatives focuses more on developing models that can imitate the best individual rather than harnessing the power of expert groups. In this paper, we introduce a single diffusion model-based approach that produces multiple plausible outputs by learning a distribution over group insights. Our proposed model generates a distribution of segmentation masks by leveraging the inherent stochastic sampling process of diffusion using only minimal additional learning. We demonstrate on three different medical image modalities (CT, ultrasound, and MRI) that our model is capable of producing several possible variants while capturing the frequencies of their occurrences. Comprehensive results show that our proposed approach outperforms existing state-of-the-art ambiguous segmentation networks in terms of accuracy while preserving naturally occurring variation. We also propose a new metric to evaluate both the diversity and the accuracy of segmentation predictions, aligning with the clinical interest in collective insights
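
    The sampling idea can be sketched as follows: because reverse diffusion is stochastic, re-running it from independent noise seeds yields distinct plausible masks whose empirical frequencies approximate the learned distribution. A minimal sketch with a hypothetical `diffusion_model.sample` interface (not the paper's API):

```python
import torch

@torch.no_grad()
def sample_mask_distribution(diffusion_model, image: torch.Tensor,
                             n_samples: int = 16):
    """Draw several plausible masks for one image by re-running the
    stochastic reverse-diffusion process from independent noise seeds."""
    masks = torch.stack([diffusion_model.sample(image)   # hypothetical API
                         for _ in range(n_samples)])     # (n, H, W)
    pixel_freq = masks.float().mean(dim=0)  # empirical per-pixel frequency
    return masks, pixel_freq
```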

    Improving sensor placement optimisation robustness to environmental variations and sensor failures for structural health monitoring systems

    Get PDF
    The installation of structural health monitoring (SHM) systems based on machine learning algorithms on structures has been of constant interest. This kind of SHM system can facilitate decisions regarding maintenance and the remaining useful life of a structure in a more automatic and convenient way. As the part of the SHM system that collects information, the sensor system can be optimally designed to improve the performance of the final system. This thesis focuses on how to account for the effects of the environment and of sensor failures during sensor placement optimisation (SPO), in order to build a more robust and effective monitoring system. Since the availability of data during the design phase varies widely from project to project, and there are no studies or specifications that provide specific guidance, little research has been done on the design of such sensor systems, which require reliable simulated or measured data to be available during the design phase. Considering the different levels of data accessibility at the design phase, this thesis proposes a series of strategies for the optimal design of sensor systems for SHM from a machine-learning perspective.
    The first main contribution of this thesis is a set of hierarchical assessment criteria for designed-system performance, balancing computational feasibility against visualisation of the final system performance. At the stage after data are collected, machine learning model results are often used as a criterion, but obtaining them is usually time-consuming and requires higher data accuracy. Therefore, the criteria used in the design of sensor systems are divided into tiers. The criteria for the initial stage can be abstracted from the purpose of the applied machine learning model to significantly reduce the number of candidate designs; the criteria for the final stage can be similar to those used after data are collected. Whether to use criteria from all tiers depends on the level of data availability. More work on the optimisation design of the sensor system can be done at the initial stage of this hierarchical design framework, so the other three main contributions of the thesis are developed at this stage.
    Considering different levels of data availability, supervised and unsupervised correlation-based strategies for evaluating sensor combinations are proposed, including the evaluation criterion and fast methods for calculating it; sensor combinations can be ranked even if only healthy-state data are accessible. To account for the effects of environmental variations, two SPO strategies based on extracting robust features are proposed, together with an appropriate criterion; these strategies cover both the situation where information on environmental change is available and the situation where it is not. To consider the effect of sensor failures in the SPO process, another two strategies, namely fail-safe sensor optimisation and fail-safe optimisation with redundancy, are proposed in this thesis, both of which account for the performance of the designed system before and after the failure of critical sensors. Different assessment criteria are adopted to demonstrate the generality of these strategies
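
    The fail-safe idea in the final contribution can be illustrated by a greedy search that scores a candidate layout by its worst-case performance under any single sensor failure. A schematic sketch under our own simplified assumptions (`score_fn` stands in for the thesis's correlation-based criterion):

```python
def greedy_failsafe_placement(score_fn, candidates, n_sensors):
    """Greedy sensor placement that scores each candidate layout by its
    worst-case performance after any single sensor failure.

    score_fn(subset) -> float is a placeholder for a performance
    criterion such as a correlation-based separability measure.
    """
    chosen = []
    for _ in range(n_sensors):
        best, best_score = None, float("-inf")
        for c in candidates:
            if c in chosen:
                continue
            trial = chosen + [c]
            # Fail-safe criterion: worst score over all single-sensor losses.
            if len(trial) > 1:
                score = min(score_fn([s for s in trial if s != f])
                            for f in trial)
            else:
                score = score_fn(trial)
            if score > best_score:
                best, best_score = c, score
        chosen.append(best)
    return chosen
```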

    Enhancing Measurements of the CMB Blackbody Temperature Power Spectrum by Removing CIB and Thermal Sunyaev-Zel'dovich Contamination Using External Galaxy Catalogs

    Full text link
    Extracting the CMB blackbody temperature power spectrum -- which is dominated by the primary CMB signal and the kinematic Sunyaev-Zel'dovich (kSZ) effect -- from mm-wave sky maps requires cleaning other sky components. In this work, we develop new methods to use large-scale structure (LSS) tracers to remove cosmic infrared background (CIB) and thermal Sunyaev-Zel'dovich (tSZ) contamination in such measurements. Our methods rely on the fact that LSS tracers are correlated with the CIB and tSZ signals, but their two-point correlations with the CMB and kSZ signals vanish on small scales, thus leaving the CMB blackbody power spectrum unbiased after cleaning. We develop methods analogous to delensing (de-CIB or de-(CIB+tSZ)) to clean CIB and tSZ contaminants using these tracers. We compare these methods to internal linear combination (ILC) methods, including novel approaches that incorporate the tracer maps in the ILC procedure itself, without requiring exact assumptions about the CIB SED. As a concrete example, we use the unWISE galaxy samples as tracers. We provide calculations for a combined Simons Observatory and Planck-like experiment, with our simulated sky model comprising eight frequencies from 93 to 353 GHz. Using unWISE tracers, improvements with our methods over current approaches are already non-negligible: we find improvements of up to 20% in the kSZ power spectrum signal-to-noise ratio (SNR) when applying the de-CIB method to a tSZ-deprojected ILC map. These gains could be more significant when using additional LSS tracers from current surveys, and will become even larger with future LSS surveys, with improvements in the kSZ power spectrum SNR of up to 50%. For the total CMB blackbody power spectrum, these improvements stand at 4% and 7%, respectively. Our code is publicly available at https://github.com/olakusiak/deCIBing.
    Comment: 35+21 pages, 20+11 figures; code is available at https://github.com/olakusiak/deCIBing
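
    For context, the ILC methods referenced here build minimum-variance weights from the frequency-frequency covariance, optionally deprojecting a contaminant SED such as tSZ. A standard textbook sketch (not the paper's pipeline):

```python
import numpy as np

def ilc_weights(cov: np.ndarray, a: np.ndarray) -> np.ndarray:
    """Standard ILC: minimum-variance weights with unit response to the
    component with SED `a` (for the CMB blackbody, a = ones(n_freq))."""
    cinv_a = np.linalg.solve(cov, a)
    return cinv_a / (a @ cinv_a)

def constrained_ilc_weights(cov: np.ndarray, a: np.ndarray,
                            b: np.ndarray) -> np.ndarray:
    """Constrained ILC: unit response to SED `a` while nulling
    (deprojecting) a contaminant with SED `b`, e.g. the tSZ spectrum."""
    A = np.column_stack([a, b])          # (n_freq, 2)
    cinv_A = np.linalg.solve(cov, A)     # C^{-1} A
    M = A.T @ cinv_A                     # (2, 2)
    e = np.array([1.0, 0.0])             # respond to a, null b
    return cinv_A @ np.linalg.solve(M, e)
```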

    Developing novel measures and treatments for gambling disorder

    Get PDF
    Background: While gambling is an activity that seems to have entertained humanity for millennia, it is less clear, from a research and clinical perspective, why problematic gambling behavior may persist despite obvious negative consequences. With the introduction of the 5th edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM–5), gambling was equated with alcohol and drug use and labeled an addictive disorder, Gambling Disorder (GD). Problem gambling is associated with destroyed careers, broken marriages, financial ruin, and psychiatric comorbidities. Still, gambling research can be described as a field in its infancy, with a need for further research on measurement and treatment procedures. Aims: The overall aim of the thesis was to develop and evaluate measures and treatments for Gambling Disorder.
    • The aims of Study I were to reach a consensus regarding a specific set of potential new measurement items, to yield a testable draft version of a new gambling measure, and to establish preliminary construct and face validity for this novel gambling measure, the Gambling Disorder Identification Test (GDIT).
    • The aim of Study II was to evaluate psychometric properties (e.g., internal consistency and test-retest reliability, factor structure, convergent and discriminant validity, as well as diagnostic accuracy) of the GDIT among treatment- and support-seeking samples (n = 79 and n = 185), self-help groups (n = 47), and a population sample (n = 292).
    • The aim of Study III was to formulate hypotheses on the maintenance of GD by identifying clinically relevant behaviors at an individual level among six treatment-seeking participants with GD. This qualitative study was conducted as a preparatory step to develop the iCBTG (see Study IV).
    • The aim of Study IV was to evaluate the acceptability and clinical effectiveness of the newly developed iCBTG among treatment-seeking patients with GD (n = 23) in routine care. A further aim was to evaluate the research feasibility of using existing healthcare infrastructure to deliver the iCBTG program.
    Methods: In Study I, gambling experts from ten countries rated 30 items proposed for inclusion in the GDIT in a two-round Delphi (n = 61; n = 30). Three subsequent consensus meetings including gambling researchers and clinicians (n = 10; n = 4; n = 3) were held to solve item-related issues and establish a GDIT draft version. To evaluate face validity, the GDIT draft version was presented to individuals with experience of problem gambling (n = 12) and to treatment-seeking participants with Gambling Disorder (n = 8). In Study II, the psychometric properties of the GDIT were evaluated among gamblers (N = 603) recruited from treatment- and support-seeking contexts (n = 79; n = 185), self-help groups (n = 47), and a population sample (n = 292). The participants completed self-report measures, a GDIT retest (n = 499) and a diagnostic semi-structured interview assessing GD (n = 203). In Study III, treatment-seeking patients with GD and various additional psychiatric symptom profiles (n = 6) were interviewed using an in-depth semi-structured functional interview. Participants also completed self-report measures assessing gambling behavior. A qualitative thematic analysis was performed using functional analysis as a theoretical framework. Following completion of Study III, the results were synthesized with existing experimental evidence on gambling behavior and used to develop the novel treatment model and internet-delivered treatment evaluated in Study IV, i.e., the iCBTG. In Study IV, a non-randomized preliminary evaluation of the novel iCBTG was conducted in parallel with implementation into routine addiction care, through the Support and Treatment platform (Stöd och behandlingsplattformen; ST platform). Feasibility was evaluated among a sample of treatment-seeking patients (N = 23), in terms of iCBTG adherence, acceptability and clinical effectiveness, and the feasibility of using existing healthcare infrastructure for clinical delivery as well as research purposes. Results: Study I established preliminary face validity for the GDIT, as well as construct validity in relation to a 2006 researcher agreement on measuring problem gambling, known as the Banff consensus. Study II showed excellent internal consistency reliability (α = .94) and test-retest reliability (6-16 days, intraclass correlation coefficient = 0.93) for the GDIT. Confirmatory factor analysis yielded factor loadings supporting the three proposed GDIT domains of gambling behavior, gambling symptoms, and negative consequences. Receiver operating characteristic (ROC) curves and clinical significance estimates were used to establish GDIT cut-off scores for recreational gambling (<15), problem gambling (15-19), and GD (any ≥20; mild 20-24; moderate 25-29; and severe ≥30). Study III yielded several functional categories for gambling behavior, as well as four main processes potentially important for treatment: access to money, anticipation, selective attention (focus), and chasing behaviors. Study IV showed that patient engagement in the iCBTG modules was comparable to previous internet-delivered cognitive behavioral treatment trials in the general population. The iCBTG was rated satisfactory in treatment credibility, expectancy, and satisfaction. Mixed effects modeling revealed a significant decrease in gambling symptoms during treatment (within-group effect size d = 1.05 at follow-up), which correlated with changes in loss of control (in the expected direction of increased control). However, measurement issues related to the ST platform were also identified, which led to significant attrition in several measures. Conclusions: The GDIT is a reliable and valid measure for assessing GD and problem gambling. In addition, the GDIT demonstrates high content validity in relation to the Banff consensus. The iCBTG was developed to provide a theoretically grounded and meaningful treatment model for GD. Preliminary estimates support its acceptability and clinical effectiveness in “real world” settings, but further randomized controlled studies are warranted to establish treatment efficacy
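
    The abstract reports ROC-derived cut-offs; one standard way such diagnostic thresholds are chosen is Youden's J, sketched below (we do not know the exact criterion used in Study II):

```python
import numpy as np

def youden_cutoff(scores: np.ndarray, has_gd: np.ndarray):
    """Choose the threshold maximising sensitivity + specificity - 1
    (Youden's J). `scores` are total questionnaire scores; `has_gd` is a
    boolean array of diagnoses from the semi-structured interview."""
    best_t, best_j = None, -1.0
    for t in np.unique(scores):
        pred = scores >= t
        sens = (pred & has_gd).sum() / max(has_gd.sum(), 1)
        spec = (~pred & ~has_gd).sum() / max((~has_gd).sum(), 1)
        j = sens + spec - 1
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j
```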

    Malaria in travellers and migrants

    Get PDF
    Malaria is a potentially fatal disease that caused approximately 241 million cases and 627 000 deaths in 2020, most of them in children in Sub-Saharan Africa. In non-endemic countries, malaria is imported by travellers and migrants, and timely management and treatment are crucial. Since mild episodes can progress to severe malaria with vital organ failure, it is important to identify patients at high risk. Severe malaria is defined by the World Health Organization (WHO), and intravenous treatment is recommended for patients fulfilling any of the criteria for severity. It is, however, not clear whether these criteria and recommendations are optimal in non-endemic countries. Moreover, management of malaria in migrants may also need to consider persistent low-density infections with no or discrete symptoms that may have negative health effects. The aim of this thesis was to contribute to improved identification, treatment, and management of malaria in travellers and migrants. In Study I, we describe the epidemiology and severity of imported malaria in travellers and migrants in a nationwide study in Sweden (n = 2653). In P. falciparum, young and older age, patient origin in a non-endemic country, health care delay, pregnancy and HIV infection were identified as risk factors for severe disease. Oral treatment of P. falciparum episodes with parasitemia >2% increased the risk of progression to severe malaria. In P. vivax, a high proportion of severe malaria was seen in newly arrived migrants. In Study II, we studied relapses of P. vivax and P. ovale after treatment in a non-endemic setting. The risk of relapse was substantially higher for P. vivax than for P. ovale. Primaquine significantly reduced the risk of relapse in P. vivax; in P. ovale, relapses were rare and the effect of primaquine was less evident. In Study III, we assessed how well the WHO criteria for severe malaria reflect disease severity, in terms of death or need of prolonged intensive care, in adults in a non-endemic setting. Overall, the WHO criteria had high sensitivity and specificity for the unfavourable outcome, also when compared to other scoring systems. The predictive ability was improved when using only three criteria: cerebral impairment (GCS ≤14 or multiple convulsions), ≥2.5% P. falciparum parasitemia, or respiratory distress (respiratory rate >30/min, or acidotic breathing). In Study IV, the prevalence of malaria parasites was assessed in migrants from Sub-Saharan Africa residing in Sweden. The overall asymptomatic parasite prevalence by PCR was 8%. However, the prevalence was higher in migrants arriving from Uganda, especially in children, of whom over 30% were parasite positive by PCR; infections were often detected in family members as well. The longest duration of residency in Sweden at sampling among PCR-positive individuals was 386 days for P. falciparum. In conclusion, severe malaria occurred in all species; patients at high risk of a severe outcome can be identified with new, simple criteria and should be recommended intravenous treatment. Screening for malaria parasites should be considered as part of the health screening offered to migrants
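
    The simplified three-criterion rule from Study III can be written directly as a predicate; the sketch below restates the criteria given in the abstract and is illustrative only, not a clinical tool:

```python
def meets_simplified_severity_criteria(gcs: int, multiple_convulsions: bool,
                                       parasitemia_pct: float,
                                       resp_rate: int,
                                       acidotic_breathing: bool) -> bool:
    """Simplified three-criterion rule from Study III: cerebral impairment,
    high P. falciparum parasitemia, or respiratory distress flags risk of
    an unfavourable outcome (death or prolonged intensive care)."""
    cerebral = gcs <= 14 or multiple_convulsions
    high_parasitemia = parasitemia_pct >= 2.5
    respiratory = resp_rate > 30 or acidotic_breathing
    return cerebral or high_parasitemia or respiratory
```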