50 research outputs found

    Enhanced algorithms for lesion detection and recognition in ultrasound breast images

    Get PDF
    Mammography is the gold standard for breast cancer detection. However, it has very high false-positive rates and relies on ionizing radiation, which has led to interest in multi-modal approaches. One such modality is diagnostic ultrasound, which uses non-ionizing radiation and picks up many of the cancers that mammography generally misses. However, the presence of speckle noise in ultrasound images has a negative effect on image interpretation. Noise reduction, inconsistencies in image capture, and segmentation of lesions remain challenging open research problems in ultrasound imaging. The aim of the proposed research is to enhance the state-of-the-art computer vision algorithms used in ultrasound imaging and to investigate the role of computer-processed images in human diagnostic performance. [Continues.]
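
    As a concrete illustration of the noise-reduction problem mentioned above, the following is a minimal sketch of a Lee-style adaptive speckle filter, a common despeckling baseline for ultrasound images. It is not the algorithm proposed in the thesis; the function name and window size are illustrative assumptions, relying only on NumPy and SciPy.

    # Minimal sketch of a Lee-style adaptive speckle filter, a common baseline
    # for despeckling ultrasound images (illustrative only, not the thesis's method).
    import numpy as np
    from scipy.ndimage import uniform_filter

    def lee_filter(img: np.ndarray, window: int = 5) -> np.ndarray:
        """Smooth homogeneous regions while preserving edges and lesion boundaries."""
        img = img.astype(np.float64)
        local_mean = uniform_filter(img, size=window)
        local_sq_mean = uniform_filter(img * img, size=window)
        local_var = local_sq_mean - local_mean ** 2

        # Estimate the speckle (noise) variance as the mean of the local variances.
        noise_var = local_var.mean()

        # Weight toward the local mean in flat regions (low variance), and toward
        # the original pixel near edges (high variance).
        weight = local_var / (local_var + noise_var + 1e-12)
        return local_mean + weight * (img - local_mean)

    # Usage (hypothetical input array): denoised = lee_filter(ultrasound_frame, window=7)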

    A lightweight, graph-theoretic model of class-based similarity to support object-oriented code reuse.

    Get PDF
    The work presented in this thesis is principally concerned with the development of a method and set of tools designed to support the identification of class-based similarity in collections of object-oriented code. Attention is focused on enhancing the potential for software reuse in situations where a reuse process is either absent or informal, and the characteristics of the organisation are unsuitable, or resources unavailable, to promote and sustain a systematic approach to reuse. The approach builds on the definition of a formal, attributed, relational model that captures the inherent structure of class-based, object-oriented code. Based on code-level analysis, it relies solely on the structural characteristics of the code and the peculiarly object-oriented features of the class as an organising principle: classes, the entities comprising a class, and the intra- and inter-class relationships existing between them are significant factors in defining a two-phase similarity measure as a basis for the comparison process. Established graph-theoretic techniques are adapted and applied via this model to the problem of determining similarity between classes. This thesis illustrates a successful transfer of techniques from the domains of molecular chemistry and computer vision, both of which provide an existing template for the analysis and comparison of structures as graphs. The inspiration for representing classes as attributed relational graphs, and for applying graph-theoretic techniques and algorithms to their comparison, arose out of a well-founded intuition that a common basis in graph theory was sufficient to enable a reasonable transfer of these techniques to the problem of determining similarity in object-oriented code. The practical application of this work relates to the identification and indexing of instances of recurring, class-based, common structure present in established and evolving collections of object-oriented code. A classification so generated additionally provides a framework for class-based matching over an existing code base, both from the perspective of newly introduced classes and from search "templates" provided by the incomplete, iteratively constructed and refined classes associated with current and ongoing development. The tools and techniques developed here support enabling and improving shared awareness of reuse opportunity, based on analysing structural similarity in past and ongoing development; they can in turn be seen as part of a process of domain analysis, capable of stimulating the evolution of a systematic reuse ethic.
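
    To make the representation concrete, the following is a hedged sketch of a class modelled as a small attributed relational graph, with a crude set-overlap score standing in for the thesis's two-phase, graph-matching similarity measure. The names (ClassGraph, class_similarity), the example classes, and the weighting are illustrative assumptions, not the thesis's definitions.

    # Hedged sketch: a class as an attributed relational graph, compared by
    # set overlap of its nodes and relations (illustrative, not the thesis's measure).
    from dataclasses import dataclass, field

    @dataclass
    class ClassGraph:
        name: str
        # Nodes: the entities comprising the class (fields, methods), tagged by kind.
        nodes: set[tuple[str, str]] = field(default_factory=set)      # (kind, label)
        # Edges: intra-class relations such as "reads" or "writes".
        edges: set[tuple[str, str, str]] = field(default_factory=set)  # (src, relation, dst)

    def jaccard(a: set, b: set) -> float:
        return len(a & b) / len(a | b) if a | b else 1.0

    def class_similarity(g1: ClassGraph, g2: ClassGraph, node_weight: float = 0.5) -> float:
        """Two-part score: node-set overlap plus relation-set overlap."""
        return (node_weight * jaccard(g1.nodes, g2.nodes)
                + (1 - node_weight) * jaccard(g1.edges, g2.edges))

    stack = ClassGraph("Stack",
                       nodes={("field", "items"), ("method", "push"), ("method", "pop")},
                       edges={("push", "writes", "items"), ("pop", "reads", "items")})
    queue = ClassGraph("Queue",
                       nodes={("field", "items"), ("method", "enqueue"), ("method", "pop")},
                       edges={("enqueue", "writes", "items"), ("pop", "reads", "items")})
    print(class_similarity(stack, queue))  # two structurally similar container classes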

    Tau Neutrinos in the Next Decade: from GeV to EeV

    Get PDF

    Connected Attribute Filtering Based on Contour Smoothness

    Get PDF

    Novel Signal Reconstruction Techniques in Cyclotron Radiation Emission Spectroscopy for Neutrino Mass Measurement

    Get PDF
    The Project 8 experiment is developing Cyclotron Radiation Emission Spectroscopy (CRES) on the beta-decay spectrum of tritium for the measurement of the absolute neutrino mass scale. CRES is a frequency-based technique which aims to probe the endpoint of the tritium energy spectrum with a final target sensitivity of 0.04 eV, pushing the limits beyond the inverted mass hierarchy. In this phased-approach experiment, both the Phase I and Phase II efforts use a combination of 83mKr and molecular tritium (T_2) as source gases. The technique relies on an accurate, precise, and well-understood reconstructed beta spectrum whose endpoint and spectral shape near the endpoint may be constrained by a kinematical model which uses the neutrino mass m_beta as a free parameter. Since the decays in the last eV of the tritium spectrum encompass O(10^(-13)) of all decays, and since the precise variation of the spectrum, distorted by the presence of a massive neutrino, is fundamental to the measurement, reconstruction techniques which yield accurate measurements of the frequency (and therefore energy) of the signal and correctly separate signal from background are necessary. In this work, we discuss the open problem of the absolute neutrino mass scale, the fundamentals of measurements tailored to resolve it, the underpinning and details of the CRES technology, and the measurement of the first-ever CRES tritium β-spectrum. Finally, we focus on novel reconstruction techniques at both the signal and event levels, using machine learning algorithms that allow us to adapt our technique to the complex dynamics of the electron inside our detector. We show that such methods can separate true events from backgrounds at > 94% accuracy and improve the efficiency of reconstruction by > 23% compared to traditional reconstruction methods.
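
    As an illustration of the event-level signal/background separation step described above, the following is a hedged sketch using a generic scikit-learn classifier on synthetic track features (start frequency, slope, duration, mean SNR). The features, distributions, and classifier choice are assumptions for demonstration only and do not reproduce Project 8's pipeline or its quoted performance figures.

    # Hedged sketch: generic signal-vs-background classification on synthetic
    # CRES-like track features (not the Project 8 reconstruction pipeline).
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)
    n = 2000
    # Toy features per candidate track: start frequency offset (Hz), slope (Hz/s),
    # duration (s), mean SNR. Signal tracks get slightly different statistics.
    signal = np.column_stack([rng.normal(1.0e6, 2e5, n), rng.normal(3.5e8, 5e7, n),
                              rng.normal(1e-3, 3e-4, n), rng.normal(8, 2, n)])
    background = np.column_stack([rng.normal(1.0e6, 5e5, n), rng.normal(0, 2e8, n),
                                  rng.normal(3e-4, 3e-4, n), rng.normal(4, 2, n)])
    X = np.vstack([signal, background])
    y = np.concatenate([np.ones(n), np.zeros(n)])

    # Hold out a test split and train a simple ensemble classifier.
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    print("toy accuracy:", accuracy_score(y_te, clf.predict(X_te)))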

    Wheat Yield Assessment Using In-Field Organ-Scale Phenotyping and Deep Learning Methods

    Full text link