Abnormality Detection in Mammography using Deep Convolutional Neural Networks
Breast cancer is the most common cancer in women worldwide. The most common
screening technology is mammography. To reduce the cost and workload of
radiologists, we propose a computer-aided detection approach for classifying
and localizing calcifications and masses in mammogram images. To improve on
conventional approaches, we apply deep convolutional neural networks (CNN) for
automatic feature learning and classifier building. In computer-aided
mammography, deep CNN classifiers cannot be trained directly on full mammogram
images because of the loss of image details from resizing at input layers.
Instead, our classifiers are trained on labelled image patches and then adapted
to work on full mammogram images for localizing the abnormalities.
State-of-the-art deep convolutional neural networks are compared on their
performance in classifying the abnormalities. Experimental results indicate
that VGGNet achieves the best overall classification accuracy, at 92.53%.
For localizing abnormalities, ResNet is selected for computing class activation
maps because it is ready to be deployed without structural change or further
training. Our approach demonstrates that deep convolutional neural network
classifiers have remarkable localization capabilities even though no
supervision on the location of the abnormalities is provided.
Comment: 6 pages
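The localization step described above relies on class activation maps: for a given class, the map is the classifier-weighted sum of the final convolutional feature maps. A minimal NumPy sketch of that computation (a generic illustration of the technique, not the authors' code; all shapes and names here are hypothetical):

```python
import numpy as np

def class_activation_map(feature_maps, fc_weights):
    """Compute a class activation map.

    feature_maps: (K, H, W) array -- the K feature maps of the last conv
                  layer (e.g. of a ResNet, before global average pooling).
    fc_weights:   (K,) array -- final fully connected weights linking each
                  feature map to one target class.
    Returns an (H, W) heat map highlighting class-discriminative regions.
    """
    # weighted sum over the K feature maps
    cam = np.tensordot(fc_weights, feature_maps, axes=([0], [0]))
    # normalize to [0, 1] for overlaying on the input image
    cam -= cam.min()
    if cam.max() > 0:
        cam /= cam.max()
    return cam

# toy example with random "features" standing in for real CNN activations
rng = np.random.default_rng(0)
fmaps = rng.random((8, 7, 7))   # 8 feature maps of size 7x7
w = rng.random(8)               # hypothetical class weights
heat = class_activation_map(fmaps, w)
```

In practice the low-resolution heat map is upsampled to the full mammogram size to mark the region containing the abnormality.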
Spin gap and magnetic resonance in superconducting BaFeNiAs
We use neutron spectroscopy to determine the nature of the magnetic
excitations in superconducting BaFeNiAs ( K).
Above T_c the excitations are gapless and centered at the commensurate
antiferromagnetic wave vector of the parent compound, while the intensity
exhibits a sinusoidal modulation along the c-axis. As the superconducting state
is entered, a spin gap gradually opens, whose magnitude tracks the temperature
dependence of the superconducting gap observed by angle-resolved
photoemission. Both the spin gap and magnetic resonance energies are
temperature and wave vector dependent, but their ratio is the same
within uncertainties. These results suggest that the spin resonance is a
singlet-triplet excitation related to electron pairing and superconductivity.
Comment: 4 pages, 4 figures
Bending invariant meshes and application to groupwise correspondences
We introduce a new bending invariant representation of a triangular mesh S. The bending invariant mesh X of S is a deformation of S that has the property that the geodesic distance between each pair of vertices on S is approximated well by the Euclidean distance between the corresponding vertices on X. Furthermore, X is intersection-free. The main advantage of the bending invariant mesh compared to previous approaches is that mesh-based features on X can be used to facilitate applications such as shape recognition or shape registration. We apply bending invariant meshes to find dense point-to-point correspondences between a number of deformed surfaces corresponding to different postures of the same non-rigid object in a fully automatic way.
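The defining property above, Euclidean distances between embedded vertices approximating geodesic distances on S, is what classical multidimensional scaling (MDS) computes from a geodesic distance matrix. A minimal NumPy sketch of that embedding step (an illustration of the general technique only; the authors' method additionally guarantees an intersection-free triangular mesh):

```python
import numpy as np

def classical_mds(D, dim=3):
    """Embed n points so that Euclidean distances approximate the given
    n x n distance matrix D, via classical multidimensional scaling."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    B = -0.5 * J @ (D ** 2) @ J           # double-centered Gram matrix
    eigvals, eigvecs = np.linalg.eigh(B)  # eigh returns ascending order
    idx = np.argsort(eigvals)[::-1][:dim] # keep the dim largest eigenvalues
    L = np.sqrt(np.maximum(eigvals[idx], 0))
    return eigvecs[:, idx] * L            # n x dim coordinates

# toy check: distances among three points in the plane are reproduced exactly
pts = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 4.0]])
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
X = classical_mds(D, dim=2)
```

For a bending invariant representation, D would hold pairwise geodesic distances measured on the mesh surface rather than Euclidean ones.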
Spinal nerve segmentation method and dataset construction in endoscopic surgical scenarios
Endoscopic surgery is currently an important treatment method in the field of
spinal surgery and avoiding damage to the spinal nerves through video guidance
is a key challenge. This paper presents the first real-time segmentation method
for spinal nerves in endoscopic surgery, which provides crucial navigational
information for surgeons. A finely annotated segmentation dataset of
approximately 10,000 consecutive frames recorded during surgery is constructed
for the first time in this field, addressing the problem of semantic
segmentation. Based on this dataset, we propose FUnet (Frame-Unet), which
achieves state-of-the-art performance by utilizing inter-frame information and
self-attention mechanisms. We also conduct extended experiments on a similar
polyp endoscopy video dataset and show that the model has good generalization
ability with advantageous performance. The dataset and code of this work are
available at https://github.com/zzzzzzpc/FUnet
Comment: Accepted by MICCAI 202
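The abstract's use of inter-frame information via self-attention can be illustrated generically as scaled dot-product attention over per-frame feature vectors, letting each frame aggregate context from its neighbors (a sketch of the general mechanism, not FUnet's actual architecture; all names and shapes are hypothetical):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))  # stable softmax
    return e / e.sum(axis=axis, keepdims=True)

def temporal_self_attention(frames, Wq, Wk, Wv):
    """Scaled dot-product self-attention across T frame features.

    frames: (T, d) per-frame feature vectors from a video clip.
    Wq, Wk, Wv: (d, d) learned projection matrices.
    Returns (T, d) features where each frame mixes in inter-frame context.
    """
    Q, K, V = frames @ Wq, frames @ Wk, frames @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])  # (T, T) frame-to-frame weights
    return softmax(scores) @ V

# toy example: 5 frames with 4-dimensional features, random projections
T, d = 5, 4
rng = np.random.default_rng(1)
frames = rng.standard_normal((T, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
out = temporal_self_attention(frames, Wq, Wk, Wv)
```

In a segmentation network such a module would typically operate on spatial feature maps per frame rather than flat vectors, but the attention computation is the same.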