An automatic system to discriminate malignant from benign massive lesions in mammograms
Evaluating the degree of malignancy of a massive lesion on the basis of the
mere visual analysis of the mammogram is a non-trivial task. We developed a
semi-automated system for massive-lesion characterization with the aim of
supporting radiological diagnosis. A dataset of 226 masses has been used in
the present analysis. The system performance has been evaluated in terms of
the area under the ROC curve, obtaining A_z = 0.80 ± 0.04.
Comment: 4 pages, 2 figures; Proceedings of the Frontier Science 2005, 4th
International Conference on Frontier Science, 12-17 September 2005, Milano, Italy
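A minimal, hypothetical sketch of the kind of evaluation quoted above: scoring a benign/malignant classifier by the area under the ROC curve with a bootstrap uncertainty. The labels and scores below are synthetic placeholders, and the original study may well have used a binormal fit rather than the empirical AUC.

```python
# Illustrative only: estimate A_z = area under the ROC curve, with a bootstrap
# error bar, for a benign/malignant mass classifier. Data are synthetic.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=226)                   # 0 = benign, 1 = malignant
scores = labels * 0.6 + rng.normal(0.0, 0.4, size=226)  # stand-in classifier output

auc = roc_auc_score(labels, scores)

# Bootstrap resampling to attach an uncertainty, as in "A_z = 0.80 ± 0.04".
boot = []
for _ in range(1000):
    idx = rng.integers(0, len(labels), size=len(labels))
    if len(np.unique(labels[idx])) == 2:                # need both classes present
        boot.append(roc_auc_score(labels[idx], scores[idx]))
print(f"A_z = {auc:.2f} ± {np.std(boot):.2f}")
```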
Substructure recovery by 3D Discrete Wavelet Transforms
We present and discuss a method to identify substructures in combined
angular-redshift samples of galaxies within clusters. The method relies on the
use of Discrete Wavelet Transform (hereafter DWT) and has already been applied
to the analysis of the Coma cluster (Gambera et al. 1997). The main new
ingredient of our method with respect to previous studies is that we use a 3D
data set rather than a 2D one. We test the method on mock cluster catalogs
with spatially localized substructures and on an N-body
simulation. Our main conclusion is that our method is able to identify the
existing substructures provided that: a) the subclumps are detached in part or
all of the phase space, b) one has a statistically significant number of
redshifts, increasing as the distance decreases due to redshift distortions; c)
one knows {\it a priori} the scale on which substructures are to be expected.
We have found that an accurate recovery requires both a significant number of
galaxies (about 800 at z ~ 0.4, the required number depending on the cluster
redshift) and a limiting magnitude deep enough for completeness.
The only true limitation to our method seems to be the necessity of knowing
{\it a priori} the scale on which the substructure is to be found. This is an
intrinsic drawback of the method and no improvement in numerical codes based on
this technique could make up for it.
Comment: Accepted for publication in MNRAS. 7 pages, 2 figures
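To make the core operation concrete, here is a hedged sketch using PyWavelets (an assumption; the authors' implementation is not specified): a single-level 3D discrete wavelet transform of a galaxy number-density grid, with outlying detail coefficients flagged as substructure candidates. Fixing the decomposition level mirrors the a-priori choice of scale discussed above; the grid size, wavelet family, and threshold are illustrative.

```python
# Minimal sketch, not the authors' pipeline: single-level 3D DWT of a
# (angle, angle, redshift) number-density grid built from mock galaxy positions.
import numpy as np
import pywt

rng = np.random.default_rng(1)
positions = rng.random((5000, 3))                       # mock 3D positions in [0, 1)
density, _ = np.histogramdd(positions, bins=(32, 32, 32))

coeffs = pywt.dwtn(density, wavelet='haar')             # keys 'aaa', 'aad', ..., 'ddd'
detail = coeffs['ddd']                                  # finest-scale detail coefficients

# Flag cells whose detail coefficients stand out from the noise at this scale.
threshold = detail.mean() + 3.0 * detail.std()
candidates = np.argwhere(np.abs(detail) > threshold)
print(f"{len(candidates)} candidate substructure cells at this scale")
```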
Computer-aided detection of pulmonary nodules in low-dose CT
A computer-aided detection (CAD) system for the identification of pulmonary
nodules in low-dose multi-detector helical CT images with 1.25 mm slice
thickness is being developed in the framework of the INFN-supported MAGIC-5
Italian project. The basic modules of our lung-CAD system, a dot enhancement
filter for nodule-candidate selection and a voxel-based neural classifier for
the reduction of false-positive findings, are described. Preliminary results
obtained on the database of lung CT scans collected so far are discussed.
Comment: 3 pages, 4 figures; Proceedings of the CompIMAGE - International
Symposium on Computational Modelling of Objects Represented in Images:
Fundamentals, Methods and Applications, 20-21 Oct. 2006, Coimbra, Portugal
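The sketch below illustrates the candidate-selection step in spirit only: a scale-normalised Laplacian-of-Gaussian blob response stands in for the paper's dot-enhancement filter, followed by local-maximum picking. The volume, scales, and threshold are invented for illustration.

```python
# Hedged stand-in for a dot-enhancement filter: multi-scale blob response
# (negative Laplacian of Gaussian) plus local-maximum picking on a CT sub-volume.
import numpy as np
from scipy import ndimage

ct_volume = np.random.rand(64, 128, 128)                # placeholder volume (z, y, x)

response = np.zeros_like(ct_volume)
for sigma in (1.0, 2.0, 3.0):                           # scales ~ expected nodule radii (voxels)
    log = -ndimage.gaussian_laplace(ct_volume, sigma=sigma) * sigma**2
    response = np.maximum(response, log)                # keep strongest response per voxel

# Candidates: local maxima of the response above a global percentile threshold.
local_max = response == ndimage.maximum_filter(response, size=5)
candidates = np.argwhere(local_max & (response > np.percentile(response, 99.9)))
print(f"{len(candidates)} nodule candidates passed on to false-positive reduction")
```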
A scalable system for microcalcification cluster automated detection in a distributed mammographic database
A computer-aided detection (CADe) system for microcalcification cluster
identification in mammograms has been developed in the framework of the
EU-funded MammoGrid project. The CADe software is mainly based on wavelet
transforms and artificial neural networks. It is able to identify
microcalcifications in different datasets of mammograms (i.e. acquired with
different machines and settings, digitized with different pitch and bit depth
or direct digital ones). The CADe can be remotely run from GRID-connected
acquisition and annotation stations, supporting clinicians from geographically
distant locations in the interpretation of mammographic data. We report and
discuss the system performance on different datasets of mammograms and the
status of the GRID-enabled CADe analysis.
Comment: 6 pages, 4 figures; Proceedings of the IEEE NSS and MIC Conference,
October 23-29, 2005, Puerto Rico
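A hedged sketch of the wavelet stage of such a pipeline (not the MammoGrid code): drop the coarse approximation of a 2D wavelet decomposition so that small, bright, high-frequency structures such as microcalcifications stand out, then threshold the residual. The wavelet family, decomposition depth, and threshold are assumptions; in the real system a neural network scores the surviving candidates.

```python
# Illustrative only: wavelet-based background suppression for microcalcification
# candidate detection on a placeholder mammogram.
import numpy as np
import pywt

mammogram = np.random.rand(512, 512).astype(np.float32)  # placeholder image

coeffs = pywt.wavedec2(mammogram, wavelet='db4', level=4)
coeffs[0] = np.zeros_like(coeffs[0])                      # remove coarse approximation (background)
highpass = pywt.waverec2(coeffs, wavelet='db4')

# Keep only strongly positive residuals as microcalcification candidates;
# a classifier would normally reject the false positives among them.
mask = highpass > highpass.mean() + 4.0 * highpass.std()
print(f"{mask.sum()} pixels flagged as possible microcalcifications")
```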
Specificity and entropy reduction in situated referential processing
In situated communication, reference to an entity in the shared visual context can be established using either an expression that conveys precise (minimally specified) or redundant (over-specified) information. There is, however, a long-lasting debate in psycholinguistics concerning whether the latter hinders referential processing. We present evidence from an eye-tracking experiment recording fixations as well as the Index of Cognitive Activity, a novel measure of cognitive workload, supporting the view that over-specifications facilitate processing. We further present original evidence that, above and beyond the effect of specificity, referring expressions that uniformly reduce referential entropy also benefit processing.
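A worked toy example of what referential entropy reduction amounts to: the Shannon entropy over candidate referents before and after each word of a referring expression. The scene and probabilities are invented; the study estimates these quantities over its own visual displays.

```python
# Toy illustration of referential entropy reduction over a four-object scene.
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

candidates = ["blue mug", "red mug", "blue plate", "red plate"]
prior = [0.25, 0.25, 0.25, 0.25]                 # all referents equally likely at first
print(f"before any word: H = {entropy(prior):.2f} bits")

after_blue = [0.5, 0.0, 0.5, 0.0]                # "blue" rules out the red objects
print(f'after "blue":    H = {entropy(after_blue):.2f} bits '
      f"(reduction = {entropy(prior) - entropy(after_blue):.2f} bits)")

after_mug = [1.0, 0.0, 0.0, 0.0]                 # "mug" identifies the referent uniquely
print(f'after "mug":     H = {entropy(after_mug):.2f} bits')
```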
A Parallel Tree code for large Nbody simulation: dynamic load balance and data distribution on CRAY T3D system
N-body algorithms for long-range unscreened interactions like gravity belong
to a class of highly irregular problems whose optimal solution is a challenging
task for present-day massively parallel computers. In this paper we describe a
strategy for optimal memory and work distribution which we have applied to our
parallel implementation of the Barnes & Hut (1986) recursive tree scheme on a
Cray T3D using the CRAFT programming environment. We have performed a series of
tests to find an " optimal data distribution " in the T3D memory, and to
identify a strategy for the " Dynamic Load Balance " in order to obtain good
performances when running large simulations (more than 10 million particles).
The results of tests show that the step duration depends on two main factors:
the data locality and the T3D network contention. By increasing data locality
we are able to minimize the step duration: the closest bodies (which interact
directly) should reside in the same PE's local memory (contiguous block
subdivision, high granularity), whereas the tree properties should have a
fine-grained distribution. In a very large simulation, due to network
contention, an
unbalanced load arises. To remedy this we have devised an automatic work
redistribution mechanism which provided a good Dynamic Load Balance at the
price of an insignificant overhead.
Comment: 16 pages with 11 figures included (LaTeX, elsart style). Accepted by
Computer Physics Communications
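A minimal sketch of the idea behind such a work-redistribution mechanism (not the authors' CRAFT code): partition the bodies into contiguous blocks so that each processing element receives roughly the same total cost, using per-body work measured during the previous time step. The cost model and processor count are illustrative.

```python
# Cost-based partitioning: equal work per PE rather than equal numbers of bodies.
import numpy as np

def balanced_partition(work_per_body, n_pe):
    """Split bodies into n_pe contiguous blocks of approximately equal total cost."""
    cumulative = np.cumsum(work_per_body)
    targets = cumulative[-1] * np.arange(1, n_pe) / n_pe   # ideal cost boundaries
    cuts = np.searchsorted(cumulative, targets)            # body index nearest each boundary
    return np.split(np.arange(len(work_per_body)), cuts)

# e.g. interaction counts recorded for each body during the previous time step
work = np.random.pareto(2.0, size=1_000_000) + 1.0         # heavy-tailed, as in clustered N-body runs
blocks = balanced_partition(work, n_pe=64)
costs = [work[b].sum() for b in blocks]
print(f"max/mean PE cost after rebalance: {max(costs) / np.mean(costs):.2f}")
```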
Adherent diamond coatings on cemented tungsten carbide substrates with new Fe/Ni/Co binder phase
WC-Co hard metals continue to gain importance for cutting, mining and chipless forming tools. Cobalt metal currently dominates the market as a binder because of its unique properties. However, the use of cobalt as a binder has several drawbacks related to its hexagonal close-packed structure and market price fluctuations. These issues have pushed the development of pre-alloyed binder powders which contain less than 40 wt.% cobalt. In this paper we first report the results of extensive investigations of WC-Fe/Ni/Co hard metal sintering, surface pretreatment and deposition of adherent diamond films using an industrial hot filament chemical vapour deposition (HFCVD) reactor. In particular, CVD diamond was deposited onto the WC-Fe/Ni/Co grades which exhibited the best mechanical properties. Prior to deposition, the substrates were submitted to surface roughening by Murakami's etching and to surface binder removal by aqua regia. Adhesion was evaluated by Rockwell indentation tests (20, 40, 60 and 100 kg loads) conducted with a Brale indenter and compared to the adhesion of diamond films grown onto Co-cemented tungsten carbide substrates that had been submitted to similar etching pretreatments and identical deposition conditions. The results showed that diamond films on medium-grained WC-6 wt.% Fe/Ni/Co substrates exhibited good adhesion levels, comparable to those obtained for HFCVD diamond on Co-cemented carbides with a similar microstructure.
Rational Redundancy in Referring Expressions: Evidence from Event-related Potentials
In referential communication, Grice's Maxim of Quantity is thought to imply that utterances conveying unnecessary information should incur comprehension difficulties. There is, however, considerable evidence that speakers frequently encode redundant information in their referring expressions, raising the question as to whether such overspecifications hinder listeners' processing. Evidence from previous work is inconclusive, and mostly comes from offline studies. In this article, we present two event-related potential (ERP) experiments investigating the real-time comprehension of referring expressions that contain redundant adjectives in complex visual contexts. Our findings provide support for both Gricean and bounded-rational accounts. We argue that these seemingly incompatible results can be reconciled if common ground is taken into account. We propose a bounded-rational account of overspecification, according to which even redundant words can be beneficial to comprehension to the extent that they facilitate the reduction of listeners' uncertainty regarding the target referent.
ERP indices of situated reference in visual contexts
Violations of the maxims of Quantity occur when utterances provide more (over-specified) or less (under-specified) information than strictly required for referent identification. While behavioural data suggest that under-specified expressions lead to comprehension difficulty and communicative failure, there is no consensus as to whether over-specified expressions are also detrimental to comprehension. In this study we shed light on this debate, providing neurophysiological evidence supporting the view that extra information facilitates comprehension. We further present novel evidence that referential failure due to under-specification is qualitatively different from explicit cases of referential failure, when no matching referential candidate is available in the context.