Intima-Media Thickness: Setting a Standard for a Completely Automated Method of Ultrasound Measurement
The intima-media thickness (IMT) of the common carotid artery is a widely used clinical marker of severe cardiovascular disease. IMT is usually measured manually on longitudinal B-mode ultrasound images. Many computer-based techniques for IMT measurement have been proposed to overcome the limits of manual segmentation; most of these, however, require a certain degree of user interaction. In this paper we describe a new completely automated layers extraction (CALEXia) technique for the segmentation and IMT measurement of the carotid wall in ultrasound images. CALEXia is based on an integrated approach consisting of feature extraction, line fitting, and classification that enables the automated tracing of the carotid adventitial walls. IMT is then measured by relying on a fuzzy K-means classifier. We tested CALEXia on a database of 200 images and compared its performance to that of a previously developed methodology based on signal analysis (CULEXsa). Three trained operators manually segmented the images, and the average profiles were taken as ground truth. The average errors of CALEXia for the lumen-intima (LI) and media-adventitia (MA) interface tracings were 1.46 ± 1.51 pixels (0.091 ± 0.093 mm) and 0.40 ± 0.87 pixels (0.025 ± 0.055 mm), respectively. The corresponding errors for CULEXsa were 0.55 ± 0.51 pixels (0.035 ± 0.032 mm) and 0.59 ± 0.46 pixels (0.037 ± 0.029 mm). The IMT measurement error was 0.87 ± 0.56 pixels (0.054 ± 0.035 mm) for CALEXia and 0.12 ± 0.14 pixels (0.01 ± 0.01 mm) for CULEXsa. Thus, CALEXia showed limited performance in segmenting the LI interface, but outperformed CULEXsa on the MA interface and in the number of images correctly processed (10 for CALEXia and 16 for CULEXsa). Since the two methods rely on complementary strategies, we anticipate fusing them to further improve IMT measurement.
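The abstract's final classification step relies on fuzzy K-means. As a hedged illustration (not the paper's implementation: the cluster count, fuzziness exponent, and toy intensity values below are all assumptions), a minimal 1-D fuzzy K-means can separate an ultrasound intensity column into dark-lumen, intima-media, and bright-adventitia classes:

```python
import numpy as np

def fuzzy_kmeans(x, k=3, m=2.0, iters=100, seed=0):
    """Minimal 1-D fuzzy K-means (fuzzy c-means) on intensity values x."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(x), k))
    u /= u.sum(axis=1, keepdims=True)        # membership rows sum to 1
    for _ in range(iters):
        w = u ** m                           # fuzzified memberships
        c = (w * x[:, None]).sum(axis=0) / w.sum(axis=0)  # cluster centers
        d = np.abs(x[:, None] - c[None, :]) + 1e-12       # point-center distances
        u = 1.0 / d ** (2.0 / (m - 1.0))     # standard fuzzy membership update
        u /= u.sum(axis=1, keepdims=True)
    return c, u

# Hypothetical column of pixel intensities: dark lumen, mid-gray
# intima-media complex, bright adventitia
x = np.array([10.0, 12.0, 11.0, 80.0, 85.0, 82.0, 200.0, 210.0, 205.0])
centers, memberships = fuzzy_kmeans(x, k=3)
labels = memberships.argmax(axis=1)          # hard assignment per pixel
```

With well-separated intensities the three groups of pixels end up in three distinct clusters, which is the property the IMT step exploits to locate the interfaces.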
Improving information filtering via network manipulation
Recommender systems are a promising way to address the problem of information overload for online users. Although information filtering for online commercial systems has received much attention recently, almost all previous work has been dedicated to designing new algorithms, treating the user-item bipartite network as given and constant. However, many problems of recommender systems, such as the cold-start problem (i.e., low recommendation accuracy for small-degree items), are actually due to limitations of the underlying user-item bipartite network. In this letter, we propose a strategy to enhance the performance of existing recommendation algorithms by directly manipulating the user-item bipartite network, namely by adding virtual connections to it. Numerical analyses on two benchmark data sets, MovieLens and Netflix, show that our method can remarkably improve recommendation performance. Specifically, it not only improves recommendation accuracy (especially for small-degree items), but also helps the recommender system generate more diverse and novel recommendations.
Comment: 6 pages, 5 figures
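One possible instantiation of the proposed network manipulation can be sketched as follows (illustrative only — the co-consumption overlap rule, the threshold, and the toy edge list are assumptions, not the letter's actual algorithm): add a virtual user-item link whenever a candidate item shares enough co-consumers with items the user has already selected.

```python
from collections import defaultdict

# Hypothetical user-item bipartite network as an edge set
edges = {("u1", "A"), ("u1", "B"), ("u2", "A"), ("u2", "C"), ("u3", "B")}

def add_virtual_links(edges, threshold=1):
    """Propose a virtual (user, item) link when the candidate item shares
    at least `threshold` co-consumers with the user's existing items."""
    user_items = defaultdict(set)   # user -> items selected
    item_users = defaultdict(set)   # item -> users who selected it
    for u, i in edges:
        user_items[u].add(i)
        item_users[i].add(u)
    virtual = set()
    for u, owned in user_items.items():
        for i in list(item_users):
            if i in owned:
                continue
            # co-consumption overlap between candidate i and owned items
            overlap = sum(len(item_users[i] & item_users[j]) for j in owned)
            if overlap >= threshold:
                virtual.add((u, i))
    return virtual

new_links = add_virtual_links(edges)
```

The augmented network (original plus virtual edges) would then be fed to an unchanged recommendation algorithm, which is the essence of the strategy: improve the input network rather than the algorithm.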
Identifying Proteins of High Designability via Surface-Exposure Patterns
Using an off-lattice model, we fully enumerate folded conformations of polypeptide chains of up to N = 19 monomers. Structures are found to differ markedly in designability, defined as the number of sequences with that structure as a unique lowest-energy conformation. We find that designability is closely correlated with the pattern of surface exposure of the folded structure. For longer chains, complete enumeration of structures is impractical. Instead, structures can be randomly sampled, and relative designability estimated either from designability within the random sample or directly from the surface-exposure pattern. We compare the surface-exposure patterns of structures identified as highly designable to the patterns of naturally occurring proteins.
Comment: 17 pages, 12 figures
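The designability count defined above can be sketched in a toy model (illustrative only — the binary surface-exposure patterns and the Hamming-distance energy below are assumptions, not the paper's off-lattice model): enumerate all sequences and credit a structure only when it is a sequence's unique lowest-energy conformation.

```python
from itertools import product

# Hypothetical "structures", each reduced to its surface-exposure
# pattern: 1 = buried position, 0 = exposed position.
structures = [(1, 1, 0, 0), (1, 0, 1, 0), (0, 0, 1, 1)]

def energy(seq, struct):
    # H residues (1) favor burial, P residues (0) favor exposure:
    # each mismatch between sequence and exposure pattern costs 1.
    return sum(s != b for s, b in zip(seq, struct))

def designability(structures, n=4):
    """Count, per structure, the sequences for which it is the
    unique lowest-energy conformation."""
    counts = {s: 0 for s in structures}
    for seq in product((0, 1), repeat=n):
        energies = [energy(seq, s) for s in structures]
        e_min = min(energies)
        if energies.count(e_min) == 1:       # unique ground state only
            counts[structures[energies.index(e_min)]] += 1
    return counts

d = designability(structures)
```

Even in this four-monomer toy, the structures end up with unequal sequence counts, mirroring the paper's observation that designability varies markedly and tracks the exposure pattern.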
Effects of Surface Roughness on the Electrochemical Reduction of CO2 over Cu
We have investigated the role of surface roughening in the CO2 reduction reaction (CO2RR) over Cu. The activity and product selectivity of Cu surfaces roughened by plasma pretreatment in Ar, O2, or N2 were compared with those of electrochemically polished Cu samples. Differences in total and product current densities, in the ratio of current densities for the HER (hydrogen evolution reaction) to the CO2RR, and in the ratio of current densities for C2+ to C1 products depend on the electrochemically active surface area and are nearly independent of plasma composition. Theoretical analysis of electropolished and roughened Cu surfaces reveals a higher fraction of undercoordinated Cu sites on the roughened surface, sites that bind CO preferentially. Roughened surfaces also contain square sites similar to those on a Cu(100) surface but with neighboring step sites, which adsorb OC-COH, a precursor to C2+ products. These findings explain the increases in the formation of oxygenates and hydrocarbons relative to CO, and in the ratio of oxygenates to hydrocarbons, observed with increasing surface roughness.
Transition Form Factor with Tensor Current within the Factorization Approach
In this paper, we apply the factorization approach to deal with the transition form factor with tensor current in the large recoil region. The main uncertainties of the estimate are discussed, and we obtain , where the first error is caused by the uncertainties in the pionic wave functions and the second by those in the B-meson wave functions. This result is consistent with the light-cone sum rule results obtained in the literature.
Comment: 8 pages, 4 figures, references added
The Distribution of the Asymptotic Number of Citations to Sets of Publications by a Researcher or From an Academic Department Are Consistent With a Discrete Lognormal Model
How to quantify the impact of a researcher's or an institution's body of work is a matter of increasing importance to scientists, funding agencies, and hiring committees. The use of bibliometric indicators, such as the h-index or the Journal Impact Factor, has become widespread despite their known limitations. We argue that most existing bibliometric indicators are inconsistent, biased, and, worst of all, susceptible to manipulation. Here, we pursue a principled approach to the development of an indicator that quantifies the scientific impact of both individual researchers and research institutions, grounded in the functional form of the distribution of the asymptotic number of citations. We validate our approach using the publication records of 1,283 researchers from seven scientific and engineering disciplines and of the chemistry departments at the 106 U.S. research institutions classified as having "very high research activity". Our approach has three distinct advantages. First, it accurately captures the overall scientific impact of researchers at all career stages, as measured by asymptotic citation counts. Second, unlike other measures, our indicator is resistant to manipulation and rewards publication quality over quantity. Third, our approach captures the time evolution of the scientific impact of research institutions.
Comment: 20 pages, 11 figures, 3 tables
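The discrete lognormal distribution at the core of the approach can be sketched numerically (a minimal illustration; the parameter values and truncation point below are assumptions, not values fitted in the paper): the probability of accruing n citations is taken proportional to a lognormal density evaluated at the integers.

```python
import math

def discrete_lognormal(mu, sigma, n_max=10000):
    """Return the pmf {n: P(N = n)} of a discrete lognormal truncated
    at n_max, with P(N = n) proportional to the lognormal density at n >= 1."""
    weights = {
        n: math.exp(-(math.log(n) - mu) ** 2 / (2 * sigma ** 2)) / n
        for n in range(1, n_max + 1)
    }
    z = sum(weights.values())                 # normalization constant
    return {n: w / z for n, w in weights.items()}

# Hypothetical parameters for one researcher's citation distribution
pmf = discrete_lognormal(mu=3.0, sigma=1.1)
mean_citations = sum(n * p for n, p in pmf.items())
```

Fitting mu and sigma to a researcher's or department's asymptotic citation counts would then summarize the body of work by the distribution's parameters rather than by a single manipulable index, which is the spirit of the proposed indicator.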