223 research outputs found
Cystatin C deficiency in human atherosclerosis and aortic aneurysms
The pathogenesis of atherosclerosis and abdominal aortic aneurysm involves breakdown of the elastic laminae. Elastolytic cysteine proteases, including cathepsins S and K, are overexpressed at sites of arterial elastin damage, but whether endogenous local inhibitors counterbalance these proteases is unknown. We show here that, whereas cystatin C is normally expressed in vascular wall smooth muscle cells (SMCs), this cysteine protease inhibitor is severely reduced in both atherosclerotic and aneurysmal aortic lesions. Furthermore, increased abdominal aortic diameter among 122 patients screened by ultrasonography correlated inversely with serum cystatin C levels. In vitro, cytokine-stimulated vascular SMCs secrete cathepsins, whose elastolytic activity could be blocked when cystatin C secretion was induced by treatment with TGF-ÎČ1. The findings highlight a potentially important role for an imbalance between cysteine proteases and cystatin C in arterial wall remodeling and establish that cystatin C deficiency occurs in vascular disease.
February 1967 Conference Issue
Massachusetts Turf and Lawn Grass Council: Better Turf Through Research and Education.
Vilhelm Lundstedt’s “Legal Machinery” and the Demise of Juristic Practice
This article aims to contribute to the academic debate on the general crisis faced by law schools and the legal professions by discussing why juristic practice is a matter of experience rather than knowledge. Through a critical contextualisation of Vilhelm Lundstedt’s thought under processes of globalisation and transnationalism, it is argued that the demise of the jurist’s function is related to law’s scientification as brought about by the metaphysical construction of reality. The suggested roadmap will in turn reveal that the current voiding of juristic practice and its teaching is part of the crisis regarding what makes us human.
Gatecliff Shelter
552 p. : ill., maps ; 26 cm. Includes bibliographical references (p. 530-552).
The Medical Segmentation Decathlon
International challenges have become the de facto standard for comparative assessment of image analysis algorithms given a specific task. Segmentation is so far the most widely investigated medical image processing task, but the various segmentation challenges have typically been organized in isolation, such that algorithm development was driven by the need to tackle a single specific clinical problem. We hypothesized that a method capable of performing well on multiple tasks will generalize well to a previously unseen task and potentially outperform a custom-designed solution. To investigate the hypothesis, we organized the Medical Segmentation Decathlon (MSD), a biomedical image analysis challenge in which algorithms compete in a multitude of both tasks and modalities. The underlying data set was designed to explore the axes of difficulty typically encountered when dealing with medical images, such as small data sets, unbalanced labels, multi-site data, and small objects. The MSD challenge confirmed that algorithms with consistently good performance on a set of tasks preserved their good average performance on a different set of previously unseen tasks. Moreover, by monitoring the MSD winner for two years, we found that this algorithm continued generalizing well to a wide range of other clinical problems, further confirming our hypothesis. Three main conclusions can be drawn from this study: (1) state-of-the-art image segmentation algorithms are mature, accurate, and generalize well when retrained on unseen tasks; (2) consistent algorithmic performance across multiple tasks is a strong surrogate of algorithmic generalizability; (3) the training of accurate AI segmentation models is now commoditized to non-AI experts.
All-flavor constraints on nonstandard neutrino interactions and generalized matter potential with three years of IceCube DeepCore data
We report constraints on nonstandard neutrino interactions (NSI) from the observation of atmospheric neutrinos with IceCube, limiting all individual coupling strengths from a single dataset. Furthermore, IceCube is the first experiment to constrain flavor-violating and nonuniversal couplings simultaneously. Hypothetical NSI are generically expected to arise from the exchange of a new heavy mediator particle. Neutrinos propagating in matter scatter off fermions in the forward direction with negligible momentum transfer; hence, the study of the matter effect on neutrinos propagating in the Earth is sensitive to NSI independently of the energy scale of new physics. We present constraints on NSI obtained with an all-flavor event sample of atmospheric neutrinos based on three years of IceCube DeepCore data. The analysis uses neutrinos arriving from all directions, with reconstructed energies between 5.6 GeV and 100 GeV. We report constraints on the individual NSI coupling strengths considered singly, allowing for complex phases in the case of flavor-violating couplings. This demonstrates that IceCube is sensitive to the full NSI flavor structure at a level competitive with limits from the global analysis of all other experiments. In addition, we investigate a generalized matter potential, whose overall scale and flavor structure are also constrained.
Searching for High-energy Neutrino Emission from Galaxy Clusters with IceCube
Galaxy clusters have the potential to accelerate cosmic rays (CRs) to ultrahigh energies via accretion shocks or embedded CR acceleration sites. CRs with energies below the Hillas condition will be confined within the cluster and eventually interact with the intracluster medium gas to produce secondary neutrinos and gamma rays. Using 9.5 yr of muon neutrino track events from the IceCube Neutrino Observatory, we report the results of a stacking analysis of 1094 galaxy clusters with masses ≳10^14 M☉ and redshifts between 0.01 and ∌1 detected by the Planck mission via the Sunyaev–Zel’dovich effect. We find no evidence for significant neutrino emission and report upper limits on the cumulative unresolved neutrino flux from massive galaxy clusters after accounting for the completeness of the catalog up to a redshift of 2, assuming three different weighting scenarios for the stacking and three different power-law spectra. Weighting the sources according to mass and distance, we set upper limits at a 90% confidence level that constrain the flux of neutrinos from massive galaxy clusters (≳10^14 M☉) to be no more than 4.6% of the diffuse IceCube observations at 100 TeV, assuming an unbroken power-law spectrum.
- 
