Investigating the effect of characteristic x-rays in cadmium zinc telluride detectors under breast computerized tomography operating conditions
A number of research groups have been investigating the use of dedicated breast computerized tomography (CT). Preliminary results have been encouraging, suggesting improved visualization of masses on breast CT as compared to conventional mammography. Nonetheless, there are many challenges to overcome before breast CT can become a routine clinical reality. One potential improvement over current breast CT prototypes would be the use of photon counting detectors with cadmium zinc telluride (CZT, or CdTe) semiconductor material. These detectors can operate at room temperature and provide high detection efficiency and the capability of multi-energy imaging; however, one factor in particular that limits image quality is the emission of characteristic x-rays. In this study, the degradative effects of characteristic x-rays were examined when using a CZT detector under breast CT operating conditions. Monte Carlo simulation software was used to evaluate the effect of characteristic x-rays and detector element size on the spatial and spectral resolution of a CZT detector used under breast CT operating conditions. In particular, lower kVp spectra and thinner CZT layers were studied than those typically used in CZT-based conventional CT detectors. In addition, the effect of characteristic x-rays on the accuracy of material decomposition in spectral CT imaging was explored. It was observed that when imaging with 50-60 kVp spectra, the x-ray transmission through CZT was very low for all detector thicknesses studied (0.5-3.0 mm), thus retaining dose efficiency. As expected, characteristic x-ray escape from the detector element in which the x-ray interacted increased with decreasing detector element size, approaching a 50% escape fraction for a 100 µm detector element. The detector point spread function showed only minor degradation for detector element sizes greater than 200 µm at the lower kV settings.
Characteristic x-rays produced increasing distortion in the spectral response with decreasing detector element size. If not corrected for, this caused a large bias in estimating tissue density parameters for material decomposition; the degraded spectral response also worsened the precision of those estimates. Overall, characteristic x-rays do cause some degradation in the spatial and spectral resolution of thin CZT detectors operating under breast CT conditions, but these degradations should be manageable with careful selection of the detector element size. Even with the observed spectral distortion from characteristic x-rays, it is still possible to correctly estimate tissue parameters for material decomposition using spectral CT, provided accurate modeling is used.
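The escape-fraction trend described above can be illustrated with a toy two-dimensional Monte Carlo sketch (this is not the study's simulation code; the mean free path is an assumed illustrative value for ~23 keV Cd K fluorescence in CZT, and depth effects are ignored):

```python
import math
import random

# Assumed illustrative mean free path (µm) for a characteristic x-ray in CZT;
# the real value depends on photon energy and material composition.
MEAN_FREE_PATH_UM = 115.0

def escape_fraction(pixel_um, n=20000, rng=random.Random(0)):
    """Estimate the fraction of characteristic x-rays that leave a square
    detector element of side pixel_um before being reabsorbed (2-D toy model)."""
    escapes = 0
    for _ in range(n):
        # Interaction point, uniform over the element cross-section.
        x = rng.uniform(0.0, pixel_um)
        y = rng.uniform(0.0, pixel_um)
        # Isotropic emission direction in the lateral plane.
        phi = rng.uniform(0.0, 2.0 * math.pi)
        dx, dy = math.cos(phi), math.sin(phi)
        # Distance to the element boundary along the emission direction.
        tx = (pixel_um - x) / dx if dx > 0 else (x / -dx if dx < 0 else math.inf)
        ty = (pixel_um - y) / dy if dy > 0 else (y / -dy if dy < 0 else math.inf)
        boundary = min(tx, ty)
        # Exponential free path: the photon escapes the element if it is not
        # reabsorbed before reaching the boundary.
        if rng.expovariate(1.0 / MEAN_FREE_PATH_UM) > boundary:
            escapes += 1
    return escapes / n

for size_um in (100, 200, 400, 800):
    print(size_um, round(escape_fraction(size_um), 2))
```

Even this crude sketch reproduces the qualitative behavior reported above: the escape fraction grows as the detector element shrinks, because the characteristic photon has less material to traverse before leaving the element of interaction.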
Fortunoff Video Archive: New Initiatives for a New Digital Archive
The Fortunoff Archive recently reached a number of important milestones, including the digitization of its entire collection, the development of a digital access system, the migration of legacy metadata, and the launch of a partner site program that provides remote access to testimonies at universities and research institutes. This paper will present an overview of these new digital initiatives.
Creep in Photovoltaic Modules: Examining the Stability of Polymeric Materials and Components
Interest in renewable energy has motivated the implementation of new polymeric materials in photovoltaic modules. Some of these are non-cross-linked thermoplastics, which may exhibit new behaviors, including phase transformation and viscoelastic flow. Differential scanning calorimetry and rheometry data were obtained and then combined with existing site-specific time-temperature information in a theoretical analysis to estimate the displacement expected to occur during module service life. The analysis identified that, depending on the installation location, module configuration, and/or mounting configuration, some of the thermoplastics are expected to undergo unacceptable physical displacement. While the examples here focus on encapsulation materials, the concerns apply equally to the frame, junction-box, and mounting-adhesive technologies.
A Connectionist Approach to Embodied Conceptual Metaphor
A growing body of data has been gathered in support of the view that the mind is embodied and that cognition is grounded in sensory-motor processes. Some researchers have gone so far as to claim that this paradigm poses a serious challenge to central tenets of cognitive science, including the widely held view that the mind can be analyzed in terms of abstract computational principles. On the other hand, computational approaches to the study of mind have led to the development of specific models that help researchers understand complex cognitive processes at a level of detail that theories of embodied cognition (EC) have sometimes lacked. Here we make the case that connectionist architectures in particular can illuminate many surprising results from the EC literature. These models can learn the statistical structure in their environments, providing an ideal framework for understanding how simple sensory-motor mechanisms could give rise to higher-level cognitive behavior over the course of learning. Crucially, they form overlapping, distributed representations, which have exactly the properties required by many embodied accounts of cognition. We illustrate this idea by extending an existing connectionist model of semantic cognition in order to simulate findings from the embodied conceptual metaphor literature. Specifically, we explore how the abstract domain of time may be structured by concrete experience with space (including experience with culturally specific spatial and linguistic cues). We suggest that both EC researchers and connectionist modelers can benefit from an integrated approach to understanding these models and the empirical findings they seek to explain.
The City of Louisville Encapsulates the United States Demographics
Background: A weakness shared by population-based studies performed in the United States (US) is that investigators make population-based extrapolations without providing objective statistical evidence of how well a particular city serves as a surrogate for the US. The objective of this study was to propose and apply a novel computational metric to compare individual US cities with the US average.
Methods: This was a secondary data analysis of publicly available databases containing US sociodemographic, economic, and health-related data. In total, 58 demographic, housing, economic, health behavior, and health status variables were obtained for each US city with a residential population of at least 500,000. All variables were recorded as proportions. Euclidean, Manhattan, and average absolute difference metrics were used to compare each city's 58 variables to the US average.
Results: Oklahoma City, OK, was closest to the US average, with Euclidean and Manhattan distances (in proportions) of 0.261 and 1.519, respectively. Louisville, Kentucky, had the second lowest Euclidean and Manhattan distances, at 0.286 and 1.545, respectively. The average absolute differences in proportion from the US average were 0.026 for Oklahoma City and 0.027 for Louisville.
Conclusion: To our knowledge, this represents the first study evaluating a method for computing statistical comparisons of US city sociodemographic, economic, and health-related data with the US average. Our study shows that among cities with at least 500,000 residents, Oklahoma City is the closest to the US average, followed closely by Louisville. On average, these cities deviate from the US average on any given variable by less than 3%.
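The three metrics named in the Methods are the standard distance definitions; a minimal sketch, using short hypothetical proportion vectors rather than the study's actual 58 variables:

```python
import math

# Hypothetical proportions for a handful of variables (NOT the study's data):
# each entry is one city-level proportion and its US-average counterpart.
us_avg = [0.51, 0.13, 0.18, 0.62]
city   = [0.50, 0.15, 0.17, 0.60]

diffs = [c - u for c, u in zip(city, us_avg)]

euclidean    = math.sqrt(sum(d * d for d in diffs))   # L2 distance
manhattan    = sum(abs(d) for d in diffs)             # L1 distance
avg_abs_diff = manhattan / len(diffs)                 # mean per-variable gap

print(euclidean, manhattan, avg_abs_diff)
```

Ranking cities by any of these three quantities against the US-average vector is all the comparison requires; the average absolute difference is simply the Manhattan distance divided by the number of variables, which is why it is the easiest to interpret as a per-variable deviation.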
An Emergent Approach to Analogical Inference
In recent years, a growing number of researchers have proposed that analogy is a core component of human cognition. According to the dominant theoretical viewpoint, analogical reasoning requires a specific suite of cognitive machinery, including explicitly coded symbolic representations and a mapping or binding mechanism that operates over these representations. Here we offer an alternative approach: we find that analogical inference can emerge naturally and spontaneously from a relatively simple, error-driven learning mechanism without the need to posit any additional analogy-specific machinery. The results also parallel findings from the developmental literature on analogy, demonstrating a shift from an initial reliance on surface feature similarity to the use of relational similarity later in training. Variants of the model allow us to consider and rule out alternative accounts of its performance. We conclude by discussing how these findings can potentially refine our understanding of the processes that are required to perform analogical inference.
Releasing the brakes: a case report of pulmonary arterial hypertension induced by immune checkpoint inhibitor therapy
Immune checkpoint inhibitors successfully treat various malignancies by inducing an immune response to tumor cells. However, their use has been associated with a variety of autoimmune disorders, such as diabetes, hepatitis, and pneumonitis. Pulmonary arterial hypertension due to checkpoint inhibitor use has not yet been described. We present a novel case of pulmonary arterial hypertension associated with systemic lupus erythematosus and Sjogren’s syndrome overlap that was induced by therapy with the checkpoint inhibitor durvalumab.
Effectiveness of the Influenza Vaccine in Preventing Hospitalizations of Patients with Influenza Community-Acquired Pneumonia
Introduction: Influenza vaccination is the primary strategy for prevention of influenza infection. Influenza infection can vary from mild or even asymptomatic illness to severe community-acquired pneumonia (CAP). Although many national and international investigators and organizations report annual estimates of influenza vaccine effectiveness for prevention of influenza infection in the community, few studies report estimates for the prevention of hospitalizations due to influenza CAP, the most severe form of the infection. The objective of this study is to determine the effectiveness of the influenza vaccine for prevention of hospitalization in patients with influenza-associated CAP.
Methods: This was a test-negative study using data from the first two years of the University of Louisville Pneumonia Study, a prospective, observational study of all hospitalized patients with pneumonia in Louisville, Kentucky from 6/1/2014 – 5/31/2016. Univariate and multivariate logistic models were used to evaluate the association between vaccine status and influenza-associated/non-influenza-associated CAP hospitalization. Unadjusted and adjusted vaccine effectiveness estimates were calculated.
Results: A total of 1951 hospitalized patients with CAP were included in the analysis, and 831 (43%) reported having received the influenza vaccination for the influenza season by the time they were hospitalized. A total of 152 (8%) cases of influenza-CAP were confirmed in the study population, with 63 (8%) cases confirmed in vaccinated individuals. The unadjusted vaccine effectiveness was not significant, with a point estimate of 5% (95% CI: -33%, 32%). After adjusting for potential confounders, vaccine effectiveness was also found to not be significant, with a point estimate of 8% (95% CI: -30%, 35%).
Conclusions: Over the 2014/2015 and 2015/2016 influenza seasons, the influenza vaccine was not effective for prevention of hospitalization with CAP due to influenza. More effective vaccines are necessary to prevent the most serious forms of influenza.
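In a test-negative design, the unadjusted vaccine effectiveness is VE = (1 − OR) × 100, where the odds ratio compares vaccination odds in test-positive (influenza-CAP) versus test-negative (other CAP) patients. A sketch of that standard calculation using the counts reported in the Results (not the study's code, and omitting the adjusted model's covariates):

```python
# Counts from the abstract: 1951 patients, 831 vaccinated,
# 152 influenza-CAP cases, of whom 63 were vaccinated.
vacc_cases      = 63
vacc_controls   = 831 - 63                    # vaccinated, non-influenza CAP
unvacc_cases    = 152 - 63
unvacc_controls = 1951 - 152 - vacc_controls  # unvaccinated, non-influenza CAP

odds_ratio = (vacc_cases / vacc_controls) / (unvacc_cases / unvacc_controls)
ve_percent = (1 - odds_ratio) * 100

print(round(ve_percent))  # prints 5, matching the abstract's unadjusted estimate
```

Reproducing the 5% point estimate from the raw counts confirms the arithmetic; the adjusted 8% estimate additionally conditions on confounders and cannot be recovered from these marginal totals alone.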