    Investigating the effect of characteristic x-rays in cadmium zinc telluride detectors under breast computerized tomography operating conditions

    A number of research groups have been investigating the use of dedicated breast computerized tomography (CT). Preliminary results have been encouraging, suggesting improved visualization of masses on breast CT as compared to conventional mammography. Nonetheless, there are many challenges to overcome before breast CT can become a routine clinical reality. One potential improvement over current breast CT prototypes would be the use of photon counting detectors made of cadmium zinc telluride (CZT) (or CdTe) semiconductor material. These detectors can operate at room temperature and provide high detection efficiency and the capability of multi-energy imaging; however, one factor in particular that limits image quality is the emission of characteristic x-rays. In this study, the degrading effects of characteristic x-rays were examined for a CZT detector under breast CT operating conditions. Monte Carlo simulation software was used to evaluate the effect of characteristic x-rays and detector element size on the spatial and spectral resolution of a CZT detector used under breast CT operating conditions. In particular, lower kVp spectra and thinner CZT layers were studied than those typically used in CZT-based conventional CT detectors. In addition, the effect of characteristic x-rays on the accuracy of material decomposition in spectral CT imaging was explored. It was observed that when imaging with 50-60 kVp spectra, x-ray transmission through CZT was very low for all detector thicknesses studied (0.5-3.0 mm), thus retaining dose efficiency. As expected, characteristic x-ray escape from the detector element in which the primary x-ray interacted increased with decreasing detector element size, approaching a 50% escape fraction for a 100 ÎŒm detector element. The detector point spread function showed only minor degradation for detector element sizes greater than 200 ÎŒm at the lower kV settings. Characteristic x-rays produced increasing distortion in the spectral response with decreasing detector element size; if not corrected for, this caused a large bias in estimating tissue density parameters for material decomposition. Degradation of the spectral response due to characteristic x-rays also worsened the precision of the tissue density parameter estimates. Overall, characteristic x-rays do cause some degradation in the spatial and spectral resolution of thin CZT detectors operating under breast CT conditions, but these degradations should be manageable with careful selection of the detector element size. Even with the observed spectral distortion from characteristic x-rays, it is still possible to correctly estimate tissue parameters for material decomposition using spectral CT, provided accurate modeling is used.
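
    As a rough illustration of the escape-fraction scaling reported above, the following minimal Monte Carlo sketch (not the authors' code) estimates how often a characteristic x-ray is reabsorbed outside the detector element in which the primary photon interacted. The geometry, the single-fluorescence-photon assumption, and the ~0.12 mm mean free path for Cd/Te K x-rays in CZT are all illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def escape_fraction(pixel_mm, thickness_mm=1.0, mfp_mm=0.12, n=200_000):
    """Fraction of characteristic x-rays reabsorbed outside the pixel
    of primary interaction, for a square pixel of side pixel_mm."""
    # Primary interaction sites, uniform over the pixel volume.
    x = rng.uniform(0, pixel_mm, n)
    y = rng.uniform(0, pixel_mm, n)
    z = rng.uniform(0, thickness_mm, n)
    # Isotropic emission direction for the fluorescence photon.
    cos_t = rng.uniform(-1, 1, n)
    sin_t = np.sqrt(1 - cos_t**2)
    phi = rng.uniform(0, 2 * np.pi, n)
    # Exponentially distributed free path before reabsorption.
    r = rng.exponential(mfp_mm, n)
    xa = x + r * sin_t * np.cos(phi)
    ya = y + r * sin_t * np.sin(phi)
    za = z + r * cos_t
    # "Escape" = absorption point lies outside the pixel footprint or slab.
    inside = ((0 <= xa) & (xa <= pixel_mm) & (0 <= ya) & (ya <= pixel_mm)
              & (0 <= za) & (za <= thickness_mm))
    return 1.0 - inside.mean()

for p in (0.1, 0.2, 0.5, 1.0):  # pixel pitch in mm
    print(f"{p * 1000:.0f} um pixel: escape fraction ~ {escape_fraction(p):.2f}")
```

    With these assumptions the escape fraction rises steeply below a few hundred ÎŒm and approaches roughly one half at a 100 ÎŒm pitch, in line with the trend described in the abstract.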

    A Connectionist Approach to Embodied Conceptual Metaphor

    A growing body of data has been gathered in support of the view that the mind is embodied and that cognition is grounded in sensory-motor processes. Some researchers have gone so far as to claim that this paradigm poses a serious challenge to central tenets of cognitive science, including the widely held view that the mind can be analyzed in terms of abstract computational principles. On the other hand, computational approaches to the study of mind have led to the development of specific models that help researchers understand complex cognitive processes at a level of detail that theories of embodied cognition (EC) have sometimes lacked. Here we make the case that connectionist architectures in particular can illuminate many surprising results from the EC literature. These models can learn the statistical structure in their environments, providing an ideal framework for understanding how simple sensory-motor mechanisms could give rise to higher-level cognitive behavior over the course of learning. Crucially, they form overlapping, distributed representations, which have exactly the properties required by many embodied accounts of cognition. We illustrate this idea by extending an existing connectionist model of semantic cognition in order to simulate findings from the embodied conceptual metaphor literature. Specifically, we explore how the abstract domain of time may be structured by concrete experience with space (including experience with culturally specific spatial and linguistic cues). We suggest that both EC researchers and connectionist modelers can benefit from an integrated approach to understanding these models and the empirical findings they seek to explain.

    An Emergent Approach to Analogical Inference

    In recent years, a growing number of researchers have proposed that analogy is a core component of human cognition. According to the dominant theoretical viewpoint, analogical reasoning requires a specific suite of cognitive machinery, including explicitly coded symbolic representations and a mapping or binding mechanism that operates over these representations. Here we offer an alternative approach: we find that analogical inference can emerge naturally and spontaneously from a relatively simple, error-driven learning mechanism without the need to posit any additional analogy-specific machinery. The results also parallel findings from the developmental literature on analogy, demonstrating a shift from an initial reliance on surface feature similarity to the use of relational similarity later in training. Variants of the model allow us to consider and rule out alternative accounts of its performance. We conclude by discussing how these findings can potentially refine our understanding of the processes that are required to perform analogical inference.
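
    The kind of mechanism this abstract appeals to can be made concrete with a small sketch. The following is not the paper's model; it is a generic error-driven network over distributed representations, with toy one-hot items, relations, and attributes invented purely for illustration. It shows the ingredients (error-driven weight updates, overlapping hidden representations), not the paper's analogy results.

```python
import numpy as np

rng = np.random.default_rng(1)

n_item, n_rel, n_attr, n_hidden = 4, 2, 6, 8
items = np.eye(n_item)          # one-hot item codes
rels = np.eye(n_rel)            # one-hot relation codes
# Toy targets: which attributes each (item, relation) pair activates.
targets = rng.integers(0, 2, size=(n_item, n_rel, n_attr)).astype(float)

W1 = rng.normal(0, 0.1, (n_item + n_rel, n_hidden))
W2 = rng.normal(0, 0.1, (n_hidden, n_attr))
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for epoch in range(2000):
    for i in range(n_item):
        for r in range(n_rel):
            x = np.concatenate([items[i], rels[r]])
            h = sigmoid(x @ W1)       # distributed internal representation
            y = sigmoid(h @ W2)
            err = y - targets[i, r]   # error-driven: learn from the mismatch
            dy = err * y * (1 - y)
            dh = (dy @ W2.T) * h * (1 - h)
            W2 -= lr * np.outer(h, dy)
            W1 -= lr * np.outer(x, dh)
```

    Items that come to share attribute structure develop overlapping hidden representations, which is the kind of relational overlap that analogical generalization can ride on.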

    Joint AAPM Task Group 282/EFOMP Working Group Report: Breast dosimetry for standard and contrast‐enhanced mammography and breast tomosynthesis

    Currently, there are multiple breast dosimetry estimation methods for mammography and its variants in use throughout the world. This fact alone introduces uncertainty, since it is often impossible to distinguish which model is internally used by a specific imaging system. In addition, all current models are hampered by various limitations, in terms of overly simplified models of the breast and its composition, as well as simplistic models of the imaging system. Many of these simplifications were necessary, for the most part, to limit the computational cost of obtaining the required dose conversion coefficients decades ago, when these models were first implemented. With the advances in computational power, and to address most of the known limitations of previous breast dosimetry methods, a new breast dosimetry method, based on new breast models, has been developed, implemented, and tested. This model, developed jointly by the American Association of Physicists in Medicine and the European Federation of Organisations for Medical Physics, is applicable to standard mammography, digital breast tomosynthesis, and their contrast-enhanced variants. In addition, it includes models of the breast in both the cranio-caudal and the medio-lateral oblique views. Special emphasis was placed on basing the breast and system models on evidence, either by analyzing large sets of patient data or by performing measurements on imaging devices from a range of manufacturers. Because of the vast number of dose conversion coefficients resulting from the developed model, and the relative complexity of the calculations needed to apply it, a software program that applies the developed breast dosimetry method has been made available free of charge, for download or direct online use. A separate User's Guide is provided with the software.
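
    For context on what a "dose conversion coefficient" does, the earlier formalisms this report supersedes (e.g., the widely used Dance method) compute the mean glandular dose as the product of the measured incident air kerma and tabulated conversion factors. A schematic of that classic form, shown here for orientation only and not as the new AAPM/EFOMP model itself:

```latex
\[
  D_{\mathrm{MGD}} = K \cdot g \cdot c \cdot s
\]
% K: incident air kerma at the upper breast surface
% g: kerma-to-glandular-dose coefficient for a 50% glandular breast
% c: correction for the actual breast glandularity
% s: correction for the x-ray spectrum (anode/filter combination)
```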

    Cost-effectiveness of financial incentives to promote adherence to depot antipsychotic medication: economic evaluation of a cluster-randomised controlled trial

    Background: Offering a modest financial incentive to people with psychosis can promote adherence to depot antipsychotic medication, but the cost-effectiveness of this approach has not been examined. Methods: Economic evaluation within a pragmatic cluster-randomised controlled trial. 141 patients under the care of 73 teams (clusters) were randomised to intervention or control; 138 patients with diagnoses of schizophrenia, schizo-affective disorder or bipolar disorder participated. Intervention participants received £15 per depot injection over 12 months, in addition to usual acute, mental health and community primary health services. The control group received usual health services. Main outcome measures: incremental cost per 20% increase in adherence to depot antipsychotic medication; incremental cost of 'good' adherence (defined as taking at least 95% of the prescribed number of depot medications over the intervention period). Findings: Economic and outcome data at baseline and 12-month follow-up were available for 117 participants. The adjusted difference in adherence between groups was 12.2% (73.4% control vs. 85.6% intervention); the adjusted cost difference was £598 (95% CI -£4 533, £5 730). The extra cost per patient to increase adherence to depot medications by 20% was £982 (95% CI -£8 020, £14 000). The extra cost per patient of achieving 'good' adherence was £2 950 (95% CI -£19 400, £27 800). The probability of cost-effectiveness exceeded 97.5% at willingness-to-pay values of £14 000 for a 20% increase in adherence and £27 800 for good adherence. Interpretation: Offering a modest financial incentive to people with psychosis is cost-effective in promoting adherence to depot antipsychotic medication. Direct healthcare costs (including the cost of the financial incentive) are unlikely to be increased by this intervention. Trial Registration: ISRCTN.com 7776928
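
    The headline cost-per-outcome figure follows from the reported adjusted differences by simple linear scaling; a worked check (the small gap to the reported £982 reflects rounding of the adjusted estimates):

```latex
\[
  \text{cost per 20\% adherence gain}
  \;=\; \Delta C \times \frac{20}{\Delta A}
  \;=\; \pounds 598 \times \frac{20}{12.2}
  \;\approx\; \pounds 980
\]
```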

    LSST: from Science Drivers to Reference Design and Anticipated Data Products

    (Abridged) We describe here the most ambitious survey currently planned in the optical, the Large Synoptic Survey Telescope (LSST). A vast array of science will be enabled by a single wide-deep-fast sky survey, and LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: probing dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. LSST will be a wide-field ground-based system sited at Cerro Pachón in northern Chile. The telescope will have an 8.4 m (6.5 m effective) primary mirror, a 9.6 degÂČ field of view, and a 3.2 Gigapixel camera. The standard observing sequence will consist of pairs of 15-second exposures in a given field, with two such visits in each pointing in a given night. With these repeats, the LSST system is capable of imaging about 10,000 square degrees of sky in a single filter in three nights. The typical 5σ point-source depth in a single visit in r will be ∌24.5 (AB). The project is in the construction phase and will begin regular survey operations by 2022. The survey area will be contained within 30,000 degÂČ with ÎŽ < +34.5°, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320-1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will uniformly observe an 18,000 degÂČ region about 800 times (summed over all six bands) during the anticipated 10 years of operations, and yield a coadded map to r ∌ 27.5. The remaining 10% of the observing time will be allocated to projects such as a Very Deep and Fast time domain survey. The goal is to make LSST data products, including a relational database of about 32 trillion observations of 40 billion objects, available to the public and scientists around the world.
    Comment: 57 pages, 32 color figures; version with high-resolution figures available from https://www.lsst.org/overvie
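
    As a back-of-envelope consistency check on the quoted depths (assuming sky-noise-limited coadds and that roughly 200 of the ∌800 visits fall in the r band; both figures are assumptions here, not values from the paper):

```latex
\[
  r_{\mathrm{coadd}} \;\approx\; r_{\mathrm{single}} + 1.25\,\log_{10} N
  \;\approx\; 24.5 + 1.25\,\log_{10} 200
  \;\approx\; 27.4
\]
```

    This sits close to the quoted coadded depth of r ∌ 27.5, as expected when single-visit depth is stacked over the full survey.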

    Choosing the target difference and undertaking and reporting the sample size calculation for a randomised controlled trial – the development of the DELTA2 guidance

    Background: A key step in the design of a randomised controlled trial (RCT) is the estimation of the number of participants needed. The most common approach is to specify a target difference in the primary outcome between the randomised groups and then estimate the corresponding sample size. The sample size is chosen to provide reassurance that the trial will have high statistical power to detect the target difference at the planned statistical significance level. Alternative approaches are also available, though most still require specification of a target difference. The sample size has many implications for the conduct of the study, as well as scientific and ethical ones. Despite the critical role of the target difference for the primary outcome in the design of an RCT, the manner in which it is determined has received little attention. This article reports the development of the DELTA2 guidance on the specification and reporting of the target difference for the primary outcome in a sample size calculation for an RCT. Methods: The DELTA2 (Difference ELicitation in TriAls) project has five components: systematic literature reviews of recent methodological developments (stage 1) and existing funder guidance (stage 2), a Delphi study (stage 3), a 2-day consensus meeting bringing together researchers, funders and patient representatives (stage 4), and the preparation and dissemination of a guidance document (stage 5). Results: The project started in April 2016. The literature search identified 28 articles on methodological developments relevant to specifying a target difference. A Delphi study involving 69 participants and a 2-day consensus meeting were conducted, and further engagement sessions were held at two international conferences. The main guidance text was finalised on April 18, 2018, after revision informed by feedback gathered from stages 2 and 3 and from funder representatives. Discussion: The DELTA2 Delphi study identified a number of areas of particular interest amongst stakeholders (such as practical recommendations and examples, and greater coverage of different trial designs and statistical approaches) that new guidance should address. New relevant references were identified by the review. These findings influenced the scope, drafting and revision of the guidance. While not all suggestions could be accommodated, it is hoped that the process has led to a more useful and practical document. Keywords: Target difference; Clinically important difference; Sample size; Guidance; Randomised trial
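
    For reference, the textbook two-arm calculation into which the target difference Ύ feeds (continuous outcome, equal allocation; this is the standard formula, not anything DELTA2-specific):

```latex
\[
  n_{\text{per arm}} \;=\; \frac{2\,(z_{1-\alpha/2} + z_{1-\beta})^{2}\,\sigma^{2}}{\delta^{2}}
\]
```

    For example, with Ύ = 0.5σ, α = 0.05 (two-sided) and 90% power, n = 2(1.96 + 1.28)ÂČ/0.25 ≈ 84, i.e., 85 participants per arm. Halving the target difference quadruples the required sample size, which is why how Ύ is chosen matters so much.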

    A Wake-Up Call: Information Contagion and Strategic Uncertainty

    A successful speculative attack against one currency is a wake-up call for speculators elsewhere. Currency speculators have an incentive to acquire costly information about exposures across countries to infer whether their monetary authority's ability to defend its currency is weakened. Information acquisition per se increases the likelihood of speculative currency attacks via heightened strategic uncertainty among speculators. Contagion occurs even if speculators learn that there is no exposure. Our new contagion mechanism offers a compelling explanation for the 1997 Asian currency crisis and the 1998 Russian crisis, both of which spread across countries with seemingly unrelated fundamentals and limited interconnectedness. The proposed contagion mechanism applies generally in global coordination games and can also be applied to bank runs, sovereign debt crises, and political regime change.

    Facing differences with an open mind: Openness to Experience, salience of intra-group differences, and performance of diverse groups.

    This study examined how the performance of diverse teams is affected by member openness to experience and the extent to which team reward structure emphasizes intragroup differences. Fifty-eight heterogeneous four-person teams engaged in an interactive task. Teams in which reward structure converged with diversity (i.e., "faultline" teams) performed more poorly than teams in which reward structure cut across differences between group members or pointed to a "superordinate identity." High openness to experience positively influenced teams in which differences were salient (i.e., faultline and "cross-categorized" teams) but not teams with a superordinate identity. This effect was mediated by information elaboration.
    • 

    corecore