
    Bounds on isocurvature perturbations from CMB and LSS data

    We obtain very stringent bounds on the possible cold dark matter, baryon and neutrino isocurvature contributions to the primordial fluctuations in the Universe, using recent cosmic microwave background and large scale structure data. In particular, we include the measured temperature and polarization power spectra from WMAP and ACBAR, as well as the matter power spectrum from the 2dF galaxy redshift survey. Neglecting the possible effects of spatial curvature, tensor perturbations and reionization, we perform a Bayesian likelihood analysis with nine free parameters, and find that the amplitude of the isocurvature component cannot be larger than about 31% for the cold dark matter mode, 91% for the baryon mode, 76% for the neutrino density mode, and 60% for the neutrino velocity mode, at 2-sigma, for uncorrelated models. On the other hand, for correlated adiabatic and isocurvature components, the fraction could be slightly larger. However, the cross-correlation coefficient is strongly constrained, and maximally correlated/anticorrelated models are disfavored. This puts strong bounds on the curvaton model, independently of the bounds on non-Gaussianity.
    Comment: 4 pages, 1 figure, some minor corrections; version accepted in PR
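    For context, the size of the isocurvature contribution and its correlation with the adiabatic mode are often quantified by an amplitude fraction and a cross-correlation coefficient of roughly the following form (a sketch of one common convention; the exact definitions used in the paper may differ):

```latex
% One common parameterization (conventions differ between analyses):
\[
  \alpha \equiv \frac{\langle S^2 \rangle}{\langle R^2 \rangle + \langle S^2 \rangle},
  \qquad
  \beta \equiv \frac{\langle R\,S \rangle}{\sqrt{\langle R^2 \rangle\,\langle S^2 \rangle}},
\]
% where R is the adiabatic (curvature) perturbation, S the entropy
% (isocurvature) perturbation, and \beta = \pm 1 corresponds to the
% maximally correlated/anticorrelated models disfavored above.
```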

    Application of Fractal Dimension for Quantifying Noise Texture in Computed Tomography Images

    Purpose: Evaluation of noise texture information in CT images is important for assessing image quality. Noise texture is often quantified by the noise power spectrum (NPS), which requires numerous image realizations to estimate. This study evaluated fractal dimension for quantifying noise texture as a scalar metric that can potentially be estimated using one image realization.
    Methods: The American College of Radiology (ACR) CT accreditation phantom was scanned on a clinical scanner (Discovery CT750, GE Healthcare) at 120 kV and at 25 and 90 mAs. Images were reconstructed using filtered back projection (FBP/ASIR 0%) with varying reconstruction kernels: Soft, Standard, Detail, Chest, Lung, Bone, and Edge. For each kernel, images were also reconstructed using the ASIR 50% and ASIR 100% iterative reconstruction (IR) methods. Fractal dimension was estimated using the differential box-counting algorithm applied to images of the uniform section of the ACR phantom. The two-dimensional NPS and the one-dimensional radially averaged NPS were estimated using established techniques. By changing the radiation dose, the effect of noise magnitude on fractal dimension was evaluated. The Spearman correlation between the fractal dimension and the frequency of the NPS peak was calculated. The number of images required to reliably estimate fractal dimension was determined and compared to the number of images required to estimate the NPS-peak frequency. The effect of region-of-interest (ROI) size on fractal dimension estimation was evaluated. The feasibility of estimating fractal dimension in an anthropomorphic phantom and a clinical image was also investigated, with the resulting fractal dimension compared to that estimated within the uniform section of the ACR phantom.
    Results: Fractal dimension was strongly correlated with the frequency of the peak of the radially averaged NPS curve, with a Spearman rank-order coefficient of 0.98 (P-value < 0.01) for ASIR 0%. The mean fractal dimension at ASIR 0% was 2.49 (Soft), 2.51 (Standard), 2.52 (Detail), 2.57 (Chest), 2.61 (Lung), 2.66 (Bone), and 2.7 (Edge). A reduction in fractal dimension was observed with increasing ASIR levels for all investigated reconstruction kernels. Fractal dimension was found to be independent of noise magnitude. Fractal dimension was successfully estimated from four ROIs of size 64 × 64 pixels or one ROI of 128 × 128 pixels. Fractal dimension was found to be sensitive to non-noise structures in the image, such as ring artifacts and anatomical structure. Fractal dimension estimated within a uniform region of an anthropomorphic phantom and a clinical head image matched that estimated within the ACR phantom for filtered back projection reconstruction.
    Conclusions: Fractal dimension correlated with the NPS-peak frequency and was independent of noise magnitude, suggesting that this scalar metric can be used to quantify the change in noise texture across reconstruction approaches. The results demonstrated that fractal dimension can be estimated from four 64 × 64-pixel ROIs or one 128 × 128-pixel ROI within a head CT image, which may make it amenable for quantifying noise texture within clinical images.
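    The differential box-counting step described above can be sketched in a few lines of Python. The box sizes, the gray-level rescaling, and the function name below are illustrative assumptions rather than details taken from the study:

```python
import numpy as np

def differential_box_counting(roi, box_sizes=(2, 4, 8, 16, 32)):
    """Estimate the fractal dimension of a 2-D image ROI with the
    differential box-counting (DBC) method.

    roi       : square 2-D array (e.g. a 64 x 64 or 128 x 128 uniform
                region of a CT image) -- illustrative input
    box_sizes : grid sizes s used for the log-log fit (assumed values)
    """
    roi = np.asarray(roi, dtype=float)
    M = min(roi.shape)
    G = 256.0  # number of gray levels after rescaling (assumption)
    rng = roi.max() - roi.min()
    g = (roi - roi.min()) / max(rng, 1e-12) * (G - 1.0)

    log_inv_r, log_N = [], []
    for s in box_sizes:
        if s >= M:
            continue
        h = s * G / M  # box height in gray levels for grid size s
        n_boxes = 0
        for i in range(0, M - M % s, s):
            for j in range(0, M - M % s, s):
                block = g[i:i + s, j:j + s]
                # boxes needed to span the intensity range of this block
                n_boxes += int(block.max() // h) - int(block.min() // h) + 1
        log_inv_r.append(np.log(M / s))  # log(1/r) with r = s/M
        log_N.append(np.log(n_boxes))

    # Fractal dimension is the slope of log(N_r) versus log(1/r).
    slope, _intercept = np.polyfit(log_inv_r, log_N, 1)
    return slope
```

    A Spearman rank correlation between dimensions estimated this way and the NPS-peak frequencies (e.g. via scipy.stats.spearmanr) reproduces the kind of comparison reported above.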

    Progress in Developing Hybrid RPCs: GEM-like Detectors with Resistive Electrodes

    We have recently developed an innovative detector of photons and charged particles: a GEM-like gaseous amplification structure with resistive electrodes instead of the commonly used metallic ones. This novel detector combines the best property of GEMs - the capability to operate in a cascaded mode and in poorly quenched gases - with the best property of RPCs: the protection against sparks. In this paper we briefly review our latest achievements in this direction; however, the main focus is on a new advanced design that allows large-area detectors to be manufactured by a screen printing technology. The proposed detector, depending on the application, can operate either in a GEM mode (electron multiplication in the holes only) or as a hybrid RPC with simultaneous amplification in the drift region and in the holes. The possible applications of this new detector will be discussed.

    A survey of attitudes toward visual training in the Northwest

    A questionnaire was sent to one third of the ophthalmologists and optometrists in Oregon and Washington. It contained questions pertaining to practitioner attitudes toward their educational backgrounds in visual training. Questions dealing with some of the controversial issues in visual training's role in strabismus and amblyopia therapy were also included. Lastly, profile information and data concerning the practice in general were gathered from each survey recipient. The respondent population was divided into groups by profession and extent of VT offered. The different groups' responses were then tabulated and statistically compared within and between professions.

    CT Automated Exposure Control Using A Generalized Detectability Index

    Purpose: Identifying an appropriate tube current setting can be challenging when using iterative reconstruction due to the varying relationship between spatial resolution, contrast, noise, and dose across different algorithms. This study developed and investigated the application of a generalized detectability index (d'gen) to determine the noise parameter to input to existing automated exposure control (AEC) systems to provide consistent image quality (IQ) across different reconstruction approaches.
    Methods: This study proposes a task-based AEC method using a generalized detectability index (d'gen). The proposed method leverages existing AEC methods that are based on a prescribed noise level. The generalized d'gen metric is calculated using lookup tables of the task-based modulation transfer function (MTF) and noise power spectrum (NPS). To generate the lookup tables, the American College of Radiology (ACR) CT accreditation phantom was scanned on a multidetector CT scanner (Revolution CT, GE Healthcare) at 120 kV with the tube current varied manually from 20 to 240 mAs. Images were reconstructed using a reference reconstruction algorithm and four levels of an in-house iterative reconstruction algorithm with different regularization strengths (IR1-IR4). The task-based MTF and NPS were estimated from the measured images to create lookup tables of scaling factors that convert between d'gen and noise standard deviation. The performance of the proposed d'gen-AEC method in providing a desired IQ level over a range of iterative reconstruction algorithms was evaluated using the ACR phantom with an elliptical shell and using a human reader evaluation of anthropomorphic phantom images.
    Results: The study of the ACR phantom with the elliptical shell demonstrated reasonable agreement between the d'gen predicted by the lookup table and the d' measured in the images, with a mean absolute error of 15% across all dose levels and a maximum error of 45% at the lowest dose level with the elliptical shell. For the anthropomorphic phantom study, the mean reader scores for images resulting from the d'gen-AEC method were 3.3 (reference image), 3.5 (IR1), 3.6 (IR2), 3.5 (IR3), and 2.2 (IR4). When using the d'gen-AEC method, the observers' IQ scores for the reference reconstruction were statistically equivalent to the scores for the IR1, IR2, and IR3 iterative reconstructions (P > 0.35). The d'gen-AEC method achieved this equivalent IQ at lower dose for the IR scans compared to the reference scans.
    Conclusions: A novel AEC method, based on a generalized detectability index, was investigated. The proposed method can be used with some existing AEC systems to derive the tube current profile for iterative reconstruction algorithms. The results provide preliminary evidence that the proposed d'gen-AEC method can produce similar IQ across different iterative reconstruction approaches at different dose levels.
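    The abstract builds d'gen from task-based MTF and NPS lookup tables. As a rough illustration of the underlying idea (not the paper's exact generalization), a standard non-prewhitening detectability index can be computed from those two quantities as in the hypothetical sketch below:

```python
import numpy as np

def npw_detectability(task_ft, mtf, nps, df):
    """Non-prewhitening (NPW) detectability index d' on a 2-D
    spatial-frequency grid (illustrative function, not the paper's d'gen).

    task_ft : Fourier transform W(u, v) of the detection task function
    mtf     : task-based MTF(u, v) sampled on the same grid
    nps     : noise power spectrum NPS(u, v) on the same grid
    df      : area of one frequency bin (du * dv) for the discrete integrals
    """
    signal = np.abs(task_ft) ** 2 * mtf ** 2
    numerator = (signal.sum() * df) ** 2        # [∫ |W|^2 MTF^2]^2
    denominator = (signal * nps).sum() * df     # ∫ |W|^2 MTF^2 NPS
    return float(np.sqrt(numerator / denominator))
```

    A table of such values, computed per kernel and reconstruction setting, can then be inverted so that a target d'gen maps to the noise standard deviation that an existing noise-based AEC system accepts as its input, which is the conversion the abstract describes.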

    The balance between the pro-inflammatory effect of plasma noradrenaline and the anti-inflammatory effect of neuronal noradrenaline determines the peripheral effect of noradrenaline

    Perfusion experiments on an isolated canine lateral saphenous vein segment preparation have shown that noradrenaline causes potent, flow-dependent effects, at a threshold concentration comparable to that of plasma noradrenaline, when it stimulates the segment by diffusion from its microcirculation (vasa vasorum). The effects caused are opposite to those that neuronal noradrenaline causes in vivo. In the light of the principle that all information is transmitted in patterns that need contrast to be detected (star patterns need darkness; sound patterns, quietness), this has generated the hypothesis that plasma noradrenaline provides the obligatory contrast that tissues need to detect and respond to the regulatory information encrypted in the diffusion pattern of neuronal noradrenaline. Based on the implications of that hypothesis, the controlled variable of the peripheral noradrenergic system is believed to be the maintenance of a set-point balance between the contrasting effects of plasma and neuronal noradrenaline on a tissue. The hypothalamic sympathetic centres are believed to monitor that balance through the level of afferent sympathetic traffic they receive from a tissue, and to correct any deviation they detect in the balance by adjusting the level of efferent sympathetic input they project to the tissue. The failure of the centres to maintain the correct balance, for reasons intrinsic or extrinsic to themselves, is believed to be responsible for degenerative and genetic disorders. When the failure polarises the balance in favour of the effect of plasma noradrenaline, it is believed to cause inflammatory diseases such as dilator cardiac failure, renal hypertension, varicose veins and aneurysms. When it polarises the balance in favour of the effect of neuronal noradrenaline, it is believed to cause genetic diseases such as hypertrophic cardiopathy, pulmonary hypertension and stenoses. And when, in pregnancy, a factor causes the polarity to favour plasma noradrenaline in all the maternal tissues except the uterus and conceptus, where it favours neuronal noradrenaline, that is believed to cause preeclampsia.

    Financialisation at a Watershed in the USA

    In the period following the Great Recession of 2007-9, the financialisation of the US economy reached a watershed characterised by stagnant financial profits, falling proportions of financial sector and mortgage debt, and a rising proportion of public debt. The main macroeconomic indicators of financialisation in the USA show structural breaks that can be dated around the period of the Great Recession. The reliance of households on the formal financial system appears to have weakened for the first time since the early 1980s. The financial sector has lacked the dynamism of the previous three decades, becoming more reliant on government. The state has increased its own indebtedness and supported large financial institutions via unconventional monetary policy measures. At the same time, state intervention has tightened the regulatory framework for big banks. The future path of financialisation in the USA will depend heavily on government policy with regard to state debt and financial regulation, although the scope for boosting financialisation is narrow.