
    A common visual metric for approximate number and density.

    There is considerable interest in how humans estimate the number of objects in a scene, in the context of an extensive literature on how we estimate the density (i.e., spacing) of objects. Here, we show that our sense of number and our sense of density are intertwined. Presented with two patches, observers found it more difficult to spot differences in either density or numerosity when the patches were mismatched in overall size, and their errors were consistent with larger patches appearing both denser and more numerous. We propose that density is estimated using the relative response of mechanisms tuned to low and high spatial frequencies (SFs), because energy at high SFs is largely determined by the number of objects, whereas low-SF energy depends more on the area occupied by elements. This measure is biased by overall stimulus size in the same way as human observers are, and by estimating number using the same measure scaled by relative stimulus size, we can explain all of our results. This model is a simple, biologically plausible common metric for perceptual number and density.
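
    The proposed measure lends itself to a compact numerical sketch. The Python below is an illustration, not the authors' code: it compares Fourier energy at high versus low spatial frequencies, with the band cutoff and the area scaling for the number estimate chosen as assumptions for demonstration.

    ```python
    # Illustrative sketch of a relative-SF-energy density metric
    # (assumed details; not the authors' implementation).
    import numpy as np

    def sf_energy_ratio(patch, cutoff=0.1):
        """Ratio of high- to low-spatial-frequency power in a 2-D patch.

        cutoff is in cycles per sample; its value here is an arbitrary
        illustrative choice.
        """
        power = np.abs(np.fft.fft2(patch - patch.mean())) ** 2
        fy = np.fft.fftfreq(patch.shape[0])[:, None]
        fx = np.fft.fftfreq(patch.shape[1])[None, :]
        radius = np.sqrt(fx ** 2 + fy ** 2)
        return power[radius >= cutoff].sum() / power[radius < cutoff].sum()

    def number_estimate(patch, reference_area, cutoff=0.1):
        """Density measure scaled by relative stimulus size, as proposed."""
        area = patch.shape[0] * patch.shape[1]
        return sf_energy_ratio(patch, cutoff) * area / reference_area
    ```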

    Can Image Enhancement Allow Radiation Dose to Be Reduced Whilst Maintaining the Perceived Diagnostic Image Quality Required for Coronary Angiography?

    Objectives: The aim of this research was to quantify the reduction in radiation dose facilitated by image processing alone for percutaneous coronary intervention (PCI) patient angiograms, without reducing the perceived image quality required to confidently make a diagnosis. Methods: Incremental amounts of image noise were added to five PCI angiograms, simulating acquisition at correspondingly lower dose levels (10-89% dose reduction). Sixteen observers with relevant experience scored the image quality of these angiograms in three states: with no image processing, and with each of two modern image processing algorithms applied. These algorithms are used on state-of-the-art and previous-generation cardiac interventional X-ray systems. Ordinal regression allowing for random effects, together with the delta method, was used to quantify the dose reduction made possible by the processing algorithms at equivalent image quality scores. Results: Observers rated images processed with the state-of-the-art and previous-generation algorithms at 24.9% and 15.6% dose reduction, respectively, as equivalent in quality to the unenhanced full-dose images. The dose reduction facilitated by the state-of-the-art image processing relative to previous-generation processing was 10.3%. Conclusions: The results demonstrate that a statistically significant dose reduction can be achieved with no loss in perceived image quality using modern image enhancement; the most recent processing algorithm was more effective in preserving image quality at lower doses. Advances in knowledge: Image enhancement was shown to maintain perceived image quality in coronary angiography at a reduced radiation dose, using computer software to produce synthetic images from real angiograms that simulate a reduction in dose.
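
    The dose-simulation step can be illustrated with a short sketch. This is a hedged approximation, not the study's software: it assumes quantum noise whose variance scales inversely with dose, and injects the shortfall as Gaussian noise.

    ```python
    # Sketch: synthesise a lower-dose angiogram frame by adding noise
    # (the Gaussian 1/dose variance model is an assumption, not the
    # paper's method).
    import numpy as np

    def simulate_reduced_dose(frame, dose_fraction, sigma_full=5.0, rng=None):
        """Return frame as if acquired at dose_fraction of the full dose.

        sigma_full (assumed) is the noise std at full dose; the injected
        noise tops total variance up to sigma_full**2 / dose_fraction.
        """
        rng = np.random.default_rng() if rng is None else rng
        extra_sigma = sigma_full * np.sqrt(1.0 / dose_fraction - 1.0)
        return frame + rng.normal(0.0, extra_sigma, size=frame.shape)

    # e.g. an 89% dose reduction, the largest level used in the study:
    # degraded = simulate_reduced_dose(frame, dose_fraction=0.11)
    ```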

    Parametric study of EEG sensitivity to phase noise during face processing

    Background: The present paper examines the visual processing speed of complex objects, here faces, by mapping the relationship between object physical properties and single-trial brain responses. Measuring visual processing speed is challenging because uncontrolled physical differences that co-vary with object categories might affect brain measurements, thus biasing our speed estimates. Recently, we demonstrated that early event-related potential (ERP) differences between faces and objects are preserved even when images differ only in phase information and amplitude spectra are equated across image categories. Here, we use a parametric design to study how early ERPs to faces are shaped by phase information. Subjects performed a two-alternative forced-choice discrimination between two faces (Experiment 1) or textures (two control experiments). All stimuli had the same amplitude spectrum and were presented at 11 phase noise levels, varying from 0% to 100% in 10% increments, using a linear phase interpolation technique. Single-trial ERP data from each subject were analysed using a multiple linear regression model. Results: Our results show that sensitivity to phase noise in faces emerges progressively in a short time window between the P1 and the N170 ERP visual components. The sensitivity to phase noise starts at about 120–130 ms after stimulus onset and continues for another 25–40 ms. This result was robust both within and across subjects. A control experiment using pink noise textures, which had the same second-order statistics as the faces used in Experiment 1, demonstrated that the sensitivity to phase noise observed for faces cannot be explained by the presence of global image structure alone. A second control experiment used wavelet textures that were matched to the face stimuli in terms of second- and higher-order image statistics. Results from this experiment suggest that higher-order statistics of faces are necessary but not sufficient to obtain the sensitivity to phase noise function observed in response to faces. Conclusion: Our results constitute the first quantitative assessment of the time course of phase information processing by the human visual brain. We interpret our results in a framework that focuses on image statistics and single-trial analyses.
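
    The stimulus manipulation can be sketched in a few lines. The blend below is a simplification (a naive linear mix of wrapped phase angles, whereas the published interpolation treats phase circularity more carefully) and is offered only as an assumed illustration.

    ```python
    # Assumed illustration of phase-noise stimuli: keep the amplitude
    # spectrum fixed, blend original and random phase by a noise level.
    import numpy as np

    def phase_noise(image, noise_level, rng=None):
        """noise_level 0.0 = intact image, 1.0 = fully phase-scrambled."""
        rng = np.random.default_rng() if rng is None else rng
        f = np.fft.fft2(image)
        amplitude, phase = np.abs(f), np.angle(f)
        random_phase = rng.uniform(-np.pi, np.pi, size=phase.shape)
        blended = (1.0 - noise_level) * phase + noise_level * random_phase
        return np.real(np.fft.ifft2(amplitude * np.exp(1j * blended)))

    # 11 noise levels from 0% to 100% in 10% increments, as in the paper:
    # stimuli = [phase_noise(face, level / 10) for level in range(11)]
    ```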

    The pharmaceutical use of permethrin: Sources and behavior during municipal sewage treatment

    Permethrin entered use in the 1970s as an insecticide in a wide range of applications, including agriculture, horticulture, and forestry, and its use has since been restricted. In the 21st century, the presence of permethrin in the aquatic environment has been attributed to its use as a human and veterinary pharmaceutical, in particular as a pediculicide, in addition to other uses such as a moth-proofing agent. However, as a consequence of its toxicity to fish, the sources of permethrin and its fate and behavior during wastewater treatment are topics of concern. This study established that high overall removal of permethrin (approximately 90%) was achieved during wastewater treatment and that this was strongly dependent on the extent of biological degradation in secondary treatment, with more limited subsequent removal in tertiary treatment processes. Estimated sources of permethrin in the catchment matched well with measured values in crude sewage and indicated that domestic use accounted for more than half of the load to the treatment works. However, removal may not be consistent enough to achieve the environmental quality standards now being derived in many countries, even where tertiary treatment processes are applied.

    Governors and directors: Competing models of corporate governance

    Why do we use the term ‘corporate governance’ rather than ‘corporate direction’? Early British joint stock companies were normally managed by a single ‘governor’. The ‘court of governors’ or ‘board of directors’ emerged slowly as the ruling body for companies. By the nineteenth century, however, companies were typically run by directors, while not-for-profit entities such as hospitals, schools and charitable bodies had governors. The nineteenth century saw steady refinement of the roles of company directors, often in response to corporate scandals, with a gradual change from the notion of the director as a ‘representative shareholder’ to the directors being seen collectively as ‘representatives of the shareholders’. Governors in not-for-profit entities, however, were regarded as having broader responsibilities. The term ‘governance’ itself suggests that corporate boards should be studied as ‘political’ entities rather than merely through economic lenses such as agency theory.

    An artificial neural network stratifies the risks of reintervention and mortality after endovascular aneurysm repair: a retrospective observational study

    Background: Lifelong surveillance after endovascular repair (EVAR) of abdominal aortic aneurysms (AAA) is considered mandatory to detect potentially life-threatening endograft complications. A minority of patients require reintervention but cannot be predictively identified by existing methods. This study aimed to improve the prediction of endograft complications and mortality through the application of machine-learning techniques. Methods: Patients undergoing EVAR at two centres were studied from 2004 to 2010. Pre-operative aneurysm morphology was quantified and endograft complications were recorded up to 5 years following surgery. An artificial neural network (ANN) approach was used to predict whether patients would be at low or high risk of endograft complications (aortic/limb) or mortality. Centre 1 data were used for training and centre 2 data for validation. ANN performance was assessed by Kaplan-Meier analysis comparing the incidence of aortic complications, limb complications, and mortality in patients predicted to be low-risk versus those predicted to be high-risk. Results: 761 patients aged 75 ± 7 years underwent EVAR. Mean follow-up was 36 ± 20 months. An ANN was created from morphological features including angulation, length, areas, diameters, volume, and tortuosity of the aneurysm neck, sac, and iliac segments. The ANN models predicted endograft complications and mortality with excellent discrimination between the low-risk and high-risk groups. In external validation, the 5-year rates of freedom from aortic complications, limb complications, and mortality were 95.9% vs 67.9%, 99.3% vs 92.0%, and 87.9% vs 79.3%, respectively (p < 0.001). Conclusion: This study presents ANN models that stratify the 5-year risk of endograft complications or mortality using routinely available pre-operative data.
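
    A minimal sketch of this train-on-one-centre, validate-on-the-other design is given below. The network size, feature count, and placeholder data are all assumptions for illustration, not the study's model.

    ```python
    # Minimal sketch (architecture, features, and data are assumptions):
    # a small neural network trained on centre-1 morphology, then
    # externally validated on centre 2.
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    # Placeholder data: rows = patients, columns = pre-operative
    # morphological features (neck/sac/iliac angulation, lengths, areas,
    # diameters, volume, tortuosity); label 1 = complication within 5 years.
    X_c1, y_c1 = rng.normal(size=(500, 10)), rng.integers(0, 2, 500)
    X_c2, y_c2 = rng.normal(size=(261, 10)), rng.integers(0, 2, 261)

    model = make_pipeline(
        StandardScaler(),
        MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
    )
    model.fit(X_c1, y_c1)                         # train on centre 1
    high_risk = model.predict(X_c2).astype(bool)  # validate on centre 2
    print(f"flagged high-risk: {high_risk.mean():.1%}")
    ```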

    Fenofibrate in the management of AbdoMinal aortic anEurysm (FAME): Study protocol for a randomised controlled trial

    Background: Abdominal aortic aneurysm (AAA) is a slowly progressive destructive process of the main abdominal artery. Experimental studies indicate that fibrates exert beneficial effects on AAAs by mechanisms involving both serum lipid modification and favourable changes to the AAA wall. Methods/design: Fenofibrate in the management of AbdoMinal aortic anEurysm (FAME) is a multicentre, randomised, double-blind, placebo-controlled clinical trial to assess the effect of orally administered therapy with fenofibrate on key pathological markers of AAA in patients undergoing open AAA repair. A total of 42 participants scheduled for an elective open AAA repair will be randomly assigned to either 145 mg of fenofibrate per day or identical placebo for a minimum period of 2 weeks prior to surgery. Primary outcome measures will be macrophage number and osteopontin (OPN) concentration within the AAA wall as well as serum concentrations of OPN. Secondary outcome measures will include levels of matrix metalloproteinases and proinflammatory cytokines within the AAA wall, periaortic fat and intramural thrombus and circulating concentrations of AAA biomarkers. Discussion: At present, there is no recognised medical therapy to limit AAA progression. The FAME trial aims to assess the ability of fenofibrate to alter tissue markers of AAA pathology. Trial registration: Australian New Zealand Clinical Trials Registry, ACTRN12612001226897. Registered on 20 November 2012.

    Psychophysics with children: Investigating the effects of attentional lapses on threshold estimates

    When assessing the perceptual abilities of children, researchers tend to use psychophysical techniques designed for adults. However, children’s poorer attentiveness might bias the threshold estimates obtained by these methods. Here, we obtained speed discrimination threshold estimates from 6- to 7-year-old children in UK Key Stage 1 (KS1), 7- to 9-year-old children in Key Stage 2 (KS2), and adults using three psychophysical procedures: QUEST, a 1-up 2-down Levitt staircase, and the Method of Constant Stimuli (MCS). We estimated inattentiveness using responses to “easy” catch trials. As expected, children had higher threshold estimates and made more errors on catch trials than adults. Lower threshold estimates were obtained from psychometric functions fit to the data in the QUEST condition than in the MCS and Levitt staircase conditions, and the threshold estimates obtained by fitting a psychometric function to the QUEST data were also lower than those given by the QUEST mode. This suggests that threshold estimates cannot be compared directly across methods. Differences between the procedures did not vary significantly with age group. Simulations indicated that inattentiveness biased threshold estimates particularly when thresholds were computed as the QUEST mode or as the average of staircase reversals. In contrast, thresholds estimated by post-hoc psychometric function fitting were less biased by attentional lapses. Our results suggest that some psychophysical methods are more robust than others to attentional lapses, which has important implications for assessing perception in children and clinical groups.
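
    The simulation logic can be sketched as follows; the Weibull observer, step size, and reversal-averaging rule are assumptions for illustration, not the paper's exact settings.

    ```python
    # Toy simulation (all parameters are assumptions): an observer with a
    # Weibull psychometric function and a lapse rate responds in a
    # 1-up 2-down staircase; lapses inflate reversal-based thresholds.
    import numpy as np

    rng = np.random.default_rng(1)

    def p_correct(x, threshold=1.0, slope=3.0, lapse=0.0):
        # 2AFC Weibull: 0.5 guessing floor; lapses force a chance response
        pf = 1.0 - 0.5 * np.exp(-(x / threshold) ** slope)
        return (1.0 - lapse) * pf + lapse * 0.5

    def staircase_threshold(lapse, n_trials=400, start=4.0, step=1.1):
        x, run, going_down, reversals = start, 0, True, []
        for _ in range(n_trials):
            if rng.random() < p_correct(x, lapse=lapse):
                run += 1
                if run == 2:                 # two correct -> make it harder
                    run = 0
                    if not going_down:
                        reversals.append(x)  # direction changed: a reversal
                    going_down, x = True, x / step
            else:                            # one wrong -> make it easier
                run = 0
                if going_down:
                    reversals.append(x)
                going_down, x = False, x * step
        return np.mean(reversals[-8:])       # average the last reversals

    print(staircase_threshold(lapse=0.0))    # attentive observer
    print(staircase_threshold(lapse=0.1))    # 10% lapses: higher estimate
    ```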

    Hypoglycemia and the Origin of Hypoxia-Induced Reduction in Human Fetal Growth

    The best-known reproductive consequence of residence at high altitude (HA, >2700 m) is a reduction in fetal growth. Reduced fetoplacental oxygenation is an underlying cause of pregnancy pathologies, including intrauterine growth restriction and preeclampsia, which are more common at HA. Altitude is therefore a natural experimental model for studying the etiology of pregnancy pathophysiologies. We have shown that the proximate cause of decreased fetal growth is not reduced oxygen availability, delivery, or consumption. We therefore asked whether glucose, the primary substrate for fetal growth, might be decreased and/or whether altered fetoplacental glucose metabolism might account for reduced fetal growth at HA. Doppler and ultrasound were used to measure maternal uterine and fetal umbilical blood flows in 69 residents of 400 m and 58 residents of 3600 m. Arterial and venous blood samples from mother and fetus were collected at elective cesarean delivery and analyzed for glucose, lactate, and insulin. Maternal delivery and fetal uptake of oxygen and glucose were calculated. The maternal arterial-venous glucose concentration difference was greater at HA. However, umbilical venous and arterial glucose concentrations were markedly decreased, resulting in lower glucose delivery at 3600 m. Fetal glucose consumption was reduced by more than 28% but strongly correlated with glucose delivery, highlighting the relevance of glucose concentration to fetal uptake. At altitude, fetal lactate levels were increased, insulin concentrations decreased, and the expression of the GLUT1 glucose transporter protein in the placental basal membrane was reduced. Our results support the conclusion that preferential anaerobic consumption of glucose by the placenta at high altitude spares oxygen for fetal use but limits glucose availability for fetal growth. Thus, reduced fetal growth at high altitude is associated with fetal hypoglycemia, hypoinsulinemia, and a trend towards lactacidemia. Our data support a placentally mediated reduction in glucose transport as an initiating factor for reduced fetal growth under conditions of chronic hypoxemia.
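
    The delivery and uptake calculations follow the Fick principle, which a short sketch makes concrete (the flow and concentration values below are placeholders, not the study's data).

    ```python
    # Fick principle: uptake = blood flow x arteriovenous concentration
    # difference (placeholder numbers for illustration only).
    def fick_uptake(flow_ml_min, c_in_mg_dl, c_out_mg_dl):
        """Uptake in mg/min from flow (mL/min) and concentrations (mg/dL)."""
        return flow_ml_min * (c_in_mg_dl - c_out_mg_dl) / 100.0  # dL -> mL

    # Fetal glucose uptake: umbilical flow x (umbilical venous - arterial);
    # glucose delivery would be umbilical flow x umbilical venous level.
    print(fick_uptake(flow_ml_min=250.0, c_in_mg_dl=70.0, c_out_mg_dl=55.0))
    ```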