
    LOFAR observations of radio burst source sizes and scattering in the solar corona

    Low-frequency radio wave scattering and refraction can have a dramatic effect on the observed size and position of radio sources in the solar corona. The scattering and refraction are thought to be due to fluctuations in electron density caused by turbulence; determining the true radio source size can therefore provide information on turbulence in the coronal plasma. However, the lack of high-spatial-resolution radio interferometric observations at low frequencies, such as with the LOw Frequency ARray (LOFAR), has made it difficult to determine the true radio source size and the level of radio wave scattering. Here we directly fit the visibilities of a LOFAR observation of a Type IIIb radio burst with an elliptical Gaussian to determine its source size and position. This circumvents the need to image the source and then deconvolve LOFAR's point spread function, which can introduce spurious effects in the source size and shape. For a burst at 34.76 MHz, we find full width at half maximum (FWHM) sizes along the major and minor axes of 18.8′ ± 0.1′ and 10.2′ ± 0.1′, respectively, at a plane-of-sky heliocentric distance of 1.75 R⊙. Our results suggest that the level of density fluctuations in the solar corona is the main cause of the scattering of radio waves, resulting in large source sizes. However, the magnitude of the density fluctuation level ε may be smaller than has previously been derived from observations of radio wave scattering in tied-array images. Comment: 6 pages, 3 figures, accepted for publication in Astronomy & Astrophysics
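The visibility-domain fitting approach described above can be sketched with synthetic data. This is only an illustration of the principle, not the authors' pipeline: the Fourier transform of a Gaussian is again a Gaussian, so the source size can be fitted directly in the (u, v) plane without imaging or PSF deconvolution. The baseline distribution, noise level, and starting guesses below are all assumed.

```python
import numpy as np
from scipy.optimize import curve_fit

FWHM2SIG = 1.0 / (2.0 * np.sqrt(2.0 * np.log(2.0)))   # FWHM -> Gaussian sigma

def gauss_vis(uv, amp, fwhm_maj, fwhm_min, pa):
    """Visibility amplitude of an elliptical Gaussian source.

    The Fourier transform of a Gaussian is again a Gaussian, so the fit
    can be done directly on the visibilities, with no imaging step.
    uv is (2, N) in wavelengths; FWHMs are in radians; pa in radians.
    """
    u, v = uv
    up = u * np.cos(pa) + v * np.sin(pa)      # rotate into the source frame
    vp = -u * np.sin(pa) + v * np.cos(pa)
    s_maj, s_min = fwhm_maj * FWHM2SIG, fwhm_min * FWHM2SIG
    return amp * np.exp(-2.0 * np.pi**2 * ((up * s_maj)**2 + (vp * s_min)**2))

# Synthetic test: an 18.8' x 10.2' source on short, low-frequency baselines
arcmin = np.pi / (180.0 * 60.0)
rng = np.random.default_rng(0)
uv = rng.uniform(-300.0, 300.0, size=(2, 1000))   # baselines in wavelengths
vis = gauss_vis(uv, 1.0, 18.8 * arcmin, 10.2 * arcmin, 0.3)
vis += rng.normal(0.0, 0.01, vis.size)            # thermal noise

p0 = (1.0, 15.0 * arcmin, 8.0 * arcmin, 0.0)
popt, pcov = curve_fit(gauss_vis, uv, vis, p0=p0)
print(popt[1] / arcmin, popt[2] / arcmin)         # recovered FWHMs (arcmin)
```

In practice the fit would be done on complex visibilities with a position offset as well; the real-valued, centred model above is the minimal version of the idea.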

    Increased antigen specific T cell numbers in the absence of altered migration or division rates as a result of mucosal cholera toxin administration

    Cholera toxin (CT) is a mucosal adjuvant capable of inducing strong immune responses to co-administered antigens following oral or intranasal immunization of mice. To date, the direct effect of CT on antigen-specific CD4(+) T cell migration and proliferation profiles in vivo has not been well characterized. In this study, the effect of CT on the migration pattern and proliferative responses of adoptively transferred CD4(+) TCR-transgenic T cells in orally or intranasally vaccinated mice was analyzed by flow cytometry. GFP-expressing or CFSE-labeled OT-II lymphocytes were adoptively transferred to naïve C57BL/6 mice, and the mice were subsequently vaccinated with OVA, with or without CT, via the oral or intranasal route. CT did not alter the migration pattern of antigen-specific T cells, regardless of the route of immunization, but increased the number of transgenic CD4(+) T cells in draining lymphoid tissue. This increase was not due to cells undergoing more rounds of cellular division in vivo, suggesting that CT may exert an indirect adjuvant effect on CD4(+) T cells. The findings reported here suggest that CT functions as a mucosal adjuvant by increasing the number of antigen-specific CD4(+) T cells independently of their migration pattern or kinetics of cellular division. Grant support was received from the National Health and Medical Research Council of Australia (NHMRC). OLW is a recipient of an R.D. Wright Career Development Award.
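The division-tracking readout used above (CFSE labelling) rests on the fact that CFSE is split roughly equally between daughter cells, so its fluorescence halves with each division and the division count can be read off a dilution profile. A minimal sketch of that conversion, with hypothetical peak intensities rather than data from the study:

```python
import numpy as np

def divisions_from_cfse(mfi_undivided, mfi_observed):
    """CFSE fluorescence halves each division, so the division number
    of a peak is log2(undivided MFI / observed MFI)."""
    return np.log2(mfi_undivided / np.asarray(mfi_observed, dtype=float))

# Hypothetical peak intensities from a CFSE dilution profile
peaks = [1000, 500, 250, 125]
print(divisions_from_cfse(1000, peaks))  # -> [0. 1. 2. 3.]
```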

    NCAM: a surface marker for human small cell lung cancer cells

    Immunocytochemical and immunochemical techniques were used to study the expression of the neural cell adhesion molecule (NCAM) by human lung cancer cell lines. Intense surface staining for NCAM was found at the light and electron microscopic levels on small cell lung cancer cells. The NCAM polypeptide of Mr 140000 (NCAM 140) was detected by immunoblotting in all 7 small cell lung cancer cell lines examined and in one of the two closely related large cell cancer cell lines; it was not detected in cell lines obtained from one patient with a mesothelioma, in two cases of adenocarcinoma, or in two cases of squamous cell cancer. In contrast, neuron-specific enolase was found by immunoblotting in all the lung cancer cell lines tested, and synaptophysin in all but the adenocarcinoma cell lines. These antigens were localized intracellularly. The specific expression of NCAM 140 by human small and large cell lung carcinomas suggests its potential as a diagnostic marker.

    Evaluating the utility of knowledge-based planning for clinical trials using the TROG 08.03 post prostatectomy radiation therapy planning data

    Background and purpose: Poor quality radiotherapy can detrimentally affect outcomes in clinical trials. Our purpose was to explore the potential of knowledge-based planning (KBP) for quality assurance (QA) in clinical trials. Materials and methods: Using 30 in-house post-prostatectomy radiation treatment (PPRT) plans, an iterative KBP model was created according to the multicentre clinical trial protocol, delivering 64 Gy in 32 fractions. KBP was used to replan 137 plans. The knowledge-based (KB) plans were evaluated for their ability to fulfil the trial constraints and were compared against their corresponding original treatment plans (OTP). A second analysis between only the 72 inversely planned OTPs (IP-OTPs) and their corresponding KB plans was performed. Results: All dose constraints were met in 100% of KB plans versus 69% of OTPs. KB plans demonstrated significantly less variation in PTV coverage (mean dose range: KB plans 64.1 Gy–65.1 Gy vs OTP 63.1 Gy–67.3 Gy, p < 0.01). KBP resulted in significantly lower doses to OARs. Rectal V60Gy and V40Gy were 17.7% vs 27.7% (p < 0.01) and 40.5% vs 53.9% (p < 0.01) for KB plans and OTP respectively. Left femoral head (FH) V45Gy and V35Gy were 0.4% vs 7.4% (p < 0.01) and 7.9% vs 34.9% (p < 0.01) respectively. In the second analysis, plan improvements were maintained. Conclusions: KBP created high quality PPRT plans using the data from a multicentre clinical trial in a single optimisation. It is a powerful tool for use in clinical trials for patient-specific QA, to reduce dose to surrounding OARs and variations in plan quality which could impact on clinical trial outcomes.
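The dose–volume constraints compared above (V60Gy, V40Gy, V45Gy, V35Gy) are standard DVH metrics: the percentage of a structure's volume receiving at least the stated dose. A minimal sketch with hypothetical voxel doses, assuming equal-volume voxels (the trial's actual dose grids and structures are not reproduced here):

```python
import numpy as np

def v_dose(dose_gy, threshold_gy):
    """Percent of structure volume receiving at least `threshold_gy`
    (the DVH metric written V60Gy, V40Gy, ... in the trial constraints).
    Assumes all voxels have equal volume."""
    dose_gy = np.asarray(dose_gy, dtype=float)
    return 100.0 * np.mean(dose_gy >= threshold_gy)

# Hypothetical rectum voxel doses (Gy)
rectum = np.array([10, 35, 42, 55, 61, 63, 20, 48])
print(v_dose(rectum, 60))  # -> 25.0  (2 of 8 voxels receive >= 60 Gy)
print(v_dose(rectum, 40))  # -> 62.5  (5 of 8 voxels receive >= 40 Gy)
```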

    Fundus topographical distribution patterns of ocular toxoplasmosis

    BACKGROUND: To establish topographic maps and determine fundus distribution patterns of ocular toxoplasmosis (OT) lesions. METHODS: In this retrospective study, patients who presented with OT to ophthalmology clinics in four countries (Argentina, Turkey, UK, USA) were included. The size, shape and location of primary (1°)/recurrent (2°) and active/inactive lesions were converted into a two-dimensional retinal chart using retinal drawing software. A final contour map of the merged image charts was then created using a custom MATLAB program. Descriptive analyses were performed. RESULTS: 984 lesions in 514 eyes of 464 subjects (53% women) were included. The mean area of all 1° and 2° lesions was 5.96±12.26 and 5.21±12.77 mm2, respectively. For the subset of eyes with both 1° and 2° lesions, 1° lesions were significantly larger than 2° lesions (5.52±6.04 mm2 vs 4.09±8.90 mm2, p=0.038). Mean distances from the foveola to 1° and 2° lesion centres were 6336±4267 and 5763±3491 µm, respectively. The majority of lesions were found in the temporal quadrant (p<0.001). The maximum overlap of all lesions was at 278 µm inferotemporal to the foveola. CONCLUSION: The 1° lesions were larger than 2° lesions. The 2° lesions were not significantly closer to the fovea than 1° lesions. The temporal quadrant and macular region were densely affected, underlining the vision-threatening nature of the disease.

    Nurse staffing and education and hospital mortality in nine European countries: a retrospective observational study

    Background Austerity measures and health-system redesign to minimise hospital expenditures risk adversely affecting patient outcomes. The RN4CAST study was designed to inform decision making about nursing, one of the largest components of hospital operating expenses. We aimed to assess whether differences in patient-to-nurse ratios and nurses' educational qualifications in nine of the 12 RN4CAST countries with similar patient discharge data were associated with variation in hospital mortality after common surgical procedures. Methods For this observational study, we obtained discharge data for 422 730 patients aged 50 years or older who underwent common surgeries in 300 hospitals in nine European countries. Administrative data were coded with a standard protocol (variants of the ninth or tenth versions of the International Classification of Diseases) to estimate 30 day in-hospital mortality by use of risk adjustment measures including age, sex, admission type, 43 dummy variables indicating surgery type, and 17 dummy variables indicating comorbidities present at admission. Surveys of 26 516 nurses practising in study hospitals were used to measure nurse staffing and nurse education. We used generalised estimating equations to assess the effects of nursing factors on the likelihood of surgical patients dying within 30 days of admission, before and after adjusting for other hospital and patient characteristics. Findings An increase in a nurse's workload by one patient increased the likelihood of an inpatient dying within 30 days of admission by 7% (odds ratio 1·068, 95% CI 1·031–1·106), and every 10% increase in bachelor's degree nurses was associated with a decrease in this likelihood by 7% (0·929, 0·886–0·973). 
These associations imply that patients in hospitals in which 60% of nurses had bachelor's degrees and nurses cared for an average of six patients would have almost 30% lower mortality than patients in hospitals in which only 30% of nurses had bachelor's degrees and nurses cared for an average of eight patients. Interpretation Nurse staffing cuts to save money might adversely affect patient outcomes. An increased emphasis on bachelor's education for nurses could reduce preventable hospital deaths. Funding European Union's Seventh Framework Programme, National Institute of Nursing Research, National Institutes of Health, the Norwegian Nurses Organisation and the Norwegian Knowledge Centre for the Health Services, Swedish Association of Health Professionals, the regional agreement on medical training and clinical research between Stockholm County Council and Karolinska Institutet, Committee for Health and Caring Sciences and Strategic Research Program in Care Sciences at Karolinska Institutet, Spanish Ministry of Science and Innovation.
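The "almost 30% lower mortality" figure follows from compounding the two adjusted odds ratios reported in the Findings. A sketch of that arithmetic; the multiplicative combination is the assumption implied by the underlying logistic model:

```python
# Reproducing the abstract's comparison from its two reported odds ratios:
# 1.068 per additional patient per nurse, and 0.929 per 10-percentage-point
# increase in the share of bachelor's-educated nurses.
or_per_patient = 1.068
or_per_10pct_bsn = 0.929

def mortality_odds_ratio(workload_a, bsn_a, workload_b, bsn_b):
    """Odds of 30-day death in hospital A relative to hospital B,
    compounding the two adjusted odds ratios multiplicatively."""
    return (or_per_patient ** (workload_a - workload_b)
            * or_per_10pct_bsn ** ((bsn_a - bsn_b) / 10.0))

# 6 patients/nurse and 60% BSN vs 8 patients/nurse and 30% BSN
print(round(mortality_odds_ratio(6, 60, 8, 30), 3))  # -> 0.703, i.e. ~30% lower
```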

    Repeated magmatic intrusions at El Hierro Island following the 2011–2012 submarine eruption

    After more than 200 years of quiescence, an intense seismic swarm was detected in July 2011 beneath the center of El Hierro Island (Canary Islands), culminating on 10 October 2011 in a submarine eruption 2 km off the southern coast. Although the eruption officially ended on 5 March 2012, magmatic activity continued in the area. From June 2012 to March 2014, six earthquake swarms, indicative of magmatic intrusions, were detected underneath the island. We studied these post-eruption intrusive events using GPS and InSAR techniques to characterize the ground surface deformation produced by each intrusion and to determine the optimal source parameters (geometry, location, depth, volume change). Source inversions provide insight into the depth of the intrusions (~11–16 km) and the volume change associated with each of them (between 0.02 and 0.13 km3). During this period, >20 cm of uplift was detected in the central-western part of the island, corresponding to approximately 0.32–0.38 km3 of magma intruded beneath the volcano. We suggest that these intrusions result from deep magma migrating from the mantle and trapped at the mantle/lower crust discontinuity in the form of sill-like bodies. This study, using joint inversion of GPS and InSAR data in a post-eruption period, provides important insight into the characteristics of the magmatic plumbing system of El Hierro, an oceanic intraplate volcanic island.

    Effects of Dibutyryl Cyclic-AMP on Survival and Neuronal Differentiation of Neural Stem/Progenitor Cells Transplanted into Spinal Cord Injured Rats

    Neural stem/progenitor cells (NSPCs) have great potential as a cell replacement therapy for spinal cord injury. However, poor control over transplant cell differentiation and survival remains a major obstacle. In this study, we asked whether dibutyryl cyclic-AMP (dbcAMP), which has been shown to induce up to 85% in vitro differentiation of NSPCs into neurons, would enhance the survival of transplanted NSPCs through prolonged exposure, either in vitro or in vivo, via the controlled release of dbcAMP encapsulated within poly(lactic-co-glycolic acid) (PLGA) microspheres embedded within chitosan guidance channels. NSPCs seeded in fibrin scaffolds within the channels differentiated in vitro into βIII-tubulin-positive neurons, as shown by immunostaining and mRNA expression, in response to dbcAMP released from PLGA microspheres. After transplantation into spinal cord injured rats, the survival and differentiation of NSPCs were evaluated. Untreated NSPCs, NSPCs transplanted with dbcAMP-releasing microspheres, and NSPCs pre-differentiated with dbcAMP for 4 days in vitro were transplanted after rat spinal cord transection and assessed 2 and 6 weeks later. Interestingly, NSPC survival was highest in the dbcAMP pre-treated group, with approximately 80% survival at both time points, which is remarkable given that stem cell transplantation often results in less than 1% survival at similar times. Importantly, dbcAMP pre-treatment also resulted in the greatest proportion of NSPCs differentiated into neurons in vivo (37±4%), followed by dbcAMP-microsphere treated NSPCs (27±14%) and untreated NSPCs (15±7%). The reverse trend was observed for NSPC-derived oligodendrocytes and astrocytes, with these populations being highest in untreated NSPCs. 
This combination strategy of stem cell-loaded chitosan channels implanted in a fully transected spinal cord resulted in extensive axonal regeneration into the injury site, with improved functional recovery after 6 weeks in animals implanted with pre-differentiated stem cells in chitosan channels.

    Atlas of the Global Burden of Stroke (1990-2013): The GBD 2013 Study

    Background—World mapping is an important tool to visualize stroke burden and its trends in various regions and countries. Objectives—To show geographic patterns of incidence, prevalence, mortality, disability-adjusted life-years (DALYs) and years lived with disability (YLDs), and their trends, for ischemic stroke (IS) and hemorrhagic stroke (HS) worldwide for 1990 to 2013. Methodology—Stroke incidence, prevalence, mortality, DALYs and YLDs were estimated following the general approach of the Global Burden of Disease (GBD) 2010 study, with several important improvements in methods. Data were updated for mortality (through April 2014) and for stroke incidence, prevalence, case fatality, and severity (through 2013). Deaths were estimated using an ensemble modelling approach. A new software package, DisMod-MR 2.0, was used as part of a custom modelling process to estimate YLDs. All rates were age-standardized to new GBD estimates of the global population. All estimates were computed with 95% uncertainty intervals (UI). Results—Age-standardized incidence, mortality, prevalence and DALYs/YLDs declined from 1990 to 2013. However, the absolute number of people affected by stroke substantially increased across all countries over the same period, indicating that the global stroke burden continues to increase. There were significant geographical (country and regional) differences in stroke burden, with the majority of the burden borne by low- and middle-income countries. Conclusions—The global burden of stroke has continued to increase in spite of dramatic declines in age-standardized incidence, prevalence, mortality rates, and disability. Population growth and ageing have played an important role in the observed increase in stroke burden.
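The age-standardized rates discussed above come from direct standardization: each age group's rate is weighted by a standard population's share of that age group, which makes rates comparable across countries with different age structures. A minimal sketch with hypothetical rates and weights, not GBD data:

```python
import numpy as np

def age_standardized_rate(age_specific_rates, standard_pop_weights):
    """Direct age-standardization: weight each age group's rate by the
    standard population's share of that group, then sum. Weights are
    normalized so they need not sum exactly to 1."""
    r = np.asarray(age_specific_rates, dtype=float)
    w = np.asarray(standard_pop_weights, dtype=float)
    return float(np.sum(r * w / w.sum()))

# Hypothetical stroke incidence per 100 000 in three age bands (<45,
# 45-69, 70+), weighted by a hypothetical standard population
rates = [20.0, 150.0, 900.0]
weights = [0.6, 0.3, 0.1]
print(round(age_standardized_rate(rates, weights), 1))  # -> 147.0
```

A country with an older population has a larger crude rate for the same age-specific rates; standardizing to a common weight vector removes that effect, which is why the absolute case counts in the study can rise while standardized rates fall.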