
    Quantifying the Amount of Ice in Cold Tropical Cirrus Clouds

    How much ice is there in the Tropical Tropopause Layer (TTL), globally? How does one begin to answer that question? Clouds are currently the largest source of uncertainty in climate models, and the ice water content (IWC) of cold cirrus clouds is needed to understand the total water and radiation budgets of the upper troposphere and lower stratosphere (UT/LS). The Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) satellite, originally a "pathfinder" mission expected to last only three years, has now been operational for more than eight years. Lidar data from CALIPSO can provide information about how IWC is vertically distributed in the UT/LS, and about inter-annual variability and seasonal changes in cloud ice. However, cloud IWC is difficult to measure accurately with either remote or in situ instruments, because the IWC of cold cirrus clouds is derived from the particle cross-sectional area or the visible extinction coefficient. Assumptions must be made about the relationships among the area, volume, and density of ice particles with various crystal habits. Recently, numerous aircraft field campaigns have provided detailed information about cirrus ice water content from cloud probes. This presentation evaluates the assumptions made when creating the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) global IWC data set, using recently reanalyzed aircraft particle probe measurements of very cold, thin TTL cirrus from the 2006 CR-AVE campaign
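
    As a minimal sketch of the kind of assumption involved: retrievals of this type typically relate IWC to the visible extinction coefficient through an assumed power law, IWC = a * sigma**b. The coefficients below are illustrative placeholders, not the values used in the operational CALIOP product.

        # Hypothetical IWC-extinction power law; a and b are illustrative
        # placeholders, not the operational CALIOP coefficients.
        def iwc_from_extinction(sigma_km, a=0.1, b=1.2):
            """Ice water content (g m^-3) from a visible extinction
            coefficient (km^-1), under an assumed power-law relationship."""
            return a * sigma_km ** b

        # A thin TTL cirrus layer with extinction ~0.1 km^-1:
        print(iwc_from_extinction(0.1))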

    Antimicrobial resistance monitoring and surveillance in the meat chain: A report from five countries in the European Union and European Economic Area

    Background The emergence of antimicrobial resistance (AMR) in zoonotic foodborne pathogens (Salmonella, Campylobacter) and indicator microorganisms (E. coli, enterococci) is a major public health risk. Zoonotic bacteria that are resistant to antimicrobials are of special concern because they might compromise the effective treatment of infections in humans. Scope and approach In this review, the AMR monitoring and surveillance programmes in five selected countries within the European Union (EU) and the European Economic Area (EEA) are described. The sampling schemes, susceptibility testing for AMR identification, clinical breakpoints (clinical resistance) and epidemiological cut-off values (microbiological resistance) were considered, to reflect on the most important variations between and within food-producing animal species and between countries, and to identify the most effective approach to tackling and managing antimicrobial resistance in the food chain. Key findings and conclusions Science-based monitoring of AMR should encompass the whole food chain, be supported by public health surveillance, and be conducted in accordance with the 'Zoonoses Directive' (2003/99/EC). Such an approach encompasses integrated AMR monitoring in food animals, food and humans along the whole food (meat) chain continuum, i.e. pre-harvest (on-farm), harvest (in the abattoir) and post-harvest (at retail). Information on resistance to antimicrobials that are critically important (CIA) for human medicine is of particular importance
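
    The two resistance definitions mentioned above can be made concrete with a small, hedged sketch: an isolate is clinically resistant when its minimum inhibitory concentration (MIC) exceeds the clinical breakpoint, and microbiologically resistant (non-wild-type) when its MIC exceeds the epidemiological cut-off value (ECOFF). The values below are hypothetical, not EUCAST figures.

        # Hedged sketch: clinical vs microbiological resistance.
        def classify_resistance(mic, clinical_breakpoint, ecoff):
            """Classify an isolate by both resistance definitions."""
            return {
                "clinical_resistance": mic > clinical_breakpoint,
                "microbiological_resistance": mic > ecoff,
            }

        # Hypothetical isolate: MIC 0.5 mg/L, breakpoint 1 mg/L, ECOFF 0.25 mg/L.
        # Clinically susceptible, yet non-wild-type (microbiologically resistant).
        print(classify_resistance(0.5, clinical_breakpoint=1.0, ecoff=0.25))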

    Dynamic anoxic ferruginous conditions during the end-Permian mass extinction and recovery

    The end-Permian mass extinction, ~252 million years ago, is notable for a complex recovery period of ~5 Myr. Widespread euxinic (anoxic and sulfidic) oceanic conditions have been proposed both as an extinction mechanism and as an explanation for the protracted recovery period, yet the vertical distribution of anoxia in the water column and its temporal dynamics through this period are poorly constrained. Here we utilize Fe–S–C systematics integrated with palaeontological observations to reconstruct a complete ocean redox history for the Late Permian to Early Triassic, using multiple sections across a shelf-to-basin transect on the Arabian Margin (Neo-Tethyan Ocean). In contrast to findings from elsewhere, we show that anoxic non-sulfidic (ferruginous) conditions, rather than euxinic ones, were prevalent in the Neo-Tethys. The Arabian Margin record demonstrates the repeated expansion of ferruginous conditions, with the distal slope being the focus of anoxia at these times, as well as short-lived episodes of oxia that supported diverse biota
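
    As a hedged sketch of the iron-speciation logic behind Fe–S–C systematics: enrichment in highly reactive iron (FeHR/FeT) distinguishes anoxic from oxic deposition, and pyritisation of that reactive pool (FePy/FeHR) separates euxinic from ferruginous anoxia. The thresholds below are the widely quoted rules of thumb from the iron-speciation literature, not values taken from this study.

        # Hedged classifier using commonly cited iron-speciation thresholds.
        def classify_redox(fe_hr, fe_t, fe_py):
            ratio_hr = fe_hr / fe_t
            if ratio_hr < 0.22:
                return "oxic"
            if ratio_hr < 0.38:
                return "equivocal"
            # Anoxic: split on how much of the reactive pool is pyritised.
            if fe_py / fe_hr > 0.7:
                return "euxinic (anoxic, sulfidic)"
            return "ferruginous (anoxic, non-sulfidic)"

        print(classify_redox(fe_hr=0.5, fe_t=1.0, fe_py=0.1))  # -> ferruginous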

    A multilevel study of the determinants of area-level inequalities in colorectal cancer survival

    Background: In Australia, associations between geographic remoteness, socioeconomic disadvantage, and colorectal cancer (CRC) survival show that survival rates are lowest among residents of geographically remote regions and those living in disadvantaged areas. At present we know very little about the reasons for these inequalities, so our capacity to intervene to reduce them is limited. Methods/Design: This study, the first of its type in Australia, examines the association between CRC survival and key area- and individual-level factors. Specifically, we will use a multilevel framework to investigate the possible determinants of area- and individual-level inequalities in CRC survival and quantify the relative contribution of geographic remoteness, socioeconomic and demographic factors, disease stage, and access to diagnostic and treatment services to these inequalities. The multilevel analysis will be based on survival data relating to people diagnosed with CRC in Queensland between 1996 and 2005 (n = 22,723) from the Queensland Cancer Registry (QCR), area-level data from other data custodians such as the Australian Bureau of Statistics, and individual-level data from the QCR (including stage extracted from pathology records) and Queensland hospitals. For a subset of this period (2003 and 2004) we will utilise more detailed individual-level data (n = 1,966) covering a greater range of risk factors, drawn from a concurrent research study. Geocoding and spatial technology will be used to calculate road travel distances from patients' residences to treatment centres. The analyses will be conducted using a multilevel Cox proportional hazards model, with Level 1 comprising individual-level factors (e.g. occupation) and Level 2 comprising area-level indicators of remoteness and socioeconomic disadvantage. Discussion: This study focuses on health inequalities for rural and disadvantaged populations that have often been documented but remain poorly understood, a gap that limits our capacity to intervene. The study utilises and develops emerging statistical and spatial technologies that can then be applied to other cancers and health outcomes. Its findings will have direct implications for the targeting and resourcing of cancer control programs designed to reduce the burden of colorectal cancer, and for the provision of diagnostic and treatment services
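
    A full shared-frailty multilevel Cox model is usually fitted in specialised software; as a hedged approximation in Python, lifelines' CoxPHFitter can combine individual-level (Level 1) and area-level (Level 2) covariates while clustering standard errors by area. All column names and values below are hypothetical toy data, not QCR records.

        import pandas as pd
        from lifelines import CoxPHFitter

        # Hypothetical toy data: one row per patient.
        df = pd.DataFrame({
            "survival_months":   [12, 60, 24, 48, 36, 9],
            "died":              [1, 0, 0, 1, 1, 1],
            "occupation_manual": [1, 0, 1, 0, 1, 0],   # Level 1 (individual)
            "area_disadvantage": [3, 1, 4, 2, 5, 1],   # Level 2 (area)
            "area_id":           [10, 11, 10, 12, 11, 10],
        })

        cph = CoxPHFitter()
        # cluster_col forces robust (sandwich) errors clustered by area --
        # a two-level approximation, not a true frailty model.
        cph.fit(df, duration_col="survival_months", event_col="died",
                cluster_col="area_id")
        cph.print_summary()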

    A Survey on Continuous Time Computations

    We provide an overview of theories of continuous time computation. These theories allow us to understand both the hardness of questions related to continuous time dynamical systems and the computational power of continuous time analog models. We survey the existing models, summarize known results, and point to relevant references in the literature
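
    As a concrete, hedged example of such a model: the General Purpose Analog Computer (GPAC) generates exactly the solutions of polynomial initial value problems, so "computing" exp(t) on a GPAC amounts to wiring up the ODE y' = y with y(0) = 1. The numerical integrator below merely stands in for the analog machine.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Integrate y' = y, y(0) = 1: the GPAC-generable function exp(t).
        sol = solve_ivp(lambda t, y: y, (0.0, 1.0), [1.0],
                        dense_output=True, rtol=1e-10, atol=1e-12)
        print(sol.sol(1.0)[0], np.e)  # trajectory at t = 1 vs exp(1)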

    Efficiency and safety of varying the frequency of whole blood donation (INTERVAL): a randomised trial of 45 000 donors

    Background: Limits on the frequency of whole blood donation exist primarily to safeguard donor health. However, there is substantial variation across blood services in the maximum frequency of donations allowed. We compared standard practice in the UK with shorter inter-donation intervals used in other countries. Methods: In this parallel group, pragmatic, randomised trial, we recruited whole blood donors aged 18 years or older from 25 centres across England, UK. By use of a computer-based algorithm, men were randomly assigned (1:1:1) to 12-week (standard) versus 10-week versus 8-week inter-donation intervals, and women were randomly assigned (1:1:1) to 16-week (standard) versus 14-week versus 12-week intervals. Participants were not masked to their allocated intervention group. The primary outcome was the number of donations over 2 years. Secondary outcomes related to safety were quality of life, symptoms potentially related to donation, physical activity, cognitive function, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin. This trial is registered with ISRCTN, number ISRCTN24760606, and is ongoing but no longer recruiting participants. Findings: 45 263 whole blood donors (22 466 men, 22 797 women) were recruited between June 11, 2012, and June 15, 2014. Data were analysed for 45 042 (99·5%) participants. Men were randomly assigned to the 12-week (n=7452) versus 10-week (n=7449) versus 8-week (n=7456) groups; and women to the 16-week (n=7550) versus 14-week (n=7567) versus 12-week (n=7568) groups. In men, compared with the 12-week group, the mean amount of blood collected per donor over 2 years increased by 1·69 units (95% CI 1·59–1·80; approximately 795 mL) in the 8-week group and by 0·79 units (0·69–0·88; approximately 370 mL) in the 10-week group (p<0·0001 for both). In women, compared with the 16-week group, it increased by 0·84 units (95% CI 0·76–0·91; approximately 395 mL) in the 12-week group and by 0·46 units (0·39–0·53; approximately 215 mL) in the 14-week group (p<0·0001 for both). No significant differences were observed in quality of life, physical activity, or cognitive function across randomised groups. However, more frequent donation resulted in more donation-related symptoms (eg, tiredness, breathlessness, feeling faint, dizziness, and restless legs, especially among men [for all listed symptoms]), lower mean haemoglobin and ferritin concentrations, and more deferrals for low haemoglobin (p<0·0001 for each) than those observed in the standard frequency groups. Interpretation: Over 2 years, more frequent donation than is standard practice in the UK collected substantially more blood without having a major effect on donors' quality of life, physical activity, or cognitive function, but resulted in more donation-related symptoms, deferrals, and iron deficiency. Funding: NHS Blood and Transplant, National Institute for Health Research, UK Medical Research Council, and British Heart Foundation
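
    As a hedged sketch of the primary comparison, the helper below computes a difference in mean donations per donor with a normal-approximation 95% CI; the arrays are invented toy data, not INTERVAL records.

        import numpy as np

        def mean_diff_ci(a, b, z=1.96):
            """Difference in means with a normal-approximation 95% CI."""
            a, b = np.asarray(a, float), np.asarray(b, float)
            diff = a.mean() - b.mean()
            se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
            return diff, (diff - z * se, diff + z * se)

        eight_week  = [9, 10, 8, 11, 10, 9, 12, 10]   # donations over 2 years
        twelve_week = [8, 8, 9, 7, 8, 9, 8, 7]
        print(mean_diff_ci(eight_week, twelve_week))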

    In the Beginning: The First Sources of Light and the Reionization of the Universe

    The formation of the first stars and quasars marks the transformation of the universe from its smooth initial state to its clumpy current state. In popular cosmological models, the first sources of light began to form at redshift 30 and reionized most of the hydrogen in the universe by redshift 7. Current observations are at the threshold of probing the hydrogen reionization epoch. The study of high-redshift sources is likely to attract major attention in observational and theoretical cosmology over the next decade.
    Comment: Final revision: 136 pages, including 42 figures; to be published in Physics Reports 2001. References updated, and a few minor corrections made. In this submission, several figures were compressed, resulting in just a slight reduction in quality; a postscript file with the full figures is available at http://www.cita.utoronto.ca/~barkana/review.htm
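
    To put the quoted redshifts on a timescale, here is a short, hedged sketch using astropy's Planck 2018 parameters (a more recent cosmology than the review itself assumed) to convert redshift to cosmic age.

        from astropy.cosmology import Planck18

        # Cosmic age at the first sources (z ~ 30) and at reionization (z ~ 7).
        for z in (30, 7):
            print(f"z = {z}: age ~ {Planck18.age(z).to('Myr'):.0f}")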

    MutSβ exceeds MutSα in dinucleotide loop repair

    The target substrates of the DNA mismatch recognising factors MutSα (MSH2+MSH6) and MutSβ (MSH2+MSH3) have already been widely researched. However, the extent of their functional redundancy and clinical significance remains unclear. Mismatch repair (MMR)-deficient tumours are strongly associated with microsatellite instability (MSI), and the degree and type of MSI appear to depend on the MMR gene affected and are linked to its substrate specificities. Deficiency in MSH2 and MSH6 is associated with both mononucleotide and dinucleotide repeat instability. Although no pathogenic MSH3 mutations have been reported, MSH3 deficiency is also suggested to cause low-level dinucleotide repeat instability
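
    As a hedged toy sketch of how mono- versus dinucleotide repeat instability is scored: tumour allele lengths at a microsatellite marker are compared against the germline length. The marker names follow the style of the Bethesda panel (BAT-26 is a mononucleotide run, D2S123 a dinucleotide run); all repeat lengths are invented.

        # Toy instability call: any tumour allele deviating from germline.
        def is_unstable(germline_len, tumour_lens, tolerance=0):
            return any(abs(n - germline_len) > tolerance for n in tumour_lens)

        markers = {
            "BAT-26 (A)n mononucleotide": (26, [26, 22]),  # shifted -> unstable
            "D2S123 (CA)n dinucleotide":  (18, [18, 18]),  # no shift -> stable
        }
        for name, (germline, tumour) in markers.items():
            print(name, "unstable" if is_unstable(germline, tumour) else "stable")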