1,277 research outputs found

    Structure and optical properties of Lu2SiO5:Ce phosphor thin films

    Luminescent, cerium-doped Lu2SiO5 thin films with C2/c symmetry have been prepared by pulsed laser deposition (PLD) at temperatures much lower than the crystallization temperature (2150°C) of the corresponding bulk crystals. The PLD-grown films show the typical luminescence resulting from the Ce3+ 5d-4f transition. Maximum luminescence efficiency was observed for films prepared at an oxygen partial pressure of 200 mTorr at 600°C. These conditions reflect a balance between Ce4+/Ce3+ interconversion and the crystalline quality of the films. The results indicate that PLD offers a low-temperature deposition technique for complex oxide phosphor materials. © 2006 American Institute of Physics

    Luminescent properties and reduced dimensional behavior of hydrothermally prepared Y2SiO5:Ce nanophosphors

    Hydrothermally prepared nanophosphor Y2SiO5:Ce crystallizes in the P21/c structure, rather than the B2/b structure observed in bulk material. Relative to bulk powder, nanophosphors of particle size ∼25-100 nm diameter exhibit redshifts of the photoluminescence excitation and emission spectra, reduced self-absorption, enhanced light output, and a medium-dependent radiative lifetime. Photoluminescence data are consistent with the reduced symmetry of the P21/c structure and are not necessarily related to the reduced dimensionality of the nanophosphor. In contrast, the medium-dependent lifetime and enhanced light output are attributed to nanoscale behavior. Perturbation of the electric field at the Ce3+ ion is responsible for the variable lifetime. © 2006 American Institute of Physics

    The validity of using ICD-9 codes and pharmacy records to identify patients with chronic obstructive pulmonary disease

    Background: Administrative data are often used to identify patients with chronic obstructive pulmonary disease (COPD), yet the validity of this approach is unclear. We sought to develop a predictive model utilizing administrative data to accurately identify patients with COPD. Methods: Sequential logistic regression models were constructed using 9573 patients with postbronchodilator spirometry at two Veterans Affairs medical centers (2003-2007). COPD was defined as: 1) FEV1/FVC < 0.70, and 2) FEV1/FVC < the lower limit of normal. Model inputs included age, outpatient or inpatient COPD-related ICD-9 codes, and the number of metered dose inhalers (MDIs) prescribed over the one year prior to and one year after spirometry. Model performance was assessed using standard criteria. Results: 4564 of 9573 patients (47.7%) had an FEV1/FVC < 0.70. The presence of ≥1 outpatient COPD visit had a sensitivity of 76% and specificity of 67%; the AUC was 0.75 (95% CI 0.74-0.76). Adding the use of albuterol MDI increased the AUC of this model to 0.76 (95% CI 0.75-0.77), while the addition of ipratropium bromide MDI increased the AUC to 0.77 (95% CI 0.76-0.78). The best-performing model included ≥6 albuterol MDIs, ≥3 ipratropium MDIs, ≥1 outpatient ICD-9 code, ≥1 inpatient ICD-9 code, and age, achieving an AUC of 0.79 (95% CI 0.78-0.80). Conclusion: Commonly used definitions of COPD in observational studies misclassify the majority of patients as having COPD. Using multiple diagnostic codes in combination with pharmacy data improves the ability to accurately identify patients with COPD. Funded by the Department of Veterans Affairs, Health Services Research and Development; an American Lung Association grant (CI-51755-N) awarded to DHA; and an American Thoracic Society Fellow Career Development Award.
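    As a rough sketch of the kind of model this abstract describes, the snippet below fits a logistic regression on a synthetic cohort and reports an in-sample AUC. The variable names, thresholds, effect sizes, and simulated data are all illustrative assumptions; the study's actual VA dataset and coefficients are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 2000

# Hypothetical administrative predictors (illustrative, not the study's data)
age = rng.normal(65, 10, n)
outpt = rng.poisson(1.0, n)      # outpatient COPD-related ICD-9 codes
inpt = rng.poisson(0.2, n)       # inpatient COPD-related ICD-9 codes
albuterol = rng.poisson(2.0, n)  # albuterol MDIs dispensed
ipratropium = rng.poisson(1.0, n)

# Synthetic "true" COPD status loosely tied to the predictors
logit = (-3 + 0.02 * age + 0.8 * (outpt >= 1) + 0.5 * (inpt >= 1)
         + 0.3 * (albuterol >= 6) + 0.4 * (ipratropium >= 3))
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Thresholded features mirroring the best-performing model in the abstract
X = np.column_stack([age, outpt >= 1, inpt >= 1,
                     albuterol >= 6, ipratropium >= 3])
model = LogisticRegression().fit(X, y)
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
```

    On real data one would also assess calibration and use held-out validation rather than the in-sample AUC shown here.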

    Discourse or dialogue? Habermas, the Bakhtin Circle, and the question of concrete utterances

    This article argues that the Bakhtin Circle presents a more realistic theory of concrete dialogue than the theory of discourse elaborated by Habermas. The Bakhtin Circle places speech within the "concrete whole utterance", by which they mean that the study of everyday language should be analyzed through the mediations of historical social systems such as capitalism. These mediations are also characterized by a determinate set of contradictions, such as the capital-labor contradiction in capitalism, that are reproduced in unique ways in more concrete forms of life (the state, education, religion, culture, and so on). Utterances always dialectically refract these processes and as such are internal concrete moments, or concrete social forms, of them. Moreover, new and unrepeatable dialogic events arise in these concrete social forms in order to overcome and understand the constant dialectical flux of social life. This theory of dialogue differs from that expounded by Habermas, who tends to explore speech acts by reproducing a dualism between repeatable and universal "abstract" discursive processes (commonly known as the ideal speech situation) and empirical uses of discourse. These critical points against Habermas are developed by focusing on six main areas: sentences and utterances; the lifeworld and background language; active versus passive understandings of language; validity claims; obligation and relevance in language; and dialectical universalism.

    A Synthesis of Tagging Studies Examining the Behaviour and Survival of Anadromous Salmonids in Marine Environments

    This paper synthesizes tagging studies to highlight the current state of knowledge concerning the behaviour and survival of anadromous salmonids in the marine environment. Scientific literature was reviewed to quantify the number and type of studies that have investigated behaviour and survival of anadromous forms of Pacific salmon (Oncorhynchus spp.), Atlantic salmon (Salmo salar), brown trout (Salmo trutta), steelhead (Oncorhynchus mykiss), and cutthroat trout (Oncorhynchus clarkii). We examined three categories of tags: electronic (e.g. acoustic, radio, archival), passive (e.g. external marks, Carlin, coded wire, passive integrated transponder [PIT]), and biological (e.g. otolith, genetic, scale, parasites). Based on 207 papers, survival rates and behaviour in marine environments were found to be extremely variable spatially and temporally, with some of the most influential factors being temperature, population, physiological state, and fish size. Salmonids at all life stages were consistently found to swim at an average speed of approximately one body length per second, which likely corresponds with the speed at which transport costs are minimal. We found that relatively little research has been conducted on open-ocean migrating salmonids, and some species (e.g. masu [O. masou] and amago [O. rhodurus]) are underrepresented in the literature. The most common forms of tagging used across life stages were various forms of external tags, coded wire tags, and acoustic tags; however, the majority of studies did not measure tagging/handling effects on the fish, tag loss/failure, or tag detection probabilities when estimating survival. Through the interdisciplinary application of existing and novel technologies, future research examining the behaviour and survival of anadromous salmonids could incorporate important drivers such as oceanography, tagging/handling effects, predation, and physiology.

    An Investigation into the Poor Survival of an Endangered Coho Salmon Population

    To investigate reasons for the decline of an endangered population of coho salmon (Oncorhynchus kisutch), 190 smolts were acoustically tagged during three consecutive years and their movements and survival were estimated using the Pacific Ocean Shelf Tracking project (POST) array. Median travel times of the Thompson River coho salmon smolts to the lower Fraser River sub-array were 16, 12 and 10 days during 2004, 2005 and 2006, respectively. Few smolts were recorded on marine arrays. Freshwater survival rates of the tagged smolts during their downstream migration were 0.0–5.6% (0.0–9.0% s.e.) in 2004, 7.0% (6.2% s.e.) in 2005, and 50.9% (18.6% s.e.) in 2006. Overall smolt-to-adult return rates exhibited a similar pattern, which suggests that low freshwater survival rates of out-migrating smolts may be a primary reason for the poor conservation status of this endangered coho salmon population.
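    The survival rates and standard errors quoted above can be illustrated with a naive binomial estimator. The release-group counts below are hypothetical, and the real analysis used mark-recapture models on the POST detections that account for imperfect detection probability, which is why the study's standard errors are larger than this sketch would give.

```python
import math

def survival_estimate(tagged, detected):
    """Naive binomial survival estimate with its standard error.

    Ignores imperfect detection: the POST-based analysis used
    mark-recapture models, so its standard errors are larger.
    """
    p = detected / tagged
    se = math.sqrt(p * (1 - p) / tagged)
    return p, se

# Hypothetical release group: 57 tagged smolts, 4 detected downstream
p, se = survival_estimate(57, 4)
print(f"survival = {p:.1%} (s.e. {se:.1%})")  # survival = 7.0% (s.e. 3.4%)
```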

    Characterization of the mouse Dazap1 gene encoding an RNA-binding protein that interacts with infertility factors DAZ and DAZL

    BACKGROUND: DAZAP1 (DAZ Associated Protein 1) was originally identified by a yeast two-hybrid system through its interaction with a putative male infertility factor, DAZ (Deleted in Azoospermia). In vitro, DAZAP1 interacts with both the Y chromosome-encoded DAZ and an autosome-encoded DAZ-like protein, DAZL. DAZAP1 contains two RNA-binding domains (RBDs) and a proline-rich C-terminal portion, and is expressed most abundantly in the testis. To understand the biological function of DAZAP1 and the significance of its interaction with DAZ and DAZL, we isolated and characterized the mouse Dazap1 gene, and studied its expression and the subcellular localization of its protein product. RESULTS: The human and mouse genes have similar genomic structures and map to syntenic chromosomal regions. The mouse and human DAZAP1 proteins share 98% identity and their sequences are highly similar to the Xenopus orthologue Prrp, especially in the RBDs. Dazap1 is expressed throughout testis development. Western blotting detects a single 45 kDa DAZAP1 protein that is most abundant in the testis. Although a majority of DAZAP1 is present in the cytoplasmic fraction, it is not associated with polyribosomes. CONCLUSIONS: DAZAP1 is evolutionarily highly conserved. Its predominant expression in testes suggests a role in spermatogenesis. Its subcellular localization indicates that it is not directly involved in mRNA translation.

    Dispelling urban myths about default uncertainty factors in chemical risk assessment - Sufficient protection against mixture effects?

    © 2013 Martin et al.; licensee BioMed Central Ltd. This article has been made available through the Brunel Open Access Publishing Fund. Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we show that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors is dependent on the choice of probability distributions.
Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can both account for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data, and we recommend adopting a pragmatic, but scientifically better-founded, approach to mixture risk assessment. Oak Foundation.
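    The abstract's point that the outcome of probabilistic multiplication of sub-factors depends on the choice of probability distributions can be illustrated numerically. The split of the default factor of 100 into two sub-factors of 10 is the conventional one, but the specific distributions and parameters below are illustrative assumptions, not those of any regulatory framework.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# The default factor of 100 is conventionally split into two sub-factors of 10.
# A probabilistic treatment replaces each 10 with a distribution; the combined
# factor then depends on which distributions are chosen (parameters below are
# illustrative).

# Lognormal sub-factors with geometric mean 10:
lognormal = np.exp(rng.normal(np.log(10), 0.4, size=(2, n))).prod(axis=0)
# Uniform sub-factors on [3, 17] (arithmetic mean 10, very different shape):
uniform = rng.uniform(3, 17, size=(2, n)).prod(axis=0)

# The upper percentiles, which are what "conservatism" is about, disagree
# even though both choices are centred near the conventional factor of 100:
p95_lognormal = np.percentile(lognormal, 95)
p95_uniform = np.percentile(uniform, 95)
```

    Both simulated combined factors have a central value near 100, yet their 95th percentiles differ substantially, so a claimed level of protection depends on the distributional assumptions as much as on the nominal factor.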

    New life sciences innovation and distributive justice: Rawlsian goods versus Senian capabilities

    The successful decoding of the human genome and subsequent advances in new life sciences innovation create the technological presuppositions of a new possibility of justice, i.e., the just distribution of both social (income, wealth, etc.) and natural (rationality, intelligence, etc.) goods. Although Rawlsians attempt to expand their theory to include this new possibility, they fail to provide plausible metrics of social justice in the genomics and post-genomics era. By contrast, Senians seem to succeed in doing so through their index of basic capabilities. This paper explores what might be regarded as a Senian perspective on distributive justice in new life sciences innovation. The argument is that, by comparing freedoms (different functionings) instead of primary goods, the capability theory allows not only for the identification of injustices linked to the natural lottery but also for their elimination through the use of new genomic technologies, including gene-based diagnostics, gene therapy, somatic cell engineering (SCE), and germ-line engineering (GLE). These innovative technologies seem to have the potential to reduce variability in natural goods and therefore enable individuals to convert social goods into well-being or welfare.