94 research outputs found

    Ignition of solid propellants by forced convection

    Full text link
    Experimental data are reported for the ignition of single grains of solid propellant in a stream of gas at high temperature. The investigation encompassed gas temperatures from 578° to 1,070°K., gas velocities corresponding to free-stream Reynolds numbers from 156 to 624, a complete range of oxygen-nitrogen mixtures, and a few oxygen-carbon dioxide mixtures. Pyrocellulose and double-base propellants were tested. The grains were approximately 1/8 in. in diameter and extended through the gas stream, so that ignition was forced to take place on the cylindrical surface rather than on the end of the grain. The exposure before ignition was measured for a large number of grains. The data can be represented by an equation that is consistent with the known effect of flow rate on convective heat transfer and the known effect of temperature on chemical reaction rates, an indication that both processes are important in ignition.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/37288/1/690020427_ftp.pd
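    The abstract's equation couples convective heat transfer with Arrhenius kinetics. A minimal sketch of such a correlation, assuming an illustrative form t_ign = C * Re^-0.5 * exp(E/R / T_gas) with made-up constants (the paper's fitted equation and coefficients are not reproduced here):

```python
import math

# Hypothetical ignition-delay correlation (NOT the paper's fitted equation):
#   t_ign = C * Re**-0.5 * exp(E_over_R / T_gas)
# The Re**-0.5 factor reflects faster convective heating at higher flow
# rates (Nu ~ Re**0.5 for a cylinder in crossflow); the Arrhenius factor
# reflects the temperature sensitivity of the surface reactions.
# C and E_over_R are illustrative constants, not fitted values.

def ignition_delay(re: float, t_gas: float,
                   c: float = 5.0, e_over_r: float = 4000.0) -> float:
    """Estimated exposure time before ignition (arbitrary time units)."""
    return c * re ** -0.5 * math.exp(e_over_r / t_gas)

# Trends agree with the abstract: the delay falls as either the gas
# velocity (Reynolds number) or the gas temperature rises.
assert ignition_delay(624, 1070) < ignition_delay(156, 1070)
assert ignition_delay(156, 1070) < ignition_delay(156, 578)
```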

    Power laws of complex systems from Extreme physical information

    Full text link
    Many complex systems obey allometric, or power, laws y = Yx^a. Here y is the measured value of some system attribute a, Y is a constant, and x is a stochastic variable. Remarkably, for many living systems the exponent a is limited to values ±n/4, n = 0, 1, 2, ... Here x is the mass of a randomly selected creature in the population. These quarter-power laws hold for many attributes, such as pulse rate (n = -1). Allometry has, in the past, been theoretically justified on a case-by-case basis. An ultimate goal is to find a common cause for allometry of all types and for both living and nonliving systems. The extremum principle I - J = extremum of Extreme physical information (EPI) is found to provide such a cause. It describes the flow of Fisher information J -> I from an attribute value a on the cell level to its exterior observation y. Data y are formed via a system channel function y = f(x,a), with f(x,a) to be found. Extremizing the difference I - J through variation of f(x,a) results in a general allometric law f(x,a) = y = Yx^a. Darwinian evolution is presumed to cause a second extremization of I - J, now with respect to the choice of a. The solution is a = ±n/4, n = 0, 1, 2, ..., defining the particular powers of biological allometry. Under special circumstances, the model predicts that such biological systems are controlled by but two distinct intracellular information sources. These sources are conjectured to be cellular DNA and cellular transmembrane ion gradients.
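    The allometric law y = Yx^a can be recovered from data by linear regression in log-log space; below is a self-contained sketch with synthetic quarter-power data (the EPI derivation itself is not implemented, and the numbers are illustrative):

```python
import math

# Fit y = Y * x**a by least squares in log-log space:
#   log y = log Y + a * log x
# Data below are synthetic, generated with a = -0.25 (the quarter-power
# pulse-rate example from the abstract); the fit recovers the exponent,
# which is then snapped to the nearest n/4.

def fit_allometric(xs, ys):
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    n = len(xs)
    mx, my = sum(lx) / n, sum(ly) / n
    a = (sum((u - mx) * (v - my) for u, v in zip(lx, ly))
         / sum((u - mx) ** 2 for u in lx))
    big_y = math.exp(my - a * mx)
    return big_y, a

xs = [1.0, 10.0, 100.0, 1000.0]          # body mass (arbitrary units)
ys = [70.0 * x ** -0.25 for x in xs]     # pulse rate, exact quarter power
big_y, a = fit_allometric(xs, ys)
assert abs(a - (-0.25)) < 1e-9
assert round(4 * a) / 4 == -0.25         # nearest quarter power
```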

    Commercially Available Outbred Mice for Genome-Wide Association Studies

    Get PDF
    Genome-wide association studies using commercially available outbred mice can detect genes involved in phenotypes of biomedical interest. Useful populations need high-frequency alleles to ensure high power to detect quantitative trait loci (QTLs), low linkage disequilibrium between markers to obtain accurate mapping resolution, and an absence of population structure to prevent false positive associations. We surveyed 66 colonies for inbreeding, genetic diversity, and linkage disequilibrium, and we demonstrate that some have haplotype blocks of less than 100 kb, enabling gene-level mapping resolution. The same alleles contribute to variation in different colonies, so that when mapping progress stalls in one, another can be used in its stead. Colonies are genetically diverse: 45% of the total genetic variation is attributable to differences between colonies. However, quantitative differences in allele frequencies, rather than the existence of private alleles, are responsible for these population differences. The colonies derive from a limited pool of ancestral haplotypes resembling those found in inbred strains: over 95% of sequence variants segregating in outbred populations are found in inbred strains. Consequently, it is possible to impute the sequence of any mouse from a dense SNP map combined with inbred strain sequence data, which opens up the possibility of cataloguing and testing all variants for association, a situation that has so far eluded studies in completely outbred populations. We demonstrate the colonies' potential by identifying a deletion in the promoter of H2-Ea as the molecular change that strongly contributes to setting the ratio of CD4+ and CD8+ lymphocytes.
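    The imputation idea described above (match an outbred mouse's dense SNP genotype to the closest inbred reference haplotype, then copy that strain's fully sequenced variants) can be sketched as follows; the strains, SNP alleles, and variant labels are hypothetical toy data, not the study's panel:

```python
# Toy sketch of the imputation idea (hypothetical strains, SNPs, and
# variants): because outbred haplotypes derive from inbred-strain pools,
# a mouse's dense SNP genotype can be matched to the closest reference
# strain, whose fully sequenced variants are then copied over.
REFERENCE = {  # strain -> alleles at six genotyped SNP sites
    "C57BL/6": "AACGTT",
    "DBA/2":   "ATCGAA",
    "129S1":   "GACTTT",
}
FULL_VARIANTS = {  # strain -> variants known only from full sequencing
    "C57BL/6": ["chr17:promoter-deletion"],
    "DBA/2":   ["chr1:snpX"],
    "129S1":   ["chr5:snpY"],
}

def impute(genotype: str):
    """Return the best-matching strain and its full-sequence variants."""
    def shared(strain: str) -> int:
        # Count allele matches between the query and a reference strain.
        return sum(a == b for a, b in zip(REFERENCE[strain], genotype))
    best = max(REFERENCE, key=shared)
    return best, FULL_VARIANTS[best]

strain, variants = impute("AACGTA")   # one mismatch vs. C57BL/6
assert strain == "C57BL/6"
```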

    Reviewing the use of resilience concepts in forest sciences

    Get PDF
    Purpose of the review: Resilience is a key concept for dealing with an uncertain future in forestry. In recent years, it has received increasing attention from both research and practice. However, a common understanding of what resilience means in a forestry context, and how to operationalise it, is lacking. Here, we conducted a systematic review of the recent forest science literature on resilience, synthesising how resilience is defined and assessed.
    Recent findings: Based on a detailed review of 255 studies, we analysed how the concepts of engineering resilience, ecological resilience, and social-ecological resilience are used in forest sciences. A clear majority of the studies applied the concept of engineering resilience, quantifying resilience as the recovery time after a disturbance. The two most used indicators for engineering resilience were basal area increment and vegetation cover, whereas ecological resilience studies frequently focus on vegetation cover and tree density. In contrast, important social-ecological resilience indicators used in the literature are socio-economic diversity and stock of natural resources. In the context of global change, we expected an increase in studies adopting the more holistic social-ecological resilience concept, but this was not the observed trend.
    Summary: Our analysis points to the nestedness of these three resilience concepts, suggesting that they are complementary rather than contradictory. It also means that the variety of resilience approaches need not be an obstacle to operationalisation of the concept. We provide guidance for choosing the most suitable resilience concept and indicators based on the management, disturbance, and application context.

    Development and Validation of a Risk Score for Chronic Kidney Disease in HIV Infection Using Prospective Cohort Data from the D:A:D Study

    Get PDF
    Ristola M. is a member of the D:A:D Study Group, the Royal Free Hospital Clinical Cohort, the INSIGHT Study Group, the SMART Study Group, and the ESPRIT Study Group.
    Background: Chronic kidney disease (CKD) is a major health issue for HIV-positive individuals, associated with increased morbidity and mortality. Development and implementation of a risk score model for CKD would allow comparison of the risks and benefits of adding potentially nephrotoxic antiretrovirals to a treatment regimen and would identify those at greatest risk of CKD. The aims of this study were to develop a simple, externally validated, and widely applicable long-term risk score model for CKD in HIV-positive individuals that can guide decision making in clinical practice.
    Methods and Findings: A total of 17,954 HIV-positive individuals from the Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) study with >= 3 estimated glomerular filtration rate (eGFR) values after 1 January 2004 were included. Baseline was defined as the first eGFR > 60 ml/min/1.73 m2 after 1 January 2004; individuals with exposure to tenofovir, atazanavir, atazanavir/ritonavir, lopinavir/ritonavir, or other boosted protease inhibitors before baseline were excluded. CKD was defined as confirmed (>3 mo apart) low eGFR. In the D:A:D study, 641 individuals developed CKD during 103,185 person-years of follow-up (PYFU; incidence 6.2/1,000 PYFU, 95% CI 5.7-6.7; median follow-up 6.1 y, range 0.3-9.1 y). Older age, intravenous drug use, hepatitis C coinfection, lower baseline eGFR, female gender, lower CD4 count nadir, hypertension, diabetes, and cardiovascular disease (CVD) predicted CKD. The adjusted incidence rate ratios of these nine categorical variables were scaled and summed to create the risk score. The median risk score at baseline was -2 (interquartile range -4 to 2). There was a 1:393 chance of developing CKD in the next 5 y in the low risk group, and a substantially higher chance in the medium and high risk groups (risk score >= 5, 505 events).
Number needed to harm (NNTH) at 5 y when starting unboosted atazanavir or lopinavir/ritonavir among those with a low risk score was 1,702 (95% CI 1,166-3,367); NNTH was 202 (95% CI 159-278) and 21 (95% CI 19-23), respectively, for those with a medium and high risk score. NNTH was 739 (95% CI 506-1,462), 88 (95% CI 69-121), and 9 (95% CI 8-10) for those with a low, medium, and high risk score, respectively, starting tenofovir, atazanavir/ritonavir, or another boosted protease inhibitor. The Royal Free Hospital Clinic Cohort included 2,548 individuals, of whom 94 developed CKD (3.7%) during 18,376 PYFU (median follow-up 7.4 y, range 0.3-12.7 y). Of 2,013 individuals included from the SMART/ESPRIT control arms, 32 developed CKD (1.6%) during 8,452 PYFU (median follow-up 4.1 y, range 0.6-8.1 y). External validation showed that the risk score predicted well in these cohorts. Limitations of this study included limited data on race and no information on proteinuria.
    Conclusions: Both traditional and HIV-related risk factors were predictive of CKD. These factors were used to develop an externally validated risk score for CKD in HIV infection that has direct clinical relevance for patients and clinicians to weigh the benefits of certain antiretrovirals against the risk of CKD and to identify those at greatest risk of CKD.
    Peer reviewed
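    The additive construction of the score (scaled incidence rate ratios summed over nine categorical risk factors) can be sketched as below; the point values, baseline offset, and group boundaries are illustrative assumptions, not the published D:A:D weights:

```python
# Minimal sketch of an additive risk score. Point values, baseline
# offset, and group boundaries are illustrative assumptions, NOT the
# published D:A:D weights.
POINTS = {
    "older_age": 4,
    "ivdu": 2,
    "hcv_coinfection": 1,
    "lower_baseline_egfr": 4,
    "female": 1,
    "low_cd4_nadir": 1,
    "hypertension": 1,
    "diabetes": 1,
    "cvd": 1,
}
BASELINE = -6  # hypothetical offset; the cohort median score was -2

def ckd_risk_score(factors: set) -> int:
    """Sum scaled points for the risk factors present in a patient."""
    return BASELINE + sum(POINTS[f] for f in factors)

def risk_group(score: int) -> str:
    # Hypothetical bands loosely mirroring the abstract's low/medium/high.
    if score < 0:
        return "low"
    return "medium" if score < 5 else "high"

assert risk_group(ckd_risk_score(set())) == "low"
assert risk_group(ckd_risk_score({"older_age", "hypertension", "diabetes"})) == "medium"
```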

    A Reinterpretation of the Turbulent Prandtl Number †

    No full text

    The Art of Correlation

    No full text

    LETTERS - "American Engineering System of Units and Its Dimensional Constants g"

    No full text