
    Coeducation: A Contested Practice in Nineteenth- and Twentieth-Century Secondary Schooling

    This chapter discusses the history of coeducation in secondary schooling, mainly in Europe and North America. The analysis focuses on the gendered characteristics of educational systems and curricula, as well as on national discourses about single-sex or mixed schooling. The focus is on the latter half of the nineteenth century and the first decades of the twentieth, when the merits and perils of coeducation were debated for this stage of schooling. Until after World War II, children of the working class hardly ever attended school past the age of 13 or 14; this is therefore a history of middle- and upper-class education. In the early nineteenth century, girls had to make do with a very limited, private education that prepared them only for homemaking and motherhood, while boys could attend public grammar schools that opened the door to the university and the professions. From the mid-nineteenth century, initiatives were taken to improve the quality of girls' education. Few countries opened boys' public schools to girls; in most cases, new girls' schools were established with more serious but still unequal curricula, focusing mainly on the humanities. Girls' schools teaching a curriculum equivalent to that of the boys' schools were not created until after the turn of the century, when a more critical view of coeducation had become the rule. Democratization and coeducation went hand in hand with the introduction of comprehensive mixed secondary schooling in the 1960s and 1970s. The shortcomings of coeducation, however, were not rediscovered until after it had been generally introduced.

    The Role of Deontic Logic in the Specification of Information Systems

    In this paper we discuss the role that deontic logic plays in the specification of information systems, either because constraints on the systems directly concern norms or, even more importantly, because system constraints are considered ideal but violable (so-called 'soft' constraints). To overcome the traditional problems with deontic logic (the so-called paradoxes), we first state the importance of distinguishing between ought-to-be and ought-to-do constraints, and next focus on the most severe paradox, the so-called Chisholm paradox, involving contrary-to-duty norms. We present a multi-modal extension of standard deontic logic (SDL) to represent the ought-to-be version of the Chisholm set properly. For the ought-to-do variant we employ a reduction to dynamic logic, and show how the Chisholm set can be treated adequately in this setting. Finally, we discuss a way of integrating both ought-to-be and ought-to-do reasoning, enabling one to draw conclusions from ought-to-be constraints to ought-to-do ones, and show the usefulness of this approach by an example.
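    For readers unfamiliar with it, the Chisholm set mentioned in the abstract is standardly rendered in SDL roughly as follows (a common textbook formulation, not necessarily the authors' exact notation):

    ```latex
    \begin{align*}
    &\text{(1) } O\,g                          && \text{Jones ought to go assist his neighbours}\\
    &\text{(2) } O(g \rightarrow t)            && \text{it ought to be that if he goes, he tells them he is coming}\\
    &\text{(3) } \neg g \rightarrow O\,\neg t  && \text{if he does not go, he ought not tell them}\\
    &\text{(4) } \neg g                        && \text{he does not go}
    \end{align*}
    ```

    In SDL, (1) and (2) yield $O\,t$ via the K-axiom $O(g \rightarrow t) \rightarrow (O\,g \rightarrow O\,t)$, while (3) and (4) yield $O\,\neg t$ by modus ponens; together with the D-axiom $O\varphi \rightarrow \neg O \neg\varphi$ this is inconsistent, even though the four premises seem jointly consistent and mutually independent — the defect that motivates the richer representations discussed in the paper.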

    Estimating the furrow infiltration characteristic from a single advance point

    Management and control of surface irrigation, in particular furrow irrigation, is limited by spatio-temporal variability in soil infiltration as well as by the high cost and time associated with collecting the intensive field data needed to estimate the infiltration characteristics. Recent work has proposed scaling the commonly used infiltration function by using a model infiltration curve and a single advance point for every other furrow in an irrigation event. Scaling factors were calculated for a series of furrows at two sites and at four points down the length of the field (0.25 L, 0.5 L, 0.75 L and L). Differences in the value of the scaling factor with distance were found to be a function of the shape of the advance curves. It is concluded that using points early in the advance results in a substantial loss of accuracy and should be avoided. The scaling factor was also strongly correlated with the furrow wetted perimeter, suggesting that scaling is an appropriate way of both predicting and accommodating the effect of hydraulic variability.
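    The single-advance-point idea can be sketched with a one-point volume balance: given a "model" Kostiakov infiltration curve Z(τ) = k·τᵃ, choose a per-furrow scaling factor F so that F·Z accounts for the water not stored on the surface when the advance reaches distance x at time t_x. All parameter values and shape factors (σ_y, σ_z) below are invented placeholders for illustration, not data or coefficients from the study.

    ```python
    def scaling_factor(Q0, t_x, x, A0, k, a, sigma_y=0.77, sigma_z=0.6):
        """One-point volume-balance estimate of the infiltration scaling factor F.

        Inflow volume by advance time t_x is partitioned into surface storage
        and F times the infiltrated volume implied by the model curve:
            Q0*t_x = sigma_y*A0*x + F * sigma_z*k*t_x**a*x
        """
        inflow = Q0 * t_x                       # m^3 delivered by time t_x
        surface = sigma_y * A0 * x              # m^3 stored on the surface
        subsurface = sigma_z * k * t_x**a * x   # m^3 infiltrated if F were 1
        return (inflow - surface) / subsurface

    # Placeholder furrow: 2 L/s inflow, advance to 200 m in one hour.
    F = scaling_factor(Q0=0.002, t_x=3600, x=200, A0=0.01, k=0.004, a=0.3)
    ```

    A furrow wetting faster than the model curve predicts yields F > 1, a slower one F < 1, which is how one advance observation per furrow can index its infiltration characteristic.
    
    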

    Genetic risk factors for ischaemic stroke and its subtypes (the METASTROKE Collaboration): a meta-analysis of genome-wide association studies

    <p>Background - Various genome-wide association studies (GWAS) have been done in ischaemic stroke, identifying a few loci associated with the disease, but sample sizes have been 3500 cases or less. We established the METASTROKE collaboration with the aim of validating associations from previous GWAS and identifying novel genetic associations through meta-analysis of GWAS datasets for ischaemic stroke and its subtypes.</p> <p>Methods - We meta-analysed data from 15 ischaemic stroke cohorts with a total of 12 389 individuals with ischaemic stroke and 62 004 controls, all of European ancestry. For the associations reaching genome-wide significance in METASTROKE, we did a further analysis, conditioning on the lead single nucleotide polymorphism in every associated region. Replication of novel suggestive signals was done in 13 347 cases and 29 083 controls.</p> <p>Findings - We verified previous associations for cardioembolic stroke near PITX2 (p=2·8×10<sup>−16</sup>) and ZFHX3 (p=2·28×10<sup>−8</sup>), and for large-vessel stroke at a 9p21 locus (p=3·32×10<sup>−5</sup>) and HDAC9 (p=2·03×10<sup>−12</sup>). Additionally, we verified that all associations were subtype specific. Conditional analysis in the three regions for which the associations reached genome-wide significance (PITX2, ZFHX3, and HDAC9) indicated that all the signal in each region could be attributed to one risk haplotype. We also identified 12 potentially novel loci at p<5×10<sup>−6</sup>. However, we were unable to replicate any of these novel associations in the replication cohort.</p> <p>Interpretation - Our results show that, although genetic variants can be detected in patients with ischaemic stroke when compared with controls, all associations we were able to confirm are specific to a stroke subtype. This finding has two implications. First, to maximise success of genetic studies in ischaemic stroke, detailed stroke subtyping is required. Second, different genetic pathophysiological mechanisms seem to be associated with different stroke subtypes.</p>

    Genetic Modulation of Lipid Profiles following Lifestyle Modification or Metformin Treatment: the Diabetes Prevention Program

    Weight-loss interventions generally improve lipid profiles and reduce cardiovascular disease risk, but effects are variable and may depend on genetic factors. We performed a genetic association analysis of data from 2,993 participants in the Diabetes Prevention Program to test the hypotheses that a genetic risk score (GRS) based on deleterious alleles at 32 lipid-associated single-nucleotide polymorphisms modifies the effects of lifestyle and/or metformin interventions on lipid levels and nuclear magnetic resonance (NMR) lipoprotein subfraction size and number. Twenty-three loci previously associated with fasting LDL-C, HDL-C, or triglycerides replicated (P=0.04–1×10^−17). Except for total HDL particles (r=−0.03, P=0.26), all components of the lipid profile correlated with the GRS (partial |r|=0.07–0.17, P=5×10^−5–1×10^−19). The GRS was associated with higher baseline-adjusted 1-year LDL cholesterol levels (β=+0.87, SEE ±0.22 mg/dl/allele, P=8×10^−5, P_interaction=0.02) in the lifestyle intervention group, but not in the placebo (β=+0.20, SEE ±0.22 mg/dl/allele, P=0.35) or metformin (β=−0.03, SEE ±0.22 mg/dl/allele, P=0.90; P_interaction=0.64) groups. Similarly, a higher GRS predicted a greater number of baseline-adjusted small LDL particles at 1 year in the lifestyle intervention arm (β=+0.30, SEE ±0.012 ln nmol/L/allele, P=0.01, P_interaction=0.01) but not in the placebo (β=−0.002, SEE ±0.008 ln nmol/L/allele, P=0.74) or metformin (β=+0.013, SEE ±0.008 nmol/L/allele, P=0.12; P_interaction=0.24) groups. Our findings suggest that a high genetic burden confers an adverse lipid profile and predicts an attenuated response in LDL-C levels and small LDL particle number to dietary and physical activity interventions aimed at weight loss.
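    The GRS construction described above is, in its simplest unweighted form, just a count of deleterious alleles across the selected SNPs. A minimal sketch of that construction — the genotypes here are random placeholders, not the study's 32 lipid loci:

    ```python
    import random

    random.seed(0)  # reproducible placeholder data

    # Each genotype is the count (0, 1 or 2) of the deleterious allele carried
    # at one of 32 lipid-associated SNPs; an unweighted GRS is the row sum,
    # so each subject's score lies between 0 and 64.
    n_subjects, n_snps = 5, 32
    genotypes = [[random.randint(0, 2) for _ in range(n_snps)]
                 for _ in range(n_subjects)]
    grs = [sum(row) for row in genotypes]  # one score per subject
    ```

    In an interaction analysis like the one reported, this score would then enter a regression of 1-year lipid outcome on GRS, treatment arm, and their product term.
    
    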

    Extensive dissolution of live pteropods in the Southern Ocean

    The carbonate chemistry of the surface ocean is rapidly changing with ocean acidification, a result of human activities. In the upper layers of the Southern Ocean, aragonite—a metastable form of calcium carbonate with rapid dissolution kinetics—may become undersaturated by 2050 (ref. 2). Aragonite undersaturation is likely to affect aragonite-shelled organisms, which can dominate surface-water communities in polar regions. Here we present analyses of specimens of the pteropod Limacina helicina antarctica that were extracted live from the Southern Ocean early in 2008. We sampled from the top 200 m of the water column, where aragonite saturation levels were around 1, as upwelled deep water is mixed with surface water containing anthropogenic CO2. Comparing the shell structure with samples from aragonite-supersaturated regions elsewhere under a scanning electron microscope, we found severe levels of shell dissolution in the undersaturated region alone. According to laboratory incubations of intact samples across a range of aragonite saturation levels, eight days of incubation at aragonite saturation levels of 0.94–1.12 produces equivalent levels of dissolution. As deep-water upwelling and CO2 absorption by surface waters are likely to increase as a result of human activities (refs 2, 4), we conclude that upper-ocean regions where aragonite-shelled organisms are affected by dissolution are likely to expand.

    Effects of normalization on quantitative traits in association test

    <p>Abstract</p> <p>Background</p> <p>Quantitative trait loci analysis assumes that the trait is normally distributed. In reality, this is often not observed and one strategy is to transform the trait. However, it is not clear how much normality is required and which transformation works best in association studies.</p> <p>Results</p> <p>We performed simulations on four types of common quantitative traits to evaluate the effects of normalization using the logarithm, Box-Cox, and rank-based transformations. The impact of sample size and genetic effects on normalization is also investigated. Our results show that rank-based transformation gives generally the best and consistent performance in identifying the causal polymorphism and ranking it highly in association tests, with a slight increase in false positive rate.</p> <p>Conclusion</p> <p>For small sample size or genetic effects, the improvement in sensitivity for rank transformation outweighs the slight increase in false positive rate. However, for large sample size and genetic effects, normalization may not be necessary since the increase in sensitivity is relatively modest.</p>
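    The rank-based transformation evaluated above is typically a rank-based inverse normal (e.g. Blom) transform, which replaces each trait value with the standard-normal quantile of its rank. A minimal stdlib sketch of that transform, assuming no tied trait values:

    ```python
    from statistics import NormalDist

    def rank_inverse_normal(values, c=3 / 8):
        """Blom-type rank-based inverse normal transform.

        Maps each value's 1-based rank r to the standard-normal quantile of
        (r - c) / (n - 2c + 1); c = 3/8 gives the Blom offset. Ties are not
        handled in this sketch.
        """
        n = len(values)
        rank = {v: i + 1 for i, v in enumerate(sorted(values))}
        nd = NormalDist()
        return [nd.inv_cdf((rank[v] - c) / (n - 2 * c + 1)) for v in values]

    # A strongly right-skewed trait becomes symmetric but keeps its ordering.
    y = rank_inverse_normal([1.0, 10.0, 100.0, 1000.0, 5.0])
    ```

    Because the output depends only on ranks, the transformed trait is (near-)normal regardless of the original distribution, which is why it behaves consistently across the four trait types simulated in the paper.
    
    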

    SNP selection for genes of iron metabolism in a study of genetic modifiers of hemochromatosis

    <p>Abstract</p> <p>Background</p> <p>We report our experience of selecting tag SNPs in 35 genes involved in iron metabolism in a cohort study seeking to discover genetic modifiers of hereditary hemochromatosis.</p> <p>Methods</p> <p>We combined our own and publicly available resequencing data with HapMap to maximise our coverage to select 384 SNPs in candidate genes suitable for typing on the Illumina platform.</p> <p>Results</p> <p>Validation/design scores above 0.6 were not strongly correlated with SNP performance as estimated by Gentrain score. We contrasted results from two tag SNP selection algorithms, LDselect and Tagger. Varying r<sup>2</sup> from 0.5 to 1.0 produced a near linear correlation with the number of tag SNPs required. We examined the pattern of linkage disequilibrium of three levels of resequencing coverage for the transferrin gene and found HapMap phase 1 tag SNPs capture 45% of the ≥ 3% MAF SNPs found in SeattleSNPs where there is nearly complete resequencing. Resequencing can reveal adjacent SNPs (within 60 bp) which may affect assay performance. We report the number of SNPs present within the region of six of our larger candidate genes, for different versions of stock genotyping assays.</p> <p>Conclusion</p> <p>A candidate gene approach should seek to maximise coverage, and this can be improved by adding to HapMap data any available sequencing data. Tag SNP software must be fast and flexible to data changes, since tag SNP selection involves iteration as investigators seek to satisfy the competing demands of coverage within and between populations, and typability on the technology platform chosen.</p>
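    The LDselect algorithm contrasted above greedily bins SNPs by pairwise r². A simplified sketch of that greedy idea — the data structure and SNP names are invented for illustration, and real LDselect has additional bin-refinement rules:

    ```python
    def greedy_tag(r2, threshold=0.8):
        """Greedy tag SNP selection in the spirit of LDselect.

        r2 maps each SNP to a dict of {other SNP: pairwise r^2}. At each step,
        pick the SNP that tags the most still-untagged SNPs (itself plus any
        neighbour with r^2 >= threshold), then remove everything it tags.
        Returns the chosen tag SNPs.
        """
        untagged = set(r2)
        tags = []

        def coverage(s):
            return sum(1 for t in untagged
                       if t == s or r2[s].get(t, 0.0) >= threshold)

        while untagged:
            best = max(sorted(untagged), key=coverage)  # sorted for determinism
            tags.append(best)
            untagged -= {t for t in untagged
                         if t == best or r2[best].get(t, 0.0) >= threshold}
        return tags

    # Toy panel: rs1 and rs2 are in strong LD; rs3 is independent.
    tags = greedy_tag({"rs1": {"rs2": 0.9}, "rs2": {"rs1": 0.9}, "rs3": {}})
    ```

    Raising the threshold shrinks each bin, so more tags are needed — consistent with the near-linear growth in tag SNP count the authors observed as r² moved from 0.5 to 1.0.
    
    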