
    Being Modern: The Cultural Impact of Science in the Early Twentieth Century

    In the early decades of the twentieth century, engagement with science was commonly used as an emblem of modernity. This phenomenon is now attracting increasing attention in different historical specialties. Being Modern builds on this recent scholarly interest to explore engagement with science across culture from the end of the nineteenth century to approximately 1940. Addressing the breadth of cultural forms in Britain and the western world, from the architecture of Le Corbusier to working-class British science fiction, Being Modern paints a rich picture. Seventeen distinguished contributors from a range of fields, including the cultural study of science and technology, art and architecture, and English culture and literature, examine the issues involved. The book will be a valuable resource for students, and a spur to scholars to further examine culture as an interconnected web of which science is a critical part, and to supersede such tired formulations as 'Science and culture'.

    Formation of MgB2 at low temperatures by reaction of Mg with B6Si

    The formation of MgB2 by reaction of Mg with B6Si and by reaction of Mg with B was compared; the former also produced Mg2Si as a major product. Compared to the binary system, the ternary reactions for identical time and temperature were more complete at 750 °C and below, as indicated by higher diamagnetic shielding and larger x-ray diffraction peak intensities relative to those of Mg. MgB2 could be produced at temperatures as low as 450 °C by the ternary reaction. Analyses by electron microscopy and x-ray diffraction, and of the upper critical field, show that Si does not enter the MgB2 phase. Comment: Submitted to Supercond. Sci. Technol.
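    As a rough illustration of the ternary route described above, one balanced overall reaction consistent with the products named in the abstract (the exact stoichiometry is an assumption on our part, not stated in the abstract; excess Mg or minor phases are possible) is

    \[ 5\,\mathrm{Mg} + \mathrm{B_6Si} \;\rightarrow\; 3\,\mathrm{MgB_2} + \mathrm{Mg_2Si} \]

    in contrast with the binary route \( \mathrm{Mg} + 2\,\mathrm{B} \rightarrow \mathrm{MgB_2} \).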

    Feature-by-Feature – Evaluating De Novo Sequence Assembly

    The whole-genome sequence assembly (WGSA) problem is among the most studied problems in computational biology. Despite the availability of a plethora of tools (i.e., assemblers), all claiming to have solved the WGSA problem, little has been done to systematically compare their accuracy and power. Traditional methods rely on standard metrics and read simulation: on the one hand, metrics like N50 and number of contigs focus only on size without proportionately emphasizing information about the correctness of the assembly; on the other hand, comparisons performed on simulated datasets can be highly biased by unrealistic assumptions in the underlying read generator. Recently the Feature Response Curve (FRC) method was proposed to assess overall assembly quality and correctness: FRC transparently captures the trade-off between contigs' quality and their sizes. Nevertheless, the relationships among the different features and their relative importance remain unknown. In particular, FRC cannot account for the correlation among the different features. We analyzed the correlation among different features in order to better describe their relationships and their importance in gauging assembly quality and correctness. In particular, using multivariate techniques such as principal and independent component analysis, we were able to estimate the “excess-dimensionality” of the feature space. Moreover, principal component analysis allowed us to show how poorly the acclaimed N50 metric describes assembly quality. Applying independent component analysis, we identified a subset of features that better describes the assemblers' performances. We demonstrated that by focusing on a reduced set of highly informative features we can use the FRC curve to better describe and compare the performances of different assemblers. Moreover, as a by-product of our analysis, we discovered how often evaluation based on simulated data, obtained with state-of-the-art simulators, leads to unrealistic results.
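    A minimal sketch of the kind of dimensionality analysis described above, assuming a per-contig feature matrix (rows = contigs, columns = features); the matrix contents, the 95% variance threshold, and the function name are illustrative assumptions, not taken from the paper:

    # Sketch: estimating the "excess-dimensionality" of an assembly feature space
    # with PCA. A number of components much smaller than the number of feature
    # columns suggests the features are strongly correlated (redundant).
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    def effective_dimension(features: np.ndarray, variance_target: float = 0.95) -> int:
        """Number of principal components needed to explain `variance_target`
        of the total variance in the (standardized) feature matrix."""
        scaled = StandardScaler().fit_transform(features)
        pca = PCA().fit(scaled)
        cumulative = np.cumsum(pca.explained_variance_ratio_)
        return int(np.searchsorted(cumulative, variance_target) + 1)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Toy data: 8 observed features that are noisy mixtures of only 3 signals.
        latent = rng.normal(size=(500, 3))
        toy_features = latent @ rng.normal(size=(3, 8)) + 0.05 * rng.normal(size=(500, 8))
        print("components for 95% of variance:", effective_dimension(toy_features))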

    Comparing De Novo Genome Assembly: The Long and Short of It

    Recent advances in DNA sequencing technology and their focal role in Genome Wide Association Studies (GWAS) have rekindled a growing interest in the whole-genome sequence assembly (WGSA) problem, thereby inundating the field with a plethora of new formalizations, algorithms, heuristics and implementations. And yet, scant attention has been paid to comparative assessments of these assemblers' quality and accuracy. No commonly accepted and standardized method for comparison exists yet. Even worse, widely used metrics to compare the assembled sequences emphasize only size, poorly capturing contig quality and accuracy. This paper addresses these concerns: it highlights common anomalies in assembly accuracy through a rigorous study of several assemblers, compared under both standard metrics (N50, coverage, contig sizes, etc.) and a more comprehensive metric (Feature-Response Curves, FRC) that is introduced here; FRC transparently captures the trade-off between contigs' quality and their sizes. For this purpose, most of the publicly available major sequence assemblers – both for low-coverage long (Sanger) and high-coverage short (Illumina) read technologies – are compared. These assemblers are applied to microbial (Escherichia coli, Brucella, Wolbachia, Staphylococcus, Helicobacter) and partial human genome sequences (Chr. Y), using sequence reads of various read-lengths, coverages, and accuracies, with and without mate-pairs. It is hoped that, based on these evaluations, computational biologists will identify innovative sequence assembly paradigms, bioinformaticists will determine promising approaches for developing “next-generation” assemblers, and biotechnologists will formulate more meaningful design desiderata for sequencing technology platforms. A new software tool for computing the FRC metric has been developed and is available through the AMOS open-source consortium.
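    Because the critique above centers on size-only metrics, a minimal sketch of how the conventional N50 statistic is computed from contig lengths may help; it is illustrative only and is not code from the paper or from the AMOS FRC tool:

    # N50: the length of the shortest contig in the smallest set of longest
    # contigs whose combined length covers at least half of the assembly size.
    def n50(contig_lengths):
        total = sum(contig_lengths)
        running = 0
        for length in sorted(contig_lengths, reverse=True):
            running += length
            if 2 * running >= total:
                return length
        return 0

    # Example: total = 100, so half is 50; 40 + 30 = 70 >= 50, hence N50 = 30.
    print(n50([40, 30, 20, 10]))  # -> 30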

    Observation of the B_c Meson in p-bar p Collisions at sqrt{s} = 1.8 TeV

    We have observed bottom-charm mesons B_c via the decay mode B_c -> J/psi lepton neutrino in 1.8 TeV p-bar p collisions using the CDF detector at the Fermilab Tevatron. A fit of background and signal contributions to the J/psi + lepton mass distribution yielded 20.4 +6.2 -5.5 events from B_c mesons. A fit to the same distribution with background alone was rejected at the level of 4.8 standard deviations. We measured the B_c mass to be 6.40 +- 0.39 +- 0.13 GeV/c^2 and the B_c lifetime to be tau(B_c) = 0.46 +0.18 -0.16 +- 0.03 ps. We measured the production cross section times branching ratio for B_c -> J/psi lepton neutrino relative to that for B+ -> J/psi K+ to be 0.132 +0.041 -0.037 (stat) +- 0.031 (syst) +0.032 -0.020 (lifetime). Comment: 13 pages, 3 figures. Submitted to Physical Review Letters. Available at http://www-cdf.fnal.gov/physics/pub98/cdf4496_Bc_PRL.p