
    ‘An Explosive of Quite Unimaginable Force’: Did Werner Heisenberg Obstruct German Atomic Bomb Research?

    Why was Nazi Germany unable to acquire an atomic bomb during World War II? An answer to this question necessarily involves an analysis of the wartime conduct of Werner Heisenberg. As the undisputed leader of German nuclear research, Heisenberg was integral to any successful production of a bomb. Heisenberg claimed after the war that the Nazis lacked the economic resources for this project. Moreover, Nazi military strategy ruled out such a sustained long-term commitment to armaments development. Heisenberg explained that he personally felt fortunate that these circumstances prevented Hitler from having a bomb. He argued that he merely “pretended” to pursue a bomb for the Nazis. Far from supporting a bomb, Heisenberg’s explanations reveal two wartime objectives: a) to preserve the German scientific enterprise from destruction by Nazi ideology, and b) to prevent the production of an atomic bomb. Some historians believe Heisenberg’s post-war explanations of his conduct were a disingenuous effort to obscure his scientific shortcomings and Nazi support. But if Heisenberg’s explanations of his conduct were genuine, what should the historian expect to find in his wartime historical record? Assuming that Heisenberg was only “pretending” to support the Nazis, one can expect to find: 1) a mix of pro- and anti-Nazi actions, and 2) no gratuitous pro-Nazi actions devoid of any strategic value for his objectives. This paper uses the above interpretive framework to analyze four key wartime actions of Heisenberg’s: 1) a 1941 conversation with the Danish physicist Niels Bohr, 2) a February 1942 lecture for Nazi officials, 3) a similar lecture in June 1942, and 4) Heisenberg’s acceptance of the Nazi-appointed director position at the Kaiser Wilhelm Institute for Physics. I conclude that while the historical evidence is not uniformly in his favor, Heisenberg’s explanation of his wartime conduct and objectives holds up under closer examination.

    IDENTIFICATION AND CLINICAL ASSESSMENT OF DELETION STRUCTURAL VARIANTS IN WHOLE GENOME SEQUENCES OF ACUTELY ILL NEONATES

    BACKGROUND Effective management of acutely ill newborns with genetic conditions requires rapid and comprehensive identification of causative haplotypes. It has been previously shown that whole genome sequencing (WGS) can identify small variants contributing to the genetic illness of such patients in less than 50 hours. Deletion structural variants (SVs) ≄ 50 nucleotides are implicated in many genetic diseases, and with WGS data they can now be identified with a performance and timeframe sufficient for diagnosis in neonatal intensive care units. Here we describe the development of a solution that combines consensus calls from two SV detection tools (BreakDancer [BD] and Genome STRiP [GS]) with a novel filtering strategy. RESULTS In WGS simulation data, BD and GS consensus calls had 83% sensitivity and 99% positive predictive value, with high precision. Through raw-data inspection in the Integrative Genomics Viewer (IGV), consensus calls overlapping with SNP arrays were found to be 95% true positive and were subsequently used for filter parameterization. Consensus calling and filtering were implemented as a computational pipeline. IGV evaluation of pipeline results in a tetrad demonstrated that calls were over 80% true positive but insensitive. Pipeline usage in 10 proband family sets revealed a possibly causative deletion SV in the MMP21 gene for two siblings. MMP21 is thought to play a role in embryogenesis and may be responsible for the heterotaxy phenotype in humans. Further studies are needed to confirm these results. CONCLUSIONS The identification of deletion SVs has the potential to increase the diagnostic yield of WGS data. The methods described in this study may be useful in research on disease detection in acutely ill neonates.
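    The consensus step described above can be illustrated with a minimal Python sketch. The 50% reciprocal-overlap requirement, the ≄ 50 bp size cut, and the example calls are assumptions for illustration only, not the study's actual parameters or data; the real pipeline also applies its novel filtering strategy on top of this.

```python
# Minimal sketch of consensus deletion calling from two SV callers.
# The 50% reciprocal-overlap threshold and >= 50 bp size cut are
# illustrative assumptions, not the study's actual parameters.

def reciprocal_overlap(a, b):
    """Reciprocal overlap fraction between two (chrom, start, end) deletions."""
    if a[0] != b[0]:
        return 0.0
    inter = min(a[2], b[2]) - max(a[1], b[1])
    if inter <= 0:
        return 0.0
    return min(inter / (a[2] - a[1]), inter / (b[2] - b[1]))

def consensus_deletions(bd_calls, gs_calls, min_ro=0.5, min_size=50):
    """Keep BD deletions >= min_size bp that some GS call supports at >= min_ro."""
    return [bd for bd in bd_calls
            if bd[2] - bd[1] >= min_size
            and any(reciprocal_overlap(bd, gs) >= min_ro for gs in gs_calls)]

bd = [("chr10", 127455000, 127460000), ("chr1", 1000, 1040)]  # hypothetical calls
gs = [("chr10", 127455500, 127460200)]
print(consensus_deletions(bd, gs))  # only the chr10 call survives both checks
```

    In practice such consensus calls would then be filtered further (for example against read-depth evidence) before manual IGV review.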

    Teaching Archival Research Skills to Undergraduates

    Despite changes in theory and approach, academic archivists continue to struggle in their attempts to meet the research needs of undergraduate instructors and students. College and university archives generally are not utilized to their greatest potential by undergraduates. One explanation is that one or both user groups may be unaware of the subject coverage of their university or college archives’ collections. Another possibility is that some instructors may not be aware of archivists’ willingness to collaborate with them on students’ research projects. Other instructors find it difficult to incorporate the use of primary sources and student research in the archives into their current teaching practices. On this point, archivists can be cautiously optimistic, as there is evidence that younger, non-tenured history faculty tend to utilize primary sources more often in their teaching than older, tenured faculty. This paper proposes another explanation for the problem of underutilized academic archives and an agenda for resolving it: undergraduate students generally do not have the requisite knowledge and skills for effective archival research. This lack of proficiency in basic principles of archival research prevents students from pursuing significant online and on-site research. Instruction in archival research skills has not kept pace with classroom exposure to primary sources in recent educational practice. As noted above, the use of primary sources at all levels of instruction is on the rise. Recent evidence of this is the Common Core State Standards Initiative, a set of national curriculum standards for K-12 students adopted by many states beginning in 2010, which emphasizes the development of research skills and critical thinking about primary sources. Doris J. Malkmus thus argues that students are “arriving on college campuses more prepared to deal with primary source documents than any previous generation, but they have not yet developed the skills to find and identify primary sources—whether online or in the archives.”

    Crowdsourcing Transcriptions of Archival Materials

    Crowdsourcing is a method that has been effectively used to pool the knowledge and skills of large numbers of online volunteers to create information resources utilized by historians, genealogists, and scientists. In recent years, archivists have begun to crowdsource the transcription of their handwritten records. Transcription of such records has traditionally been completed by professional transcribers who are skilled in reading multiple handwriting styles, are knowledgeable about the creators and historical context of the records, and can interpret varying record formats and genres. However, increasingly limited resources of time and money have made traditional transcription more difficult to accomplish. This paper evaluates the crowdsourcing of transcriptions under three major archival principles: processing, accessibility, and outreach. Crowdsourcing is one processing solution to backlogs of archival records requiring transcription, though both human and technical issues require resolution in the production of transcriptions by online volunteers. Transcription of records increases accessibility on multiple levels: transcribed records are 1) more readable and 2) keyword searchable in databases. Crowdsourcing transcriptions also results in greater awareness of the archives being transcribed among the public and potential users. A final archival principle, preservation, is only briefly discussed due to the limited data available on how crowdsourcing transcriptions has affected the continued use of original records. The numerous crowdsourced transcription projects now underway in the field of archives provide an experiential component to this paper’s analysis. Crowdsourced transcription projects to be examined include, among others, Transcribe Bentham, Ancestry.com’s World Archives Project, and the Papers of the War Department.

    A Whole-Chromosome Analysis of Meiotic Recombination in Drosophila melanogaster

    Although traditional genetic assays have characterized the pattern of crossing over across the genome in Drosophila melanogaster, these assays could not precisely define the location of crossovers. Even less is known about the frequency and distribution of noncrossover gene conversion events. To assess the specific number and positions of both meiotic gene conversion and crossover events, we sequenced the genomes of male progeny from females heterozygous for 93,538 X-chromosomal single-nucleotide and indel polymorphisms. From the analysis of the 30 F1 hemizygous X chromosomes, we detected 15 crossover and 5 noncrossover gene conversion events. Taking into account the nonuniform distribution of polymorphism along the chromosome arm, we estimate that most oocytes experience 1 crossover event and 1.6 gene conversion events per X chromosome pair per meiosis. An extrapolation to the entire genome predicts approximately 5 crossover events and 8.6 conversion events per meiosis. Mean gene conversion tract length was estimated to be 476 base pairs, yielding a per-nucleotide conversion rate of 0.86 × 10−5 per meiosis. Both of these values are consistent with estimates of conversion frequency and tract length obtained from studies of rosy, the only gene for which gene conversion has been studied extensively in Drosophila. Motif-enrichment analysis revealed a GTGGAAA motif that was enriched near crossovers but not near gene conversions. The low complexity and frequent occurrence of this motif may in part explain why, in contrast to mammalian systems, no meiotic crossover hotspots have been found in Drosophila.
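    As a rough consistency check, the reported per-nucleotide rate can be approximately recovered from the abstract's own numbers under two assumptions that are not stated there: roughly 22.4 Mb of assayable X euchromatin, and conversion events distributed over the four chromatids of the X bivalent. A sketch:

```python
# Back-of-the-envelope check of the per-nucleotide conversion rate.
# Assumed (not from the abstract): ~22.4 Mb of assayable X euchromatin,
# with events spread over the 4 chromatids of the bivalent.
events_per_meiosis = 1.6     # gene conversions per X pair per meiosis
tract_len_bp = 476           # mean conversion tract length
x_euchromatin_bp = 22.4e6    # assumed
chromatids = 4
rate = events_per_meiosis * tract_len_bp / (chromatids * x_euchromatin_bp)
print(f"{rate:.2e} per nucleotide per meiosis")  # ~8.5e-06, close to the reported 0.86e-5
```

    The study's exact normalization may differ; this only shows that the three reported quantities are mutually plausible.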

    Measurement of the top quark forward-backward production asymmetry and the anomalous chromoelectric and chromomagnetic moments in pp collisions at √s = 13 TeV

    The parton-level top quark (t) forward-backward asymmetry and the anomalous chromoelectric (d̂_t) and chromomagnetic (Ό̂_t) moments have been measured using LHC pp collisions at a center-of-mass energy of 13 TeV, collected in the CMS detector in a data sample corresponding to an integrated luminosity of 35.9 fb−1. The linearized variable AFB(1) is used to approximate the asymmetry. Candidate tt̄ events decaying to a muon or electron and jets in final states with low and high Lorentz boosts are selected and reconstructed using a fit of the kinematic distributions of the decay products to those expected for tt̄ final states. The values found for the parameters are AFB(1) = 0.048 +0.095/−0.087 (stat) +0.020/−0.029 (syst) and Ό̂_t = −0.024 +0.013/−0.009 (stat) +0.016/−0.011 (syst), and a limit is placed on the magnitude of |d̂_t| < 0.03 at 95% confidence level.

    MUSiC: a model-unspecific search for new physics in proton-proton collisions at √s = 13 TeV

    Results of the Model Unspecific Search in CMS (MUSiC), using proton-proton collision data recorded at the LHC at a centre-of-mass energy of 13 TeV, corresponding to an integrated luminosity of 35.9 fb−1, are presented. The MUSiC analysis searches for anomalies that could be signatures of physics beyond the standard model. The analysis is based on the comparison of observed data with the standard model prediction, as determined from simulation, in several hundred final states and multiple kinematic distributions. Events containing at least one electron or muon are classified based on their final state topology, and an automated search algorithm surveys the observed data for deviations from the prediction. The sensitivity of the search is validated using multiple methods. No significant deviations from the predictions have been observed. For a wide range of final state topologies, agreement is found between the data and the standard model simulation. This analysis complements dedicated search analyses by significantly expanding the range of final states covered using a model-independent approach with the largest data set to date to probe phase space regions beyond the reach of previous general searches.
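    The classify-and-compare logic can be sketched as a toy in Python. The event records, class labels, and simulated yields below are invented for illustration; the actual MUSiC algorithm scans kinematic distributions region by region rather than comparing single per-class yields.

```python
import math
from collections import Counter

def classify(event):
    """Toy final-state label from object counts, e.g. '1e0mu2j'."""
    return f"{event['e']}e{event['mu']}mu{event['jets']}j"

def poisson_pvalue(n_obs, mu):
    """P(N >= n_obs) for a Poisson-distributed SM prediction with mean mu."""
    return 1.0 - sum(math.exp(-mu) * mu**k / math.factorial(k) for k in range(n_obs))

# Invented observed events and simulated standard model yields.
events = [{"e": 1, "mu": 0, "jets": 2}] * 8 + [{"e": 0, "mu": 1, "jets": 3}] * 2
observed = Counter(classify(e) for e in events)
sm_prediction = {"1e0mu2j": 3.0, "0e1mu3j": 2.5}

for cls, mu in sm_prediction.items():
    p = poisson_pvalue(observed.get(cls, 0), mu)
    print(f"{cls}: observed {observed.get(cls, 0)}, predicted {mu}, p = {p:.3f}")
```

    A real implementation would also fold in systematic uncertainties on the prediction and account for the look-elsewhere effect before flagging any class as deviant.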

    Search for new particles in events with energetic jets and large missing transverse momentum in proton-proton collisions at √s = 13 TeV

    A search is presented for new particles produced at the LHC in proton-proton collisions at √s = 13 TeV, using events with energetic jets and large missing transverse momentum. The analysis is based on a data sample corresponding to an integrated luminosity of 101 fb−1, collected in 2017-2018 with the CMS detector. Machine learning techniques are used to define separate categories for events with narrow jets from initial-state radiation and events with large-radius jets consistent with a hadronic decay of a W or Z boson. A statistical combination is made with an earlier search based on a data sample of 36 fb−1, collected in 2016. No significant excess of events is observed with respect to the standard model background expectation determined from control samples in data. The results are interpreted in terms of limits on the branching fraction of an invisible decay of the Higgs boson, as well as constraints on simplified models of dark matter, on first-generation scalar leptoquarks decaying to quarks and neutrinos, and on models with large extra dimensions. Several of the new limits, specifically for spin-1 dark matter mediators, pseudoscalar mediators, colored mediators, and leptoquarks, are the most restrictive to date.

    Measurement of prompt open-charm production cross sections in proton-proton collisions at √s = 13 TeV

    The production cross sections for prompt open-charm mesons in proton-proton collisions at a center-of-mass energy of 13 TeV are reported. The measurement is performed using a data sample collected by the CMS experiment corresponding to an integrated luminosity of 29 nb−1. The differential production cross sections of the D*(±), D±, and D0 (D̄0) mesons are presented in ranges of transverse momentum and pseudorapidity of 4 < pT < 100 GeV and |η| < 2.1, respectively. The results are compared to several theoretical calculations and to previous measurements.

    Combined searches for the production of supersymmetric top quark partners in proton-proton collisions at √s = 13 TeV

    A combination of searches for top squark pair production using proton-proton collision data at a center-of-mass energy of 13 TeV at the CERN LHC, corresponding to an integrated luminosity of 137 fb−1 collected by the CMS experiment, is presented. Signatures with at least 2 jets and large missing transverse momentum are categorized into events with 0, 1, or 2 leptons. New results for regions of parameter space where the kinematical properties of top squark pair production and top quark pair production are very similar are presented. Depending on the model, the combined result excludes a top squark mass up to 1325 GeV for a massless neutralino, and a neutralino mass up to 700 GeV for a top squark mass of 1150 GeV. Top squarks with masses from 145 to 295 GeV, for neutralino masses from 0 to 100 GeV, with a mass difference between the top squark and the neutralino in a window of 30 GeV around the mass of the top quark, are excluded for the first time with CMS data. The results of these searches are also interpreted in an alternative signal model of dark matter production via a spin-0 mediator in association with a top quark pair. Upper limits are set on the cross section for mediator particle masses of up to 420 GeV.
    • 
