109 research outputs found

    Calibration, foreground subtraction, and signal extraction in hydrogen cosmology

    Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Physics, 2012. This electronic version was submitted by the student author; the certified thesis is available in the Institute Archives and Special Collections. Cataloged from the student-submitted PDF version of the thesis. Includes bibliographical references (p. 265-271). By using the hyperfine 21 cm transition to map out the distribution of neutral hydrogen at high redshifts, hydrogen cosmology has the potential to place exquisite constraints on fundamental cosmological parameters, as well as to provide direct observations of our Universe prior to the formation of the first luminous objects. However, this theoretical promise has yet to become observational reality. Chief amongst the observational obstacles are the need for extremely well-calibrated instruments and methods for dealing with foreground contaminants such as Galactic synchrotron radiation. In this thesis we explore a number of these challenges by proposing and testing a variety of techniques for calibration, foreground subtraction, and signal extraction in hydrogen cosmology. For tomographic hydrogen cosmology experiments, we explore a calibration algorithm known as redundant baseline calibration, extending treatments found in the existing literature to include rigorous calculations of uncertainties and extensions to not-quite-redundant baselines. We use a principal component analysis to model foregrounds, and take advantage of the resulting sparseness of foreground spectra to propose various foreground subtraction algorithms. These include fitting low-order polynomials to spectra (either in image space or Fourier space) and inverse variance weighting. The latter method is described in a unified mathematical framework that includes power spectrum estimation. Foreground subtraction is also explored in the context of global signal experiments, and data analysis methods that incorporate angular information are presented. Finally, we apply many of the aforementioned methods to data from the Murchison Widefield Array, placing an upper limit on the Epoch of Reionization power spectrum at redshift z = 9.1. By Adrian Chi-Yan Liu. Ph.D.
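
    One of the foreground subtraction approaches named in the abstract, fitting low-order polynomials to each line-of-sight spectrum, can be sketched in a few lines of numpy. Everything below (the frequency band, spectral index, amplitudes, and polynomial order) is an illustrative assumption, not the thesis's actual pipeline.
```python
import numpy as np

# Toy line-of-sight spectrum: a smooth synchrotron-like foreground plus a small
# fluctuating "signal" term. Band, spectral index, and amplitudes are assumed.
freqs_mhz = np.linspace(140.0, 160.0, 64)
foreground = 300.0 * (freqs_mhz / 150.0) ** -2.5
signal = 0.01 * np.sin(2.0 * np.pi * freqs_mhz / 2.0)
spectrum = foreground + signal

# Fit and subtract a low-order polynomial in log-log space, where the smooth
# foreground is nearly a straight line, leaving the rapidly varying residual.
order = 3
coeffs = np.polyfit(np.log(freqs_mhz), np.log(spectrum), order)
cleaned = spectrum - np.exp(np.polyval(coeffs, np.log(freqs_mhz)))
```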

    An Improved Method for 21cm Foreground Removal

    21-cm tomography is expected to be difficult in part because of serious foreground contamination. Previous studies have found that line-of-sight approaches are capable of cleaning foregrounds to an acceptable level on large spatial scales, but not on small spatial scales. In this paper, we introduce a Fourier-space formalism for describing the line-of-sight methods, and use it to introduce an improved method for 21-cm foreground cleaning. Heuristically, this method involves fitting foregrounds in Fourier space using weighted polynomial fits, with each pixel weighted according to its information content. We show that the new method reproduces the old one on large angular scales, and gives marked improvements on small scales at essentially no extra computational cost. National Science Foundation (U.S.) (Grant AST-0134999); National Science Foundation (U.S.) (Grant AST-05-06556); David & Lucile Packard Foundation; Research Corporation
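
    As a rough illustration of the weighted-fit idea, here is a hypothetical sketch of a polynomial fit along frequency for a single pixel, with each channel weighted by its inverse noise (a stand-in for "information content"). The array shapes, noise levels, and polynomial order are made up for illustration and are not the paper's actual analysis.
```python
import numpy as np

# Weighted line-of-sight polynomial fit for one pixel's spectrum. np.polyfit's
# weights should be ~1/sigma (not 1/sigma^2) for Gaussian uncertainties.
rng = np.random.default_rng(0)
freqs = np.linspace(150.0, 158.0, 32)               # MHz, assumed band
noise_sigma = rng.uniform(0.5, 2.0, freqs.size)     # per-channel noise (assumed)
spectrum = 250.0 * (freqs / 150.0) ** -2.6 + rng.normal(0.0, noise_sigma)

order = 2
coeffs = np.polyfit(freqs, spectrum, order, w=1.0 / noise_sigma)
cleaned = spectrum - np.polyval(coeffs, freqs)       # foreground-subtracted residual
```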

    Mapmaking for precision 21 cm cosmology

    In order to study the “Cosmic Dawn” and the Epoch of Reionization with 21 cm tomography, we need to statistically separate the cosmological signal from foregrounds known to be orders of magnitude brighter. Over the last few years, we have learned much about the role our telescopes play in creating a putatively foreground-free region called the “EoR window.” In this work, we examine how an interferometer’s effects can be taken into account in a way that allows for the rigorous estimation of 21 cm power spectra from interferometric maps while mitigating foreground contamination and thus increasing sensitivity. This requires a precise understanding of the statistical relationship between the maps we make and the underlying true sky. While some of these calculations would be computationally infeasible if performed exactly, we explore several well-controlled approximations that make mapmaking and the calculation of map statistics much faster, especially for compact and highly redundant interferometers designed specifically for 21 cm cosmology. We demonstrate the utility of these methods and the parametrized trade-offs between accuracy and speed using one such telescope, the upcoming Hydrogen Epoch of Reionization Array, as a case study. National Science Foundation (U.S.) (Grant AST-0457585); National Science Foundation (U.S.) (Grant AST-0821321); National Science Foundation (U.S.) (Grant AST-0804508); National Science Foundation (U.S.) (Grant AST-1105835); National Science Foundation (U.S.) (Grant AST-1125558); National Science Foundation (U.S.) (Grant AST-1129258); National Science Foundation (U.S.) (Grant AST-1410484); National Science Foundation (U.S.) (Grant AST-1411622); Mount Cuba Astronomical Association; MIT School of Science; Marble Astrophysics Fund
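
    The "statistical relationship between the maps we make and the underlying true sky" can be sketched as a linear model. The toy code below (hypothetical response matrix, dimensions, and noise, not the paper's actual formalism) forms a noise-weighted map and the matrix that relates it back to the true sky, the two ingredients a downstream power spectrum estimator needs.
```python
import numpy as np

# Toy mapmaking: visibilities d = A s + n, with diagonal noise covariance N.
# The noise-weighted map is A^H N^-1 d, and A^H N^-1 A relates it to the sky s.
rng = np.random.default_rng(1)
n_vis, n_pix = 200, 50
A = rng.normal(size=(n_vis, n_pix)) + 1j * rng.normal(size=(n_vis, n_pix))
noise_var = rng.uniform(1.0, 3.0, size=n_vis)
s = rng.normal(size=n_pix)
noise = rng.normal(scale=np.sqrt(noise_var)) + 1j * rng.normal(scale=np.sqrt(noise_var))
d = A @ s + noise

N_inv = np.diag(1.0 / noise_var)
weighted_map = (A.conj().T @ N_inv @ d).real    # un-normalized, noise-weighted map
response = (A.conj().T @ N_inv @ A).real        # map-to-sky relation for later statistics
```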

    PAPER-64 CONSTRAINTS ON REIONIZATION: THE 21 cm POWER SPECTRUM AT z = 8.4

    In this paper, we report new limits on 21 cm emission from cosmic reionization based on a 135 day observing campaign with a 64-element deployment of the Donald C. Backer Precision Array for Probing the Epoch of Reionization in South Africa. This work extends the work presented in Parsons et al. with more collecting area, a longer observing period, improved redundancy-based calibration, improved fringe-rate filtering, and updated power-spectral analysis using optimal quadratic estimators. The result is a new 2σ upper limit on Δ²(k) of (22.4 mK)² in the range 0.15 < k < 0.5 h Mpc⁻¹ at z = 8.4. This represents a three-fold improvement over the previous best upper limit. As we discuss in more depth in a forthcoming paper, this upper limit supports and extends previous evidence against extremely cold reionization scenarios. We conclude with a discussion of implications for future 21 cm reionization experiments, including the newly funded Hydrogen Epoch of Reionization Array.
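
    For readers unfamiliar with the "optimal quadratic estimators" mentioned above, the toy numpy sketch below shows the general form: band powers q_a = (1/2) xᵀ C⁻¹ C_a C⁻¹ x normalized by the Fisher matrix. The data vector, covariance, and band definitions are made-up stand-ins, not the PAPER-64 analysis itself.
```python
import numpy as np

# Toy quadratic band-power estimator with Fisher-matrix normalization.
rng = np.random.default_rng(2)
n, n_bands = 64, 4
x = rng.normal(size=n)                               # toy data vector
C = np.eye(n)                                        # toy total covariance (white noise)
bands = [np.diag((np.arange(n) // (n // n_bands) == b).astype(float))
         for b in range(n_bands)]                    # C_a: response to each band power

Ci = np.linalg.inv(C)
q = np.array([0.5 * x @ Ci @ Ca @ Ci @ x for Ca in bands])
F = np.array([[0.5 * np.trace(Ci @ Ca @ Ci @ Cb) for Cb in bands] for Ca in bands])
p_hat = np.linalg.solve(F, q)                        # normalized band power estimates
```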

    Mapping our universe in 3D with MITEoR

    Mapping our universe in 3D by imaging the redshifted 21 cm line from neutral hydrogen has the potential to overtake the cosmic microwave background as our most powerful cosmological probe, because it can map a much larger volume of our Universe, shedding new light on the epoch of reionization, inflation, dark matter, dark energy, and neutrino masses. We report on MITEoR, a pathfinder low-frequency radio interferometer whose goal is to test technologies that greatly reduce the cost of such 3D mapping for a given sensitivity. MITEoR accomplishes this by using massive baseline redundancy both to enable automated precision calibration and to cut the correlator cost scaling from N² to N log N, where N is the number of antennas. The success of MITEoR with its 64 dual-polarization elements bodes well for the more ambitious HERA project, which incorporates many identical or similar technologies using an order of magnitude more antennas, each with dramatically larger collecting area. National Science Foundation (U.S.) (Grant AST-0908848); National Science Foundation (U.S.) (Grant AST-1105835); MIT Kavli Instrumentation Fund; Massachusetts Institute of Technology Undergraduate Research Opportunities Program
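
    The N² to N log N correlator scaling comes from redundancy: on a regular grid, the average visibility at each baseline separation is a lag of the array's spatial autocorrelation, which FFTs compute all at once. Below is a toy 1-D check of that identity (the antenna count and random voltages are illustrative assumptions, not MITEoR's actual correlator).
```python
import numpy as np

# Average visibility for each redundant separation m on a uniform 1-D array,
# computed directly (O(N^2)) and via the correlation theorem (O(N log N)).
rng = np.random.default_rng(3)
N = 16
v = rng.normal(size=N) + 1j * rng.normal(size=N)     # one snapshot of antenna voltages

# Direct: average v_i * conj(v_j) over all pairs with separation i - j = m
direct = np.array([np.mean(v[m:] * np.conj(v[:N - m])) for m in range(N)])

# FFT-based: zero-pad to avoid wraparound, then apply the correlation theorem
V = np.fft.fft(v, 2 * N)
corr = np.fft.ifft(V * np.conj(V))[:N]
fft_based = corr / (N - np.arange(N))

assert np.allclose(direct, fft_based)
```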

    Antimicrobial resistance among migrants in Europe: a systematic review and meta-analysis

    BACKGROUND: Rates of antimicrobial resistance (AMR) are rising globally and there is concern that increased migration is contributing to the burden of antibiotic resistance in Europe. However, the effect of migration on the burden of AMR in Europe has not yet been comprehensively examined. Therefore, we did a systematic review and meta-analysis to identify and synthesise data for AMR carriage or infection in migrants to Europe to examine differences in patterns of AMR across migrant groups and in different settings. METHODS: For this systematic review and meta-analysis, we searched MEDLINE, Embase, PubMed, and Scopus with no language restrictions from Jan 1, 2000, to Jan 18, 2017, for primary data from observational studies reporting antibacterial resistance in common bacterial pathogens among migrants to 21 European Union-15 and European Economic Area countries. To be eligible for inclusion, studies had to report data on carriage or infection with laboratory-confirmed antibiotic-resistant organisms in migrant populations. We extracted data from eligible studies and assessed quality using piloted, standardised forms. We did not examine drug resistance in tuberculosis and excluded articles solely reporting on this parameter. We also excluded articles in which migrant status was determined by ethnicity, country of birth of participants' parents, or was not defined, and articles in which data were not disaggregated by migrant status. Outcomes were carriage of or infection with antibiotic-resistant organisms. We used random-effects models to calculate the pooled prevalence of each outcome. The study protocol is registered with PROSPERO, number CRD42016043681. FINDINGS: We identified 2274 articles, of which 23 observational studies reporting on antibiotic resistance in 2319 migrants were included. The pooled prevalence of any AMR carriage or AMR infection in migrants was 25·4% (95% CI 19·1-31·8; I² = 98%), including meticillin-resistant Staphylococcus aureus (7·8%, 4·8-10·7; I² = 92%) and antibiotic-resistant Gram-negative bacteria (27·2%, 17·6-36·8; I² = 94%). The pooled prevalence of any AMR carriage or infection was higher in refugees and asylum seekers (33·0%, 18·3-47·6; I² = 98%) than in other migrant groups (6·6%, 1·8-11·3; I² = 92%). The pooled prevalence of antibiotic-resistant organisms was slightly higher in high-migrant community settings (33·1%, 11·1-55·1; I² = 96%) than in migrants in hospitals (24·3%, 16·1-32·6; I² = 98%). We did not find evidence of high rates of transmission of AMR from migrant to host populations. INTERPRETATION: Migrants are exposed to conditions favouring the emergence of drug resistance during transit and in host countries in Europe. Increased antibiotic resistance among refugees and asylum seekers and in high-migrant community settings (such as refugee camps and detention facilities) highlights the need for improved living conditions, access to health care, and initiatives to facilitate detection of and appropriate high-quality treatment for antibiotic-resistant infections during transit and in host countries. Protocols for the prevention and control of infection and for antibiotic surveillance need to be integrated in all aspects of health care, which should be accessible for all migrant groups, and should target determinants of AMR before, during, and after migration. FUNDING: UK National Institute for Health Research Imperial Biomedical Research Centre, Imperial College Healthcare Charity, the Wellcome Trust, and UK National Institute for Health Research Health Protection Research Unit in Healthcare-associated Infections and Antimicrobial Resistance at Imperial College London.
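
    The pooled prevalences above come from random-effects models. A minimal sketch of DerSimonian-Laird random-effects pooling is shown below; the study counts are made-up illustrative numbers, and a simple normal approximation is used for the within-study variance rather than the transformations typically applied in practice.
```python
import numpy as np

# DerSimonian-Laird random-effects pooled prevalence (toy data, normal approx.).
events = np.array([12, 30, 8, 55])          # resistant carriage/infection cases (assumed)
n = np.array([60, 120, 40, 200])            # migrants sampled per study (assumed)

p = events / n
var = p * (1 - p) / n                        # within-study variance
w = 1.0 / var

p_fixed = np.sum(w * p) / np.sum(w)          # fixed-effect pooled estimate
Q = np.sum(w * (p - p_fixed) ** 2)           # heterogeneity statistic
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - (len(p) - 1)) / c)      # between-study variance

w_re = 1.0 / (var + tau2)
p_pooled = np.sum(w_re * p) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
ci = (p_pooled - 1.96 * se, p_pooled + 1.96 * se)
```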

    Robust estimation of bacterial cell count from optical density

    Optical density (OD) is widely used to estimate the density of cells in liquid culture, but cannot be compared between instruments without a standardized calibration protocol and is challenging to relate to actual cell count. We address this with an interlaboratory study comparing three simple, low-cost, and highly accessible OD calibration protocols across 244 laboratories, applied to eight strains of constitutive GFP-expressing E. coli. Based on our results, we recommend calibrating OD to estimated cell count using serial dilution of silica microspheres, which produces highly precise calibration (95.5% of residuals <1.2-fold), is easily assessed for quality control, also assesses instrument effective linear range, and can be combined with fluorescence calibration to obtain units of Molecules of Equivalent Fluorescein (MEFL) per cell, allowing direct comparison and data fusion with flow cytometry measurements: in our study, fluorescence per cell measurements showed only a 1.07-fold mean difference between plate reader and flow cytometry data.
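
    As a rough illustration of the recommended microsphere approach (with entirely assumed stock concentration, dilution series, and OD readings, and ignoring the non-linear high-OD regime), the calibration reduces to fitting one proportionality constant between particle count and OD:
```python
import numpy as np

# Hypothetical microsphere calibration: a two-fold serial dilution of a stock of
# known particle concentration gives paired (OD, particle count) points; within
# the linear range, one scale factor converts OD to an estimated particle count.
stock_particles_per_ml = 3.0e8
dilution_factors = 2.0 ** np.arange(8)                 # 1x, 2x, ..., 128x
particles = stock_particles_per_ml / dilution_factors
od = np.array([1.10, 0.58, 0.30, 0.155, 0.078, 0.040, 0.021, 0.011])  # assumed readings

# Least-squares fit of particles ~= k * OD through the origin
k = np.sum(od * particles) / np.sum(od ** 2)
print(f"~{k:.3g} particles/mL per OD unit; OD 0.25 -> {k * 0.25:.3g} particles/mL")
```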

    Finishing the euchromatic sequence of the human genome

    The sequence of the human genome encodes the genetic instructions for human physiology, as well as rich information about human evolution. In 2001, the International Human Genome Sequencing Consortium reported a draft sequence of the euchromatic portion of the human genome. Since then, the international collaboration has worked to convert this draft into a genome sequence with high accuracy and nearly complete coverage. Here, we report the result of this finishing process. The current genome sequence (Build 35) contains 2.85 billion nucleotides interrupted by only 341 gaps. It covers ∼99% of the euchromatic genome and is accurate to an error rate of ∼1 event per 100,000 bases. Many of the remaining euchromatic gaps are associated with segmental duplications and will require focused work with new methods. The near-complete sequence, the first for a vertebrate, greatly improves the precision of biological analyses of the human genome, including studies of gene number, birth and death. Notably, the human genome seems to encode only 20,000-25,000 protein-coding genes. The genome sequence reported here should serve as a firm foundation for biomedical research in the decades ahead.

    Non-Standard Errors

    In statistics, samples are drawn from a population in a data-generating process (DGP). Standard errors measure the uncertainty in estimates of population parameters. In science, evidence is generated to test hypotheses in an evidence-generating process (EGP). We claim that EGP variation across researchers adds uncertainty: non-standard errors (NSEs). We study NSEs by letting 164 teams test the same hypotheses on the same data. NSEs turn out to be sizable, but smaller for more reproducible or higher-rated research. Adding peer-review stages reduces NSEs. We further find that this type of uncertainty is underestimated by participants.
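
    A minimal sketch of the core quantity, assuming each of the 164 teams reports one point estimate for a given hypothesis: the non-standard error is the dispersion of those estimates across teams, sitting on top of each team's own standard error. The simulated estimates and the use of a plain standard deviation as the dispersion measure are illustrative assumptions, not the paper's exact methodology.
```python
import numpy as np

# Toy non-standard error: dispersion of point estimates across analysis teams.
rng = np.random.default_rng(4)
team_estimates = rng.normal(loc=0.8, scale=0.3, size=164)   # one estimate per team

nse = team_estimates.std(ddof=1)          # across-team (non-standard) error
typical_se = 0.15                         # assumed average within-team standard error
print(f"NSE ~ {nse:.2f} vs typical standard error ~ {typical_se:.2f}")
```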

    AI is a viable alternative to high throughput screening: a 318-target study

    High throughput screening (HTS) is routinely used to identify bioactive small molecules. This requires physical compounds, which limits coverage of accessible chemical space. Computational approaches combined with vast on-demand chemical libraries can access far greater chemical space, provided that the predictive accuracy is sufficient to identify useful molecules. Through the largest and most diverse virtual HTS campaign reported to date, comprising 318 individual projects, we demonstrate that our AtomNet® convolutional neural network successfully finds novel hits across every major therapeutic area and protein class. We address historical limitations of computational screening by demonstrating success for target proteins without known binders, high-quality X-ray crystal structures, or manual cherry-picking of compounds. We show that the molecules selected by the AtomNet® model are novel drug-like scaffolds rather than minor modifications to known bioactive compounds. Our empirical results suggest that computational methods can substantially replace HTS as the first step of small-molecule drug discovery.